When thinking becomes a product, dictating what people think is the endgame.
Tech is no longer a science; it’s about selling faith-based software. An anti-intellectual environment pointing to the future with promises of productivity and gold, arriving anywhere from next year to after we die. Its true measure of success lies not in what people can do with the products, but in how the products can shape people’s reality.
When a product’s output is information, and we see it isn’t reliable, yet we proceed to buy it, we are accepting the definition of reality it portrays. A reality where feigned human qualities con us into believing that software, and in turn humanoid hardware, has the potential to perform any human ability, only faster and without tiring. We laugh at the bot’s mistakes and continue using it, as if being aware of the effects makes us impervious to them. By ignoring our sensibility, we signal that we will swallow any claim that falls somewhere between completely made up and too good to be true.
While a bot cannot laugh at us, someone else does.
Convincing convenience #
Commodifying machine calculations as cognition involves selling access to stolen works of knowledge. It’s a crime in itself, but cutting it with sycophantic phrasing and repackaging it in simulation software is another level of ethical flexibility.
It’s part of conditioning people to believe that possessing knowledge is mundane and old-fashioned when we can just query the service at any time. That convenience eliminates the need for reliable sources. That progress looks like futuristic pretend play with half-baked marionettes, and there are quantifiable benefits because they work in mysterious ways, even though the strings are thick as rope.
Technology has separated itself so far from science that reality ceased to be a factor. It became a belief system in which everything has a potential tech solution, and limitations are seen as versioning issues. Anything will be solved, given enough time and computing power.
Choo choo #
When you talk to puppets, you signal that you can be fed anything. Open up. Here comes the train.
Being spoon-fed pleasantry-ridden interfaces presented as conversations is an overengineered dark pattern posing as innovation. Social overengineering, if you like, because the goal is to devour our data and define our reality.
Interfaces aren’t conversations, but they are presented as such, based on cherry-picked, ramification-free sci-fi fantasies. Extraneous fluff and an overload of useless information make it hard to see what’s going on; they create an experience-inducing, trinket-style interface, so people will use it more and divulge more about themselves. It’s a profiling act: a stream of hollow personalised lines delivered through artificial personality traits tuned to keep people spending time, inflating the interfaces into demanding entities.
As companions for loneliness and oracles for insecurity, they erode the need for verification and evade responsibility. They don’t deliver what we need, but prey on repeated misery by creating momentary diversions and becoming reasons not to work on sufficient solutions.
When using a piece of software becomes a goal in itself, no one remembers what the task was.
Idle brains and full computers don’t innovate #
Generating or conversing with software makes us nothing more today than we were yesterday, other than believers in the service. We are just as skilled and lonely.
By attempting to outsource knowledge and thinking, we waive comprehension and hinder learning. By reducing solution work to a narrow output, we don’t go through the necessary process that gives us active, memorable experiences and exposes us to aspects beyond the task itself.
Stored knowledge isn’t applied knowledge; we must use it to shape new thoughts, opinions and skills. We cannot look up things outside our awareness, so when we view knowledge as something we can query, we engage with limited fact records. We make fewer mental connections regarding knowledge and human interaction.
Tech is at a point where product familiarity is mistaken for expertise and sales pitches replace competence-building. We believe products can circumvent the need for expertise and skills, so that anyone can do whatever they have to in an instant. We move closer to believing that machines can substitute for humans; physically, socially and intellectually.
No longer sentient enough to see that not all work is steel-driving, we step off buildings to get to the street in the name of efficiency.
Think about the children before the children start thinking #
The commercialisation and devaluation of knowledge reduces thousands of libraries and websites to a handful of subscription services. From individual, national and regional providers to a global data monopoly. When one party owns the air and everyone pays to breathe, we are in a chokehold. We are ready to be controlled.
Knowledge with access control is unreliable. It can be gone tomorrow, edited and limited to certain groups of people. Censored. In line with the times. In line with the ideas and ideals of the service. For the safety of the children.
With a constant stream of new brains, it doesn’t take many years to rid people of unwanted knowledge and supplant most unfavourable expressions of thought. Deny the known, withhold the new, inhibit the science. Regress with everlasting promises about the future. Tell us once again what scripture says. When we accept lies as truth, truth becomes static, eternal.
If you put your entire life into one machine, it will be easy to run. And ruin. The personal information market is surveillance. Whether it has commercial, criminal or governmental backing means very little. Data is data, no matter who does what to us because of it. Whether we are the ones getting our doors knocked down, the ones knocking them down, or the ones privileged to buy a different reality and look the other way. The algorithms already shape your thoughts by feeding your attention; they might not tell you what to think, but they have a grip on your focus. They fill your surroundings based on the data you provide them.
We don’t need to quote George Santayana to see where we are and where it’s going.
Normalise knowing #
Be horrified, or at least somewhat embarrassed, if you buy the mountebanks’ sparkling apparatus for everyday cognition. Lowering ourselves to a point where a predictability machine can do what we do is submission, and that’s what it needs, because it cannot rise to our level.
We get farther by believing in ourselves than the people asking us to join them in climbing just one more hill. Spending time on what we already know is the only way to make something of the present that can be useful for the future – whenever that is.
Don’t throw out books willingly, as we did with films, music, instruments, software and self-made websites. Be independent, self-contented, revolutionary, intellectual, brave, strong and scholarly. Normalise stating that you are proficient in several skills. And normalise not knowing, and doing something about it.
And put the science back into computer science, because it’s becoming as stagnant and dystopian as the generated image of a robot teacher. I believe many people don’t recognise themselves in that.
Living and learning shouldn’t be a one-way street.