If algorithms are rhythms in code, we got the notes right, but the rhythm is clumsy. Instead of music, we create noise. If we fix it, we will gain enormous power to transform society.
Which is more important to music: the right notes or the right rhythm? Rhythm is more than playing many notes correctly. Rhythm is like spoken calligraphy. Inaccurate notes, lack of confidence, and hesitation kill it. And the more complex the pattern, the more critical the rhythm becomes, or so the experts say.
If algorithms are rhythms in code, we got the notes right, but the rhythm is clumsy. Instead of music, we create noise. We are “out of time,” out of sync.
Behind every algorithm, there's a philosophy, a worldview, and a mind that creates through its own cognitive filters. Algorithms have a powerful presence in our lives, assisting us in decision making and ultimately shaping our social connections and relations.
Algorithms influence what we know (Google, Twitter), who we know (Facebook, LinkedIn), what we buy (Amazon), what we experience (YouTube, Netflix), and who we date (Tinder, Bumble). Slowly, our lives are becoming deeply algorithmic.
In recent years, algorithms have gained iconic status because of what they represent: the idea that we can surpass our human limitations and thrive in a digital realm too complex to understand on our own.
However, we shouldn’t forget that algorithms are closed systems (so far). They stretch as far as their maker’s understanding and imagination of the world. The question is, how much of it can they capture? And if they are so important, why aren’t we exposing their structure in the open?
We need to think about the powerful ways in which notions and ideas about the algorithm circulate through the social world.
The bad …
Algorithms can automate but shouldn’t formalize norms.
The algorithm doesn’t know you. It knows only what you give it.
In 2019, Jessica Johnson won the Orwell Youth Prize with a short story, “A Band Apart,” about a dystopian future where an algorithm splits students into bands based on their background.
“Deep down, I already knew that whatever result I got, nothing I ever did, or could do would change the Band I was assigned.” A Band Apart
When the pandemic hit last year, Jessica discovered that she had “fallen into her own story.” With canceled high school exams in many countries, the educational authorities used AI to assign the final grades. The experiment was not a success. Jessica’s results were downgraded (from A to B). Her chance of getting a prestigious university scholarship to study English Literature was at risk.
Thousands of unhappy students and parents have since launched a furious protest campaign. They disputed not the graded papers but the AI assessment. More frustrating still, there was no procedure to appeal an AI decision. The algorithm was untouchable.
The pandemic accelerated digitization faster than anticipated. We are now exposed more than ever to situations in which decisions made by an algorithm have life-altering consequences for the people involved. The current level of transparency is nowhere near what is required.
But before we fix data quality, we must question our decision-making practices. We need to learn to work with algorithms and not simply turn their outputs into new norms.
The most dangerous algorithms are the hidden ones.
“The algorithm is forever changing. But, rarely in your favor.” David Hieatt
Algorithms are the invisible gatekeepers of the Internet. We are free to navigate the opportunity space within our physical proximity, but we can’t go further than that unless the algorithm lets us through. Getting the spotlight in the age of attention is very costly.
We know that algorithms are used to promote certain calculated perspectives. Often, uncertainty about what is in the black box makes us misjudge their power or overemphasize their importance.
It leads to two important questions:
- Do we have to comply with the algorithm to use the power of the Internet? Answer: Yes. Avoiding the algorithm may work for a while, but it isn’t a viable long-term strategy in a highly digital world.
- Do the invisible algorithms in use leave room for transparency? Answer: No. The black box is the most protected asset of software companies.
In an article published in the journal Information, Communication & Society, David Beer argues that we shouldn’t approach the algorithm as lines of code or as a technical, self-contained object. Algorithms are dynamic processes that exist in the social consciousness and create outcomes shaped by interests. If they make sense of the world on our behalf, we should know more about what is going on.
Government and industry decision-makers are building new standards for algorithmic audits to measure the potential harm an algorithm can cause. Things are changing, but the pace is slow. We must keep up the good work and do more.
We also need to put more effort into fighting the blame culture. Ethical failures, especially in social-media algorithms, carry high stakes and severe consequences. They are a common enemy big enough to make competitors join forces and act with accountability and integrity.
The good …
Algorithms help with predictable complexity.
It’s human nature to believe that our actions are unique and completely within our control. In reality, these actions often follow predictable patterns that can be turned into code.
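To make that claim concrete, here is a toy sketch. The action log and the predictor are invented for illustration; the point is only that a few lines of code can already exploit a predictable pattern in behavior.

```python
from collections import Counter

def predict_next(history):
    """Predict the next action as the one that most often
    followed the current action in past behavior."""
    current = history[-1]
    followers = Counter(
        nxt for prev, nxt in zip(history, history[1:]) if prev == current
    )
    # Most common follower of the current action, if any history exists
    return followers.most_common(1)[0][0] if followers else None

# A routine with a predictable pattern: coffee usually follows waking up
log = ["wake", "coffee", "email", "wake", "coffee", "gym", "wake"]
print(predict_next(log))  # → coffee
```

Nothing about this tiny predictor understands you; it simply counts what you did before, which is exactly how far "it knows only what you give it" reaches.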
Sports are often seen as the pinnacle of human ability, far removed from the digital domain. But this is changing. Researchers from the University of Tübingen found that goal scoring correlates with how players track the ball with their eyes, and they predict that the future of sports will depend heavily on algorithms.
Algorithms have proven that they can handle predictable complexity by processing big data and computations on a large scale. They increase productivity, automate work processes, and improve our physical and cognitive skills.
Still, the unknown and the unpredictable are human domains. We may be susceptible to errors, but our capacity to learn and question the nature of our reality overpowers any algorithm. We have an indispensable role in the digital world, provided we rethink our relationships with AI and reinvent ourselves.
We need more “smart algorithms.”
Algorithms have no concept that more is not always better.
Creating content is a tricky business, with many variables that influence the conversion success:
- Channel: organic or paid reach on which platform
- Type: user-generated content or reshared external links
- Focus: entertainment or critical thinking
Ultimately, the deciding power lies with the algorithm and with how well the content aligns with the platform’s internal guidelines. If it doesn’t align, organic reach is negligible. The entire process is opaque, and platforms’ internal content guidelines remain a mystery.
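Since the real ranking logic is locked inside the black box, we can only imagine what it might look like. Here is a deliberately simplified sketch in which every weight and rule is invented for illustration, not drawn from any actual platform:

```python
# Hypothetical ranking sketch: the real platform logic is opaque,
# so every weight and rule below is invented for illustration.
def reach_score(channel, content_type, focus, follows_guidelines):
    if not follows_guidelines:
        return 0.0  # misaligned content gets negligible organic reach
    score = 1.0
    score *= 1.5 if channel == "paid" else 1.0                 # paid reach boosted
    score *= 1.3 if content_type == "user-generated" else 0.8  # reshared links demoted
    score *= 1.4 if focus == "entertainment" else 1.0          # fun content favored
    return score

print(reach_score("organic", "user-generated", "entertainment", True))
print(reach_score("organic", "external-link", "critical-thinking", True))
```

Even this caricature shows the asymmetry the text describes: one hidden boolean (guideline alignment) can zero out everything a creator controls.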
What is obvious is that many algorithms favor positive emotions. Fun content (inspirational, optimistic, easy-going) performs better because optimism is powerful. Perhaps this is a sign of the times: with so much disruption and change, everyone wants to leave their worries and anxiety behind.
“I’m into selling to the masses. If you are going to sit in here and sing your art shit, I’m not into it. No one likes good music anymore, OK! People want something that they can go: Oh, I get it, I get it. Tell me how to think, tell me how to think, please …” You in Your Weresong by Kevin Drew
Will things change? It is hard to speculate. We need more disruption and new business models around quality, not scale.
We shouldn’t blame AI for being biased. We are accountable.
In a recent article published in MIT Technology Review, “What is an algorithm? It depends whom you ask”, the authors argue that calling a decision-making system an “algorithm” is often a way to deflect accountability for human decisions.
We are often too quick to judge the algorithm, pointing to issues with the code, the quality of the datasets, or the training methods. Instead, we should compare the risk of bias in the code with the bias in decisions made by humans. The technology itself is not to blame for discrimination; it only mirrors systemic biases, such as gender bias, that exist in the real world.
When models are trained, they are imperfect by definition because they must stay flexible and intuitive. Algorithms cannot be perfect, and shouldn’t be, but we need more transparency so we know how to mitigate the risk of bias and can trust them.
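A toy example makes the mirroring visible. Assuming an invented hiring history (the groups, records, and "model" below are all fabricated for illustration), a learner that merely memorizes the most frequent past outcome per group reproduces exactly the skew it was fed:

```python
from collections import Counter

# Invented, deliberately skewed history: the data, not the code, carries the bias
past_hiring = [
    ("group_a", "hired"), ("group_a", "hired"), ("group_a", "rejected"),
    ("group_b", "rejected"), ("group_b", "rejected"), ("group_b", "hired"),
]

def train(records):
    """'Train' a trivial model: most frequent historical outcome per group."""
    by_group = {}
    for group, outcome in records:
        by_group.setdefault(group, Counter())[outcome] += 1
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

model = train(past_hiring)
print(model)  # → {'group_a': 'hired', 'group_b': 'rejected'}
```

The code contains no discriminatory rule at all, yet its output discriminates, because the history it learned from did. That is the sense in which accountability stays with us.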
“If you made the perfect system, it would imply we had no free will as human beings. And that’s a scary thought.” Rumman Chowdhury, Responsible AI Lead at Accenture
Current algorithms don’t favor quality content because the business model revolves around consumerism. But as the economy evolves to become more conscious and circular, algorithms can evolve too.
The beautiful …
The algorithm has a rhythm.
Rhythm follows patterns. It shifts and adjusts, never fully settling. Without a good rhythm, you easily get lost, out of sync with context and reality. A bad rhythm affects your ability to create and connect.
Rhythm is also about togetherness. Just like rails for a train, rhythm gives the band a steady, seamless track to ride along. When a performance completely falls apart, musicians call it a train wreck. If the rails are bad, the train goes nowhere.
The rhythm of algorithms (algorhythms) can be seen as an indicator of our limits and of our capacity to change and grow. We need to feed it the right role models, language, and experiences. We need to give it responsibility but remain involved and accountable for making it better.
This is our chance. What is at stake is more than a train wreck. What we can gain is beyond imagination but not outside reality. It is up to us to make the best of it.