Should Everyone Code?

In the “fourth industrial revolution”, everything will be digitised. Does that mean programming is an essential skill for every job, asks Carolyn de Kock.

Everything is digitised. From the way we conduct our social lives to the way we book a cab, to the factory systems that are used to do something as innocuous as bottle milk, powerful algorithms are absorbing data and churning out new and improved ways of doing things.

The influence of digitisation is everywhere; Facebook’s Mark Zuckerberg believes data can be connected to form a fundamental mathematical law underlying human relationships, and there have been strong claims that voting patterns can be influenced by computer-controlled marketing. In manufacturing, AI and automation are creating incredible productivity gains: Apple and Samsung supplier Foxconn has replaced 60 000 factory workers with robots. Another factory in Dongguan, China, laid off 90% of its workers, and reported a 250% increase in productivity and an 80% drop in defects.

Economists describe this current process of digitisation, big data and analytics as “the fourth industrial revolution”, and the future seems to belong to those who develop the systems that can deliver these modern efficiencies.

So, should everyone learn to code?

History repeats itself

The analogy with the industrial revolution is apt – there are many parallels between skills development today and what happened in the 18th and 19th centuries. With the rise of new manufacturing processes, many labourers lost their jobs. But improved communication networks and wider access to information also gave rise to universal basic education systems.

A recent article in the Economist, titled “Coming to an office near you”, contemplated this and argued that “although innovation kills some jobs, it creates new and better ones, as a more productive society becomes richer, and its wealthier inhabitants demand more goods and services”.

Or as Henry Ford, creator of the modern factory, put it: yes, shoe factories brought down the price of shoes and put cobblers out of business, but, as a result, people bought many more items of footwear and more than enough new jobs were created to replace those lost.

Is coding the essential skill for the jobs of tomorrow? Many governments seem to think so, and bringing computer coding into the standard curriculum is very much in vogue. In the UK, children from the age of five are taught coding in state schools. New York has embarked on an ambitious plan to bring computer science education to its 1.1 million public-school students, and Kenya has implemented a Digital Literacy Programme (DLP), which is distributing 1.2 million digital devices to public schools.

Proponents say learning to program isn’t just preparation for a career as a software developer. Coding is an exercise in logic, a discipline focused on step-by-step problem-solving. The late Steve Jobs, co-founder of Apple, summed it up when he said: “Everybody should learn to program a computer, because it teaches you how to think.”
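What does that step-by-step thinking look like in practice? Here is a minimal illustration in Python – a toy example of ours, not drawn from any particular curriculum – using the classic exercise of making change, where a problem is broken into small, repeatable decisions:

```python
def make_change(amount_in_cents, coins=(50, 20, 10, 5, 1)):
    """Break an amount into coins, always trying the largest coin first."""
    result = []
    for coin in coins:
        # Step-by-step logic: keep using this coin while it still fits,
        # then move down to the next smaller denomination.
        while amount_in_cents >= coin:
            result.append(coin)
            amount_in_cents -= coin
    return result

print(make_change(87))  # [50, 20, 10, 5, 1, 1]
```

The value of the exercise isn’t the answer; it’s learning to decompose a fuzzy goal (“give change”) into a precise sequence of checks and actions.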

There are certainly plenty of opportunities in the software development world, though, and South Africa has a desperate shortage of skills. According to a 2016 CareerJunction Index, software programming is the most in-demand skill in South Africa, which is in line with global trends. Careers24 ranks data-mining specialists and Java developers as the two most sought-after.

Gareth Hawking, CEO of local software development company redPanda, says: “If we could hire 20 more software developers today, we would.”

Is everything about code?

So, what happens if you can’t code? After all, good programmers need a strong understanding of maths, incredible powers of concentration and fine attention to detail.

“Anyone can learn to code,” says Michael Choi, founder of programming academy Coding Dojo, but it takes “persistence, the right teachers and an optimal learning environment”.

Even if that’s true, there is still good news for those who find the idea of learning C++ terrifying. Not everyone has to learn how to code, after all. Lutz Ziob, dean of Microsoft’s 4Afrika Academy, says there’s more to prospering in the digital future than software skills.

“While there should be an important focus on STEM subjects, and those grounded in digital literacy, equally important are life skills,” Ziob says. “Often, young people have both talent and will, but lack life skills and other basic skills to bridge the gap between education and the working world.”

Writing in Wired magazine, computer science PhD candidate Emma Pierson goes further, arguing that the fetishisation of programming is becoming very dangerous.

“I’m constantly confronting questions that can’t be answered with code,” Pierson says. “When I coded at Coursera, an online education company, I developed an algorithm that would recommend classes to people in part based on their gender. But the company decided not to use it when we discovered it would push women away from computer science classes.”

These kinds of dangerous nuances are often missed, leading to algorithms that entrench prejudices.
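To see how easily that can happen, consider a toy sketch – invented data and logic, not Coursera’s actual system – of a recommender that simply replays historical enrolment patterns by gender:

```python
from collections import Counter

def recommend(user_gender, past_enrolments, top_n=2):
    """Recommend the courses most popular among users of the same gender.

    This naive approach replays historical patterns, so any existing
    imbalance in the data is reproduced in the recommendations.
    """
    counts = Counter(course for gender, course in past_enrolments
                     if gender == user_gender)
    return [course for course, _ in counts.most_common(top_n)]

# Invented history in which few women have taken computer science:
history = [
    ("f", "History"), ("f", "History"), ("f", "Biology"),
    ("m", "Computer Science"), ("m", "Computer Science"), ("m", "History"),
]

print(recommend("f", history))  # ['History', 'Biology']
print(recommend("m", history))  # ['Computer Science', 'History']
```

The code is perfectly correct, yet it never suggests computer science to women – the prejudice lives in the data and the objective, not in a bug.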

What’s more, there’s always the risk that coders themselves could become obsolete as automation takes over. Microsoft and the University of Cambridge announced in February this year that they have developed a self-coding AI system called DeepCoder. Google’s Brain AI research group has similarly created an AI system that has designed its own machine learning software.

The rise of pop-coding

For all that, though, learning the basics of coding is probably a good idea for most. Douglas Rushkoff, author of Program or be Programmed: 10 Commandments for a Digital Age, argues that some knowledge of coding is essential if we want to understand how things such as social media and shopping recommendation engines can unconsciously control us.

“Even if we don’t all go out and learn to program,” Rushkoff says, “we must at least learn and contend with the essential biases of the technologies we will be living and working with from here on.”

Fortunately, this really is easier than ever, and it can be fun too. Platforms such as Ready and Scratch let us create games and apps without knowing a computer language, but, in doing so, teach the basics of how programs work.

David S Bennahum, co-founder and CEO of Ready, says: “Learning basic coding is a form of self-expression, of storytelling. In this new world, learning coding is about moving away from computer languages, syntax and academic exercises towards real-world connections: game design and building projects that tie into other subjects like science and social studies.”

Computer science used to be an impersonal, mathematical exercise. Not anymore. It’s become creative, collaborative and fun. Not like coding at all.
