What’s Wrong With Technical Education And How To Fix It

Summary: Colleges and universities (especially technical ones) are facing a major challenge today. This article shares some possible areas for improvement.

The Mirror With No Reflection

What does a college degree in IT mean today? Does it guarantee a stable job, as it used to? Is it a promise of faster employment? Does it give a graduate a real advantage? The answers to these questions vary from country to country, but in general they have been shifting from positive to negative. I’m convinced that colleges and universities (especially technical ones) are facing a major challenge and failing to address it, and the situation is only going to get worse. I’ve spent several years investigating this problem, and now I’d like to share some possible improvements with you.

Distorted Image

This article takes its title from a mirror metaphor because, ideally, universities should reflect the outside world. They should show this reflection to their students, explaining how the world works and preparing them to meet it fully armed with all the essential tools. Universities should equip students with skills they can start applying the moment they get a job. And, of course, these skills should increase a person’s value on the labor market and make the job search smoother and faster. This is how the system used to work: high-school teens fought to get into the best college available, expecting the effort to pay off later. It mostly did, and in some industries it still does. But for how long?

The world started changing faster, and IT was one of the biggest drivers of that transformation. The pace of technological advancement keeps surpassing our boldest expectations. For example, the number of transistors on a chip doubles roughly every two years (hello, Moore’s law!). Programming languages evolve, new devices appear, and software development models change. This is why students end up unprepared for the current job market: at university, they’re learning information that could be obsolete by the time they get their degree. So what’s the point of spending several years studying something you won’t need in the future?

An Ever-Changing World

The market moves so fast today that even the most progressive professors struggle to keep up. They’d have to update their curricula and course content several times a year to stay current, which seems impossible. Moreover, new professions appear faster than universities can develop and roll out new programs and courses, which means universities can’t prepare employees for those jobs.

It seems a lot of Americans have come to similar conclusions. In a Gallup study conducted in 2013, 70% of adult respondents said that a college education was “very important”; by 2019, that number had dropped to 51%. Meanwhile, only 13% of U.S. adults and 11% of C-level executives believe that college graduates are prepared for work. At the same time, the cost of higher education is extremely high, and it’s risen by 400% since 1990!

Another thing to consider is that some areas of expertise require practical skills rather than theoretical knowledge. Software development is one of them. You can’t become a qualified programmer just by reading books and solving tasks on paper. You need to code, and you need to start as early as possible. In Outliers, Malcolm Gladwell popularized the 10,000-hour rule, which states that to become great at something, you need to invest that amount of time in practice. That also means the earlier you start, the faster you’ll succeed.

Unfortunately, traditional education doesn’t work that way. The academic approach sometimes stands in the way of meeting modern demands. Because of this, students often obtain the skills they need for work from somewhere else, such as online courses, boot camps, and internships. If you want a good job, you’ll need expertise and experience, but you can’t get experience from studying alone. It’s no wonder that many students start working during their university years: they want to become professionals as soon as possible.

Also, universities usually don’t prepare you for job interviews, help you write CVs, or teach you how to ask for a raise. Even if professors try to cover these topics, the reality can be completely different, because professors usually don’t work for private companies. Students need to turn elsewhere for this information, which is another sign that colleges and universities are falling behind.

Let’s not forget that colleges and universities have a limited capacity to customize the educational process. As we know, people learn differently: one person might prefer to listen to lectures and take notes, while someone else might need gamification to get the best results.

There are several classifications of learning styles; the VARK model, for example, distinguishes visual, auditory, reading/writing, and kinesthetic learners. But university courses are made for the “average Joe.” Professors don’t care whether you learn better by watching, reading, or doing. If you need flexibility, colleges and universities aren’t the places for it. Some people might disagree with me, because it’s common knowledge that universities “produce” more intelligent people. But is that really the only way to become a competent, smart, and erudite person in the age of the internet, when we have access to limitless amounts of high-quality content? I doubt it.

So…Let’s Shut Them All Down?

Let me be clear: I don’t think colleges and universities are useless. They’re essential for society, but they’re not a panacea, especially when it comes to training IT specialists. From my perspective, universities perform at least two important functions.

First, the world needs professionals, and it will need even more highly qualified people in the future. After all, if robots take over the low-paying jobs, humans will have to move into more complicated positions. People will need skills and knowledge, and colleges and universities should be where that knowledge accumulates, especially when it comes to science. It’s hard to imagine a physicist who hasn’t spent at least five years studying complicated theories at university. At the same time, different occupations require different amounts of time to reach professional competence: for some jobs, three or four years is more than enough, while for others (e.g., doctors), eight can be the bare minimum. A one-size-fits-all approach isn’t an option.

Second, in today’s world, teamwork is a must-have skill. At college, students attend lectures and parties, meet people and mingle, work on projects together, network, and build connections that can last a lifetime (and might help them land a job at some point). In short, the university is a great place to socialize and learn the unwritten rules of living among people, and it will almost certainly remain that way.

Fixing Education

What’s the primary purpose of educational institutions? As we established earlier, it’s to prepare young people for their future and help them get the best possible job. But what if modern universities focused on different goals (like supporting scientific research and discovery) and left preparing students for the labor market to specialized companies?

I have a background in Java development and have been teaching people how to code for many years. It started with helping family and friends (my first “client” was my sister). I came up with a system that lets people learn the basics in three months (25 four-hour lessons) and immediately get a job; all of my students found their first positions after taking the course. Later, I built my business on the same principles, and now thousands of people in different countries learn Java through my company. Moreover, 90% of my employees switched careers at some point after taking online courses. That’s why we believe in short-term, focused, goal-oriented online education. I’m sure it can take over some of a university’s functions (actually, it already does).

What’s crucial for an educational program, and what can universities learn from online courses? First, I’m sure that practice is the key. Most colleges and universities should revisit their courses and reorient them toward solving tasks. Of course, the tasks should be real, not artificially created just for the sake of learning (see the sketch below). Also, every class must have a clear goal, and every student should understand how the course will help them in the future.
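To make the distinction concrete, here is a minimal, purely illustrative Java sketch of what I mean by a “real” task. Instead of asking students to print numbers in a loop, it has them compute something a business actually cares about; the class, field names, and data are invented for this example, not taken from any particular course.

```java
import java.util.List;

// A toy version of a "real" practice task: compute total and average
// order value from raw data, the kind of calculation a junior developer
// might actually meet on the job. (Illustrative example only.)
public class OrderStats {

    // A small data carrier for one order (Java 16+ record).
    record Order(String customer, double amount) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("Alice", 49.99),
                new Order("Bob", 120.00),
                new Order("Carol", 15.50));

        // Sum the order amounts, then derive the average.
        double total = orders.stream()
                .mapToDouble(Order::amount)
                .sum();
        double average = total / orders.size();

        System.out.printf("Total revenue: %.2f%n", total);
        System.out.printf("Average order: %.2f%n", average);
    }
}
```

Even a toy task like this forces students to model data, use the standard library, and produce an answer someone would actually want, which is exactly the habit artificial exercises fail to build.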

Flexibility is another key to transforming higher education. The world is changing, and education should follow suit. Otherwise, it will prepare students for a world that no longer exists.

Let’s not forget about motivation. I won’t tiptoe around it: traditional education is often boring, full of long, monotonous lectures and unnecessary reading, with little feedback. Such an environment doesn’t make learning fun or easy. Today, universities communicate the message, “We don’t care if you’re motivated. That’s your problem.” Maybe it’s time to change that approach: for example, by adding elements of gamification and storytelling, or by focusing on the joy people get from studying. A sketch of one such element follows.
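As a rough illustration (not a description of any existing platform), one gamification element can be as simple as awarding experience points for completed tasks and announcing a new level at fixed thresholds. All names and numbers below are invented for this sketch.

```java
// A deliberately simple sketch of one gamification element: students earn
// experience points for completed tasks and level up at fixed thresholds.
public class StudentProgress {

    private static final int POINTS_PER_LEVEL = 100;

    private int experience = 0;

    // Award points for a finished task and announce any level gained.
    public void completeTask(int points) {
        int levelBefore = level();
        experience += points;
        if (level() > levelBefore) {
            System.out.println("Level up! You are now level " + level());
        }
    }

    // Levels start at 1 and grow every POINTS_PER_LEVEL points.
    public int level() {
        return experience / POINTS_PER_LEVEL + 1;
    }

    public static void main(String[] args) {
        StudentProgress progress = new StudentProgress();
        progress.completeTask(40); // an easy exercise
        progress.completeTask(70); // a harder task; triggers level 2
        System.out.println("Current level: " + progress.level());
    }
}
```

The mechanics are trivial, but the immediate, visible feedback is exactly what long, monotonous lectures lack.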

Conclusion

The pandemic forced even the most old-fashioned institutions to adopt tools for remote studying, such as Zoom lessons, online discussions, and virtual whiteboards. But this is just the beginning: technology keeps changing, and universities should be among the early adopters if they want to create real value for the job market.

To summarize, I believe that universities must change to meet the needs of students, employers, and the labor market in general, or surrender to private companies that can do it better. For that to happen, time spent at university must be worth it. In other words, five years at a university should provide more value than five years at work.