How Python Became the New C


If you are thinking, "what the hell, why is this guy comparing two completely different programming languages and paradigms?", you have every right to ask that question.

This article will help you understand a little bit about the history of programming and how we are in the middle of another shift in software engineering.

Even though it is very convenient to ignore the hardware running our code, to some extent we cannot, and should not, do that.

Back in the days of the first computers and microcontrollers, everything was dictated by the hardware architecture. Before C, we wrote code in assembly. What is even worse, every chip and every architecture had its own instruction set. It wasn't like today, where you learn to code in one language and can then do almost everything, and it's just a matter of which library you use. You actually had to know the instruction set of the specific chip, know its memory restrictions, since memory was very limited, and know a lot about the chip's peripherals to program them well.

I had the "pleasure" of working with assembly for three months, for one of my first clients, when we had to rewrite that code in C. The experience was very humbling and I learned a lot, but most importantly I started to appreciate C and higher-level programming languages. Back then, C felt like the high-level programming language.

With the invention of C in the early 1970s, humans learned how to detach themselves a bit from the computer architecture. The key phrase being "a bit". What I mean is that we learned to write code once and let compilers optimize it for a specific hardware architecture. Over the decades this became the standard, and nowadays someone who knows how to influence memory alignment in C with compiler directives, and knows what padding is, is considered an experienced low-level engineer.
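
If padding sounds abstract, here is a minimal sketch of what it means. Fittingly for this article, you can observe a C compiler's alignment rules from Python itself, via the standard ctypes module (the sizes in the comments assume a typical 64-bit platform):

```python
import ctypes

# A 1-byte char followed by a 4-byte int: the int must sit on a
# 4-byte boundary, so the layout gets 3 bytes of invisible padding.
class Padded(ctypes.Structure):
    _fields_ = [
        ("flag", ctypes.c_char),   # 1 byte
        ("value", ctypes.c_int),   # 4 bytes, 4-byte aligned
    ]

# The same fields with packing forced to 1: no padding is inserted,
# which is what compiler directives like #pragma pack(1) do in C.
class Packed(ctypes.Structure):
    _pack_ = 1
    _fields_ = [
        ("flag", ctypes.c_char),
        ("value", ctypes.c_int),
    ]

print(ctypes.sizeof(Padded))  # typically 8: 1 + 3 (padding) + 4
print(ctypes.sizeof(Packed))  # 5: no padding
```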

Most importantly, the change was gradual and progressed over the years alongside the expansion of electronics. Every device we use right now, whether it is a phone, a microwave, a smart band, or a car navigation display, has C code in its production codebase that was compiled into assembly, in other words, machine code.

Okay, Adam, great story, but what does this have to do with Python, and why are we talking about hardware so much?

I bet you know what's coming: artificial intelligence.

Over the last few years, Python has become the go-to tool for most engineering work, not only in AI but overall. It has a relatively easy syntax and a low barrier to entry. Infrastructure code is written in Python, automated tests are written in Python, website backends are written in Python, robotics software is written in Python, and desktop apps are written in Python. The point of listing all of these is to show how popular it has already become.

The next shift has already started, driven by advancements in AI. If we go through the list of startups funded over the last three years, we can see how things are changing. Apple announced AI in its phones, and the LLM companies are racing each other. Humanoid-robot startups are growing. We have companies that generate videos and pictures.

Everything is fueled by enormous GPU farms. Companies use data to train models on these GPUs, and that is why I believe we have entered a new age, one in which everything that gets written will be written at the next level of abstraction.

Python, despite its high-level nature and convenience, will still be influenced by hardware advancements, particularly in AI and machine learning applications. The rise of specialized AI hardware like TPUs and GPUs necessitates a deeper understanding of how code interacts with these components, even if we're not writing in assembly or C.
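
As a rough illustration (assuming PyTorch is installed and, optionally, a CUDA-capable GPU is present), the same high-level line of Python is dispatched to completely different hardware depending on where the data lives:

```python
import torch

# Pick the accelerator if one is present; otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# The same expression is dispatched to a hardware-specific kernel:
# thousands of parallel cores on a GPU, a handful on a CPU.
a = torch.randn(2048, 2048, device=device)
b = torch.randn(2048, 2048, device=device)
c = a @ b
print(f"matmul ran on: {c.device}")
```

You never write the kernel yourself, but knowing which device that one line lands on can change its runtime by orders of magnitude.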

As AI continues to evolve, the demands on software will increase, pushing us to rethink how we write and optimize code. Python's role in this new era is not just as a programming language but as a bridge between human logic and machine efficiency. The ease of writing Python code, combined with its powerful libraries and frameworks, makes it ideal for rapid prototyping and deployment in AI-driven applications.

Yet, just like with C, a new generation of engineers will need to appreciate the underlying hardware that powers their Python code. Understanding how to optimize code for AI processors, how data flows through neural networks, and how to efficiently manage memory and processing power will become crucial skills.
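
As a minimal sketch of what that can look like in practice (again assuming PyTorch and a CUDA-capable GPU), halving the precision of a matrix halves its memory footprint, and that is something you can measure directly:

```python
import torch

if torch.cuda.is_available():
    torch.cuda.reset_peak_memory_stats()

    # A 4096 x 4096 float32 matrix takes 64 MiB; cast to float16 it
    # takes 32 MiB and maps onto the GPU's tensor cores.
    x = torch.randn(4096, 4096, device="cuda")
    y = x.half() @ x.half().T

    torch.cuda.synchronize()  # wait for the asynchronous GPU work to finish
    peak_mib = torch.cuda.max_memory_allocated() / 2**20
    print(f"peak GPU memory: {peak_mib:.1f} MiB")
```

This is exactly the kind of trade-off, precision versus memory versus throughput, that the new generation of Python engineers will be reasoning about.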

In summary, the shift we are witnessing is not just about the languages we use but about the paradigms we adopt. Python is leading the charge in the AI revolution, but as with any technological leap, those who understand the fundamentals, from hardware to high-level abstractions, will be best positioned to innovate and excel. The future of programming is here, and it's an exciting blend of simplicity and sophistication, driven by the promise of AGI.

After all of this, I have one more lesson. Just as engineers back in the day could worry that compilers would make their knowledge obsolete, we can ask whether AI will make us obsolete. My answer is obviously no. It will change how we write code and how much of it gets produced. The role of the software engineer will evolve, but AI will not replace us. We will be replaced by other humans using AI.