Robert Walker
On a modern computer most instructions take billionths of a second. You can run a loop a million times and still only take a millisecond or so.

Most of the CPU time is spent doing things like 3D graphics, or 2D graphics or streaming audio, or in code that others have written which your program calls.

So for instance if you are writing a 3D graphics intensive program, you'll use OpenGL or DirectX or some 3D engine. Most of the CPU time is spent doing the graphics - so it hardly matters what language you write in.

What does matter is that you call those routines in an optimized way.

In the earliest computer programs, you counted up the number of multiplications and tried to reduce them - that was before the fast multiplication algorithms, when addition was far faster than multiplication. Nowadays they take about the same time.

Also compilers are so good at optimizing the code that you don't need to give much thought to optimizing. Sometimes it helps, though.

E.g. if you have a routine that often calculates the same thing over and over, it can speed the program up massively to store the answer in a look-up table, and re-use the previous answer if the routine is called again with the same input. That of course only works if you can access the relevant part of the look-up table quickly - but if you can make a small enough table for that, it can lead to amazing speed-ups sometimes.

But - that sort of optimizing doesn't need a low level type programming language.

The sorts of optimizations that need you to code at a low level - most of those are now done by the compiler when it builds your program.

Other optimizations are done by the computer chip itself at run time.

The end result is that just about all the optimizations programmers used to need low level coding methods for are no longer needed. It's rare to need to code in assembly, or even C.

I write in C myself, at a low level. But that's just because I like working that way and can work as fast or faster with this low level approach as others who use high level languages - and accurately, with no more bugs than those who work at a high level - so I see no need to change.

But unless the high level language is very inefficient, or restricts what it lets you do, I don't see any performance reason to avoid one.
