When someone asks you "Which programming languages do you know?" to judge your programming skill, it actually reflects the asker's skill: bottom-tier.
Why? Because once you learn the logic behind one programming language, you realize it applies to all of them, since all 'real-world application' languages are object-oriented. Only the syntax differs. So, if you are truly "skilled" in programming, you can learn any language in a day. I put this to the test recently by learning Java in a day and finishing a basic chess terminal application from scratch in three days.
"Which language should I learn then?"
Depends on the framework/engine you will use, which depends on what you want to make (or where you intend to work).
If you want to make a videogame for example, you have 3 choices:
- Godot: GDScript (Python-like)
- Unreal: C++
- Unity: C#
But if you want to learn programming for programming's sake, I suggest C. An oldschool choice, but it teaches you the fundamental programming logic without burdening you with classes, interfaces, polymorphism, and other bloated features. C also teaches you some low-level concepts like memory allocation (e.g. memory leaks) and serialization.
If you dislike C as a first choice, I suggest picking anything but Python and JavaScript, because declaring variables without a type makes types hard to understand for a new programmer.
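For contrast, here is what explicit types look like in a typed language; a trivial sketch, with made-up variable names:

```java
public class TypedDemo {
    public static void main(String[] args) {
        int score = 10;       // the type is stated right at the declaration
        String move = "e4";
        // score = move;      // would not compile: incompatible types
        System.out.println(score + " " + move);
    }
}
```

A new programmer sees immediately that `score` and `move` are different kinds of things, because the compiler forces the distinction.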
"If all languages are fundamentally the same, how can you judge the programming skill of someone?"
The simple answer is a portfolio. The long answer is skills that transfer to any language.
So, let me list the skills a programmer typically learns, from "basic" to "advanced":
-
Object-Oriented Programming
Understanding inheritance, encapsulation, and so on is greatly beneficial. The idea behind OOP is that it results in cleaner code than plain C. For example, with inheritance you can write the code in the parent and reuse it for all its children, so you type it only once, which also means fewer bugs. Sure, using C you can achieve "inheritance", but it would be a mess to write and debug, especially once you reach a `real-world application` with a nest of inheritance hierarchies. Basically, objects lead to a cleaner structure than plain procedural code, and hence cleaner code.
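As a minimal sketch (the class names here are hypothetical), the parent's method is written exactly once and every child reuses it:

```java
// Parent: moveTo is written exactly once.
class Piece {
    protected int x, y;
    void moveTo(int nx, int ny) {
        x = nx;
        y = ny;
    }
}

// Children: inherit moveTo for free, no duplicated code.
class Pawn extends Piece { }
class King extends Piece { }

public class InheritanceDemo {
    public static void main(String[] args) {
        Piece p = new Pawn();
        p.moveTo(3, 4);                      // reuses the code defined in Piece
        System.out.println(p.x + "," + p.y); // prints 3,4
    }
}
```

Fix a bug in `moveTo` once, and every piece type gets the fix.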
-
Design Patterns
Design Patterns are "convenience hacks": standardized techniques. You don't even have to learn them by name, but by practicality. A lot of design patterns are used by programmers who don't know their names, and that's normal. A good programmer doesn't just know and recognize a design pattern; they know when to use one (there is no silver bullet).
A classic design pattern is the Singleton. An easy test for a programmer is to ask their opinion on it:
- "singletons bad, NEVER USE!" -> newbie playing safe
- "singletons good, always use them" -> newbie abusing them, will get obliterated in a month
Singletons are ideal for entities that will definitely remain unique in the project, no matter the updates and maintenance. E.g. an AudioManager or a GameManager.
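A minimal Java sketch using the AudioManager example (the class body here is hypothetical):

```java
final class AudioManager {
    // The one and only instance, created eagerly.
    private static final AudioManager INSTANCE = new AudioManager();

    private AudioManager() { }            // private constructor: nobody else can build one

    static AudioManager getInstance() {   // the single global access point
        return INSTANCE;
    }

    void play(String clip) {
        System.out.println("playing " + clip);
    }
}
```

Anywhere in the codebase, `AudioManager.getInstance().play("click")` talks to the same instance; that global reachability is exactly why it gets abused.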
-
UML
When I speak of UML, I speak mostly of UML by sketch (class diagram), not anything else like UML by timeline (sequence diagram) or by blueprint.
So, I define UML as designing a system on paper or a whiteboard; having "finished" it before manifesting it in code. This usually includes the class names, the variables, the functions, and most importantly how they all link together. Of course, a UML diagram never translates 1:1 into the final program, but the more accurate and easier the transfer from paper to real life, the better you are.
-
Data-Oriented Programming
If you make a big project, you will realize OOP is bloated. There is an excellent video explaining why OOP is fundamentally flawed, which perfectly expresses all the problems I had while writing OOP, and even many I missed! I can blindly recommend it as one of the top 3 videos on programming of all time, even though it is lengthy.
I could write walls of text on why OOP is horrible, but that text will never be as good as the video above. Anyway, the solution to OOP is Data-Oriented Programming.
And more specifically...
ECS - The Bloat Killer
If you understand the concept of ECS, you can apply it to any system, even random OOP projects. And when I refer to the concept of ECS, I do not mean Unity's ECS or any other specific data-oriented language or framework.
So, instead of having classes which hold both data and logic, you split everything into the following:
- Entity: An empty instanced object. The only data it has is its unique instance ID, and which components it holds.
- Component: Essentially a struct, which holds only variables - a snapshot.
- System: Pure logic. It keeps a list of all the entities and components it is responsible for and applies logic to them.
The concept of ECS is essentially what all game engines use: an empty GameObject (usually with a Transform component), where the GameObject's ultimate behaviour is built by composing behaviours instead of single-inheriting one behaviour and slapping some interfaces on it.
And the System responsible for the logic of components of type X iterates over all Entities that have a component of type X. The components themselves hold no logic; the System does it all.
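The split above can be sketched in a few lines of Java (illustrative names, no specific framework assumed):

```java
import java.util.*;

// Entity: only a unique ID plus the components it holds.
class Entity {
    final int id;
    final Map<Class<?>, Object> components = new HashMap<>();
    Entity(int id) { this.id = id; }
    <T> void add(T c) { components.put(c.getClass(), c); }
    <T> T get(Class<T> type) { return type.cast(components.get(type)); }
    boolean has(Class<?> type) { return components.containsKey(type); }
}

// Component: variables only, a snapshot. No methods with logic.
class Position {
    int x, y;
    Position(int x, int y) { this.x = x; this.y = y; }
}

// System: pure logic, iterating over every entity that has the component.
class MoveSystem {
    void update(List<Entity> entities, int dx, int dy) {
        for (Entity e : entities) {
            if (!e.has(Position.class)) continue;
            Position p = e.get(Position.class);
            p.x += dx;
            p.y += dy;
        }
    }
}
```

Note how `MoveSystem` never cares what an entity "is", only whether it carries a `Position`; that is composition doing the work inheritance usually does.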
Complicated? Let me give you a real example: a chess game! In OOP, you would have the logic embedded in the pieces themselves, with each piece as its own class (e.g. Pawn, King, Bishop) inheriting from an abstract class Piece.
In ECS, the pieces have no logic on them; they carry only a component with the following two variables: color (bool) and piece-type (enum). The pieces don't even need a reference to the board/grid, or a Vector2 of the tile they occupy! Each turn, the system runs the logic for every piece on the board and, depending on the piece-type enum, applies that logic with static functions.
This makes for a more modular codebase, with a smaller (game) state, which is naturally cleaner to read and debug too!
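A sketch of that component and one of the system's static functions (the helper `slides` is my own illustration, not taken from the original chess app):

```java
// Component: exactly the two variables from the text, no logic.
enum PieceType { PAWN, KNIGHT, BISHOP, ROOK, QUEEN, KING }

class PieceData {
    final boolean white;     // color as a bool
    final PieceType type;    // piece-type as an enum
    PieceData(boolean white, PieceType type) {
        this.white = white;
        this.type = type;
    }
}

// System: all movement logic lives here as static functions keyed off the enum.
class ChessSystem {
    static boolean slides(PieceData p) {
        // Bishops, rooks, and queens move any number of squares; the rest do not.
        return p.type == PieceType.BISHOP
            || p.type == PieceType.ROOK
            || p.type == PieceType.QUEEN;
    }
}
```

Adding a new piece type means adding an enum value and a branch in the system, never touching the data layout of existing pieces.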
If you keep one thing from this article, keep this: separate your data from logic, and always start designing with the data first. Only when you have finalized your data, start designing the logic around it. Aside from modular code resulting in fewer bugs, there is a great benefit which is not obvious:
Most of the time, when you refactor your code, almost everything refactored is the logic (think of the functions processing the data or parameters).
So, the majority of your code (the data) will be left untouched by refactoring. The conclusion: effortless refactorings. By extension, this also lets you work with other programmers more easily, since the output of the system you are responsible for stays fixed and won't have to change heavily when the logic is refactored. Think of protocol outputs that are determined in advance.
ECS can be applied in any programming language, and I shill ECS because there is no more beneficial skill I have learned as a programmer. I knew the theory of ECS, just like I assume you do now, but once I coded it and truly grasped it, all the code I have written since feels cleaner and has fewer bugs.
How to git gud
Sure, you know the theory, but if theory alone were enough, everyone would read 10 books, watch some YouTube videos, and go work as a systems engineer for $100K a year. You must write systems, and when you finish them, identify what can be improved and try better alternatives. Of course, to write systems without a monetary incentive, you must be hyped to write them, so pick an interesting project where you can apply whatever new technique/practice/experiment you want. If you watch random YouTube videos or read tutorials and don't apply them (write code!), you will forget them in a few days, which gives you false confidence that you learned something useful.
Practice makes Perfect
I remember when I was first taught OOP, I thought I got it, until I was asked to write a small program with it (pretty much the factory design pattern), and that is when I noticed I hadn't quite understood it properly, despite being confident that I had.
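For reference, that factory exercise boils down to something like this (a hypothetical Shape example, not the program I was given):

```java
// Product interface and two concrete products.
interface Shape { double area(); }

class Circle implements Shape {
    final double r;
    Circle(double r) { this.r = r; }
    public double area() { return Math.PI * r * r; }
}

class Square implements Shape {
    final double side;
    Square(double side) { this.side = side; }
    public double area() { return side * side; }
}

// Factory: callers ask by name and never touch the concrete constructors.
class ShapeFactory {
    static Shape create(String kind, double size) {
        switch (kind) {
            case "circle": return new Circle(size);
            case "square": return new Square(size);
            default: throw new IllegalArgumentException("unknown shape: " + kind);
        }
    }
}
```

The point of the exercise: the calling code depends only on `Shape` and the factory, so new shapes can be added without changing any caller.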
I also remember my first implementation of ECS was a mess, resulting in a refactoring worse than the original OOP code I tried to replace. But when I refactored my 2nd system to ECS it was great, the 3rd was excellent, and so on. Trial & error solves all misunderstandings, because if you haven't accurately understood something in programming, it will run, but never correctly.