Enter C++ and OpenGL. I chose these because they give me greater control over the graphics pipeline and allow me to write more efficient code (every developer's dream… right?) that can run at a larger scale. It does add an extra layer of complexity for sure – I need to be more careful in my designs and avoid issues such as memory leaks.
After repurposing some old code (my final project from University) and breaking it down from 3 dimensions to just 2, I had a nice little setup that can create polygons and render them to screen. Perfect, what's next? Well, instead of jumping straight into the neural networks, I decided to flesh out the parts that were lacking in the prototype.
First up is a User Interface! The prototype lacked any proper UI, and having one would be beneficial as there's a lot of information in the simulation that needs to be easily visualised. I need draggable windows, I need control boxes for different variable types, I need… a library. I came across ImGui and it was everything I wanted plus much more! Having a UI makes it much easier to debug and develop the system, and also makes it easy to visualise how the system works.
In the prototype the DNA was just a bunch of variables that could be mutated, and that was it. This time I want the genes to have more depth and to be more realistic (still only scratching the surface). You can see above how these genes are represented in the UI. Each trait of a creature, e.g. its size, comes from two genes. To determine what size the creature will be, we simply take the average of the two. On top of this I've implemented basic dominant & recessive traits. Simply put, if a creature's two genes differ in dominance (one dominant and one recessive), then we ignore the average and just take the value of the dominant one. All of these genes are stored in the genome structure, which is essentially just a list of genes.
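To make the dominance rule concrete, here's a minimal sketch of how a trait could be expressed from its two genes. The names (`Gene`, `expressTrait`) are my own and not the project's actual identifiers:

```cpp
#include <cassert>

// Hypothetical gene model: a value plus a dominance flag.
struct Gene {
    float value;
    bool  dominant;
};

// Express a trait from its two genes: if exactly one gene is dominant,
// its value wins outright; otherwise the trait is the average of the two.
float expressTrait(const Gene& a, const Gene& b) {
    if (a.dominant != b.dominant)
        return a.dominant ? a.value : b.value;
    return (a.value + b.value) * 0.5f;
}
```

So a creature carrying one dominant size gene of 2.0 and one recessive size gene of 4.0 ends up with a size of 2.0, while two recessive genes of 2.0 and 4.0 average out to 3.0.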
One of the key ingredients in evolution is mutation. Mutations on the genome allow creatures to evolve to be fitter (of course, mutations can also be detrimental). A mutation on our genome results in a gene shifting its value or flipping from dominant to recessive (and vice versa). This provides a nice fundamental basis for genes to evolve and create a more diverse set of creatures.
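One way to sketch that mutation pass: each gene gets a small chance of either nudging its value or flipping its dominance. The mutation rate, shift range, and function names here are illustrative assumptions, not the project's actual values:

```cpp
#include <cassert>
#include <random>
#include <vector>

struct Gene {
    float value;
    bool  dominant;
};

// Walk the genome; each gene has a `rate` chance of mutating. A mutation
// either shifts the value by a small random amount or flips the
// dominant/recessive flag, matching the two mutation types described above.
void mutate(std::vector<Gene>& genome, std::mt19937& rng,
            float rate = 0.05f, float maxShift = 0.1f) {
    std::uniform_real_distribution<float> roll(0.0f, 1.0f);
    std::uniform_real_distribution<float> delta(-maxShift, maxShift);
    for (Gene& g : genome) {
        if (roll(rng) >= rate) continue;       // this gene stays unchanged
        if (roll(rng) < 0.5f)
            g.value += delta(rng);             // shift the value slightly
        else
            g.dominant = !g.dominant;          // flip dominant <-> recessive
    }
}
```

Keeping the rate low means most offspring resemble their parents, with the occasional surprise.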
I’ve implemented my own body generation algorithm (I came up with it myself and haven’t found anyone else using this method) which allows creatures to have differently shaped bodies determined by their genes. The way this is achieved is displayed above:
- Sweep 180 degrees around the origin and plot n points, where the distance between each point and the origin is determined using Perlin noise.
- Connect all these points together.
- Mirror the image and connect all the points together again. Why mirror? Well… symmetry occurs in nature.
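The steps above can be sketched roughly like this. Since Perlin noise isn't in the standard library, a caller-supplied noise function stands in for it here, and all names are my own illustrative choices rather than the project's real code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Sweep 180 degrees, sample a radius from the noise function at each step,
// then mirror the top half across the x-axis for a symmetric outline.
// In the real system the noise parameters would come from the genome.
std::vector<Vec2> generateBody(int steps, float offset,
                               float (*noise)(float)) {
    std::vector<Vec2> outline;
    const float pi = 3.14159265f;
    for (int i = 0; i <= steps; ++i) {
        float angle  = pi * static_cast<float>(i) / steps; // 0..180 degrees
        float radius = noise(offset + angle);              // distance from origin
        outline.push_back({radius * std::cos(angle),
                           radius * std::sin(angle)});
    }
    // Mirror the interior points (the endpoints already sit on the axis).
    for (int i = steps - 1; i > 0; --i)
        outline.push_back({outline[i].x, -outline[i].y});
    return outline;
}
```

With a constant noise function the outline degenerates to a regular polygon; feeding in genuine Perlin noise is what gives each creature its lumpy, organic silhouette.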
The parameters for the Perlin noise are kept inside the genes, which means they are open to mutation. The body will generally evolve slowly, though recessive genes mean some bodies can change suddenly. Noise parameters stored in the genes include the offset, the number of octaves & the number of steps.
This about wraps up all I’ve accomplished so far on this project. My plans moving forward are to try to figure out some methods for locomotion (I’m thinking about using Box2D) and to begin implementing the NEAT algorithm. As always, my code is up on GitHub and available to view.
I’m a software developer and recent graduate from the University of Hull. I’m fascinated by machine learning, artificial intelligence & procedural generation, and love sinking into exciting projects such as games, simulations & websites!