Sean Merricks

Designer | Programmer | Data Guy

Modelling

I love messing about with 3D modelling software, and I currently use Blender.

The Earth From Space


This one uses a bit of artistic licence as to how the Earth would actually look from space, including the Sun, Rayleigh scattering and the vividness of the street lights. The image was created using textures and light maps from NASA's database of aerial photographs, applied to a model.

After setting up the initial model I applied the texture of the Earth to the sphere, along with a bump map that gives the planet some height where needed. I then added a slightly larger sphere with the cloud texture applied and rotated it until I found a setup I was happy with. A further, slightly larger sphere carries a near-transparent blue layer to emulate the Rayleigh scattering of light through the atmosphere; again, not scientifically correct, more stylised.

The Sun was added as a light-emitting sphere on a separate render layer, which allows me to edit it in the compositor later on. The image is then rendered out and, in the compositor, I apply several blur effects to the little Sun sphere, giving it a haze around the edge as well as the appearance of Sun rays.

Also in the compositor I added a sharpen filter to crisp up the image, a barrel distortion to emulate a camera lens, and a touch of chromatic aberration, most visible on the lights at the right-hand edge of the Earth, which gives the effect of the photo having been taken through a less-than-perfect (and therefore more natural) lens. Finally, a small and subtle amount of lens flare was added to tie the composition together.

The Earth From Space
The Earth From Space 2

Basic Camera Track


A little video of my kitchen that I then camera tracked. Camera tracking is the process of designating points in the video that are followed from frame to frame. I chose some of the dots on the counter, a corner of my coffee machine, the top of the door handle, the light on the kettle, and so on.

Once I had ten points that were fully tracked, i.e. they smoothly followed the feature I placed them on from the start of the video to the end, the fun could start. From these ten points I designate a minimum of three that sit on the kitchen surface and three that sit on the back wall. Using these, Blender can attempt to calculate where the camera was in 3D space; this took around three seconds for the whole video, and the result had a very low solve error, which is ideal. What this means is that the software has taken the 2D points I tracked in the video and, knowing which three lie on the surface and which three on the wall, used them together with the other points to 'solve' where the camera moved in real life, translating the 2D tracks into 3D points.
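The 'solve error' a tracker reports is essentially the average reprojection error: how far the tracked 2D markers land from where the solved 3D points project back into the frame. A minimal sketch of that idea, using a simple pinhole camera and made-up point data (this is the general concept, not Blender's actual solver):

```python
# Sketch of reprojection error, the quantity behind a tracker's "solve error".
# The camera model and point data here are illustrative, not Blender's internals.

def project(point3d, focal_length=1.0):
    """Pinhole projection of a camera-space 3D point (z > 0) to 2D."""
    x, y, z = point3d
    return (focal_length * x / z, focal_length * y / z)

def solve_error(tracked_2d, solved_3d, focal_length=1.0):
    """Average distance between tracked 2D markers and reprojected 3D points."""
    total = 0.0
    for (u, v), p in zip(tracked_2d, solved_3d):
        pu, pv = project(p, focal_length)
        total += ((u - pu) ** 2 + (v - pv) ** 2) ** 0.5
    return total / len(tracked_2d)

# A perfect solve reprojects exactly onto the tracked markers: error 0.
points_3d = [(0.0, 0.0, 2.0), (1.0, 0.5, 4.0), (-0.5, 1.0, 2.5)]
markers_2d = [project(p) for p in points_3d]
print(solve_error(markers_2d, points_3d))
```

A low value means the reconstructed camera path explains the 2D tracks well, which is why a near-zero solve error is the thing to aim for.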

With the camera solved you can give the virtual camera in Blender a camera solver constraint. This means the calculated 3D location of the camera can drive a virtual camera around the 3D workspace in Blender so it interacts with a modelled scene. This allowed me to import my tree models and gate from a past project, along with some quickly made grass islands, and set them up in the 3D space. Now the virtual camera moves around the models in the same way I moved my phone camera in the original video. I can then render the models with a transparent background, as they would appear in real life, for each frame and overlay the result onto the original video, which is added as a 2D background.

The one thing I noticed after the video had finished rendering (around 12 hours of render time) is that I had not added a plane to the model for the work surface. This may not seem a big deal, but although the plane would not itself show up in the render, it would catch the shadows of the models, making them sit more 'in' the video. Not a big thing, but it really would have added a lot.

PBR Wall Texture Test


The following image is a single flat plane with a set of PBR ('physically based rendering') textures applied. In short, this means the textures render in a manner that is much more natural than with other texturing methods.

The image below is a render of a single blank plane that has the following image textures applied to it through a variety of render nodes:

  • Diffuse/Albedo texture for the basic colour running into a diffuse shader
  • Ambient occlusion texture multiplied down onto the diffuse texture
  • Gloss texture running into a gloss shader
  • Reflection texture to control the factor between the diffuse and gloss texture
  • Normal map to control the height difference across the plane
PBR Wall Test
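The reflection texture driving the blend between the diffuse and gloss shaders boils down to a per-pixel linear interpolation. A toy illustration of that mix (the RGB values are made up, and this is the concept behind the node, not Cycles itself):

```python
# Toy per-pixel version of a mix node: the reflection texture acts as the
# blend factor between the diffuse and gloss shader results.
# The sample values below are illustrative, not real texture data.

def mix(diffuse, gloss, factor):
    """Linear interpolation per channel: factor 0 -> diffuse, 1 -> gloss."""
    return tuple(d * (1.0 - factor) + g * factor for d, g in zip(diffuse, gloss))

diffuse_sample = (0.6, 0.4, 0.3)   # base wall colour from the albedo map
gloss_sample = (0.9, 0.9, 0.9)     # shiny highlight contribution
reflection = 0.25                  # value sampled from the reflection map

print(mix(diffuse_sample, gloss_sample, reflection))
```

Dark areas of the reflection map leave the surface matte, while bright areas push it towards the glossy result, which is what gives the wall its uneven sheen.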

Zergling Head Sculpt


This piece was a test to further my progress with the sculpt tools in Blender. I started with a 1m block of 'clay' and shaved it down to the rough shape of the head. From there I dug out the eye sockets and carved out the scales and the mouth.

As separate pieces I created teeth and fangs as well as the spikes on the scales. The eyes were also added from separate spheres and given a glowy emission finish.

Using a mixture of clay strips and the pinch tool I then refined the model until I was happy with it as a training piece. The next step would be to add variation and proper textures, but this served its purpose for what I wanted to achieve.

Zergling head sculpt

Grass Path - First 'Photo-Realistic' Project


Featuring 10,000,000 blades of grass in a particle system, two modelled trees and a weather-worn fence, this is my first larger-scale 'photo-realistic' Blender project.

The scene features several elements that I have worked on previously on top of a particle simulation of grass set on a wavy floor to create a somewhat natural look.

The very back of the image is a photo I took in Wales recently just to set the render off.

Photo-realistic path through the grass.

Tree Modelling


As I think about moving forward with photo-realistic work, trees come to mind. They are big and complex by nature. Luckily there are tools in Blender that help you make trees within a defined set of parameters, which you can change to get the shape you want.

The Sapling add-on lets you create a near-infinite variation of trees with leaves, but it only gets you so far. The output, after tweaking your way through the hundreds of options, is a set of curves for the branches with square or hexagonal leaves.

From here it is all about modifiers and textures. After finding a seamless bark texture and a few photos of leaves, I created bump maps, translucency overlays and alpha masks in Photoshop. Back in Blender these are imported into the Cycles engine as textures and set up through a complex set of nodes that apply the textures in the correct direction, based on the geometry of the model at any given point, and make the result look as realistic as possible.

After making one I started on a second, so I would have a couple of trees on file for any future compositions that need some variety.

Tree 1

Photo-realistic tree

Tree 2

Photo-realistic tree 2

Squirtle Sculpt/Liquid Simulation


For a slightly different challenge after the glass of water simulation I decided to crank up the water pressure sideways and just see what happens. It kind of looked like a water pistol blast, which with all the recent Pokemon Go hype just led to me thinking of Squirtle.

I then had the problem of modelling something that didn't look terrible, which, having only dealt with low-poly models until now, was hard to get my head around. A quick look through the Blender docs pointed me to a sculpt mode that acts like traditional clay sculpting, with the added benefit of being able to undo. I started with a simple sphere, then carved out spaces for the eyes and thickened the eyebrows. Following the same process I hollowed out the mouth and filled out the cheeks.

The rest was simple texturing and composing lights and a backdrop to finish out the scene. I know it's by no means a great piece, and there is a lot that could be improved but the processes were all good practice.

Squirtle sculpture and water simulation.

Glass Of Water


Using the Cycles engine in Blender I started looking into water simulation and modelling. After making a simple glass-and-ice-cubes model, I set up a domain for the water simulation to run in, then configured the inflow and the obstacles.

The simulation had quite a long baking time as I ran it over a five-second time frame. When it was complete I chose one frame of the simulation whose composition I liked and extracted it as its own model in the scene.

The ice cubes looked a little flat, so I modelled a single tiny bubble and then configured a particle system that filled the ice cubes with bubbles of various shapes and sizes within a given set of parameters. A simple set of lighting and material changes later, and this is the finished result.
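The bubble variation amounts to random sampling within fixed bounds, much like a particle system's size and randomness settings. A rough sketch of the size side of that idea in plain Python (the count, ranges and seed here are invented for illustration, not the values I used):

```python
import random

# Sketch of particle-style size variation: each bubble gets a random radius
# drawn from a configured range, like a particle system's "size" plus
# "random size" settings. All numbers below are illustrative.

def make_bubbles(count, min_radius, max_radius, seed=42):
    """Generate bubble radii within the given bounds."""
    rng = random.Random(seed)  # fixed seed -> reproducible distribution
    return [rng.uniform(min_radius, max_radius) for _ in range(count)]

bubbles = make_bubbles(200, min_radius=0.001, max_radius=0.01)
print(min(bubbles) >= 0.001, max(bubbles) <= 0.01)  # True True
```

Keeping the seed fixed means the same scattering of bubbles comes back on every bake, which matters when you extract a single frame as a model.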

Glass of water with ice cubes rendered in Blender Cycles.

My First Animation


Although it is nothing special, I made a simple low-poly model of a boat tied up on a little island and animated it. I'm happy with how it came out and I learned a lot in the process.

Low Poly Mini Render & Model


I made a quick model of a Mini based on the one I used to own. One of the best cars I have had the pleasure to drive and I had fun modelling it.

Low Poly Mini Render

Fence Textures Case Study


A quick case study I went through: modelling a basic fence and then applying an increasingly complex set of textures to it over time.

AT-AT Walker


This was the first proper 3D model I had worked on in about 8 years, and I had just watched Episode 7. I had tinkered with a few quick pieces before, but that was just to get my head around the controls in Blender.

Looking at it now there are a thousand things I would change or model differently, but you have to get back on the horse at some point. I may go back and make it again at some point and texture it rather than leave it just as a basic model.