The group Miro board. I kept forgetting to use it as the year went on; I was just too comfortable sharing my work in our group Discord, although it is much nicer to have a visual representation of your work in one place.
In the beginning, the three of us each came up with our own stories to quickly expand our ideas and inspiration. This strategy has served us well in previous years and still works because we each think creatively in different ways, which creates a wider variety of ideas and options and strengthens everyone's work. My initial thinking was to ask myself: what kind of impact can you really create with a short story, without the room of a long, slow-burning film for complex character development, backstory, and so on? I concluded that it needed to be funny or sad to make the most of the short length, and that it would help to have a valuable message.
I had never tried to make something sad before, so I thought it would be fun to give it a go. I also wanted to include a well-known and loved theme that really gets people excited: the “bad guy doing something good”, like the character Zuko from “Avatar: The Last Airbender”. I wanted the setting to involve robots in some way and to be post-apocalyptic. That was the basis of my story, so I needed some way to make my robot character bad, but not just any kind of bad; the most loved villains in the world usually have something in common.
“Villains aren’t born, they’re made.” This often comes in the form of a tragic backstory that makes the villain’s actions or goals feel “justified”. For my story, this was the violent destruction of the main robot’s innocent friend by the humans because of the fallout. Next I needed to make it sad, and having a character the audience could understand opened up a way to create that sadness through self-sacrifice. The reason I believe it would be sad is this: seeing a character who has been wronged, who has endured hatred, and who has revenge at their fingertips still choose to do the right thing at their own peril creates a very respected character, one the audience might feel something for when that character is lost. I planned for the robot to choose to save a human child from a collapsing “slum shack”.
These are some examples of the slum shack structures in which the people would live.
We later chose Ellie’s story about robots saving a pet creature using a magic flower, also set in a post-apocalyptic world but with more vibrant colours. After feedback, this was changed to exclude the multi-legged creature because it would be too much to animate. We kept the magical flower, and the goal changed from the little robot wanting the flower to save the creature to the little robot wanting the flower for his collection, while the other robots (mine and Daniel’s) tag along on his journey against his wishes. I like this story because the point is to show the social development of this grumpy, angry robot who learns the joy and benefits of interacting with others and receiving help from them, making up for his “short” comings.
We gathered references for the art style we wanted. I was really fond of the game “Far Cry New Dawn” because of its apocalyptic setting after a nuclear bomb and its extremely saturated colours from radiation affecting nature. I thought it would be cool if we could create large, mutated, colourful flora.
Another really nice reference I came across was an artist called Tokyo Genso, who does these really stunning depictions of an overgrown Tokyo in his book. I especially like the ones showing flooded areas.
https://allabout-japan.com/en/article/8294/
I painted this concept art of a diner buried under dirt after years of plate shifting, for a potential scene our robots might walk through. It was inspired by the film “The Day After Tomorrow”, which has a scene of a shopping mall covered by so much snow that the characters don’t even realise the ground they are walking on is a glass roof.
We really wanted a painterly look, but having never done one before, I was unsure what could realistically be achieved. I found this tutorial while researching painterly, anime-style trees and followed along a bit. I then subscribed to his Patreon to gain access to an example Blender file so I could take screenshots and quickly show the group an interesting method we could potentially use to create the trees.
Some common forest trees.
I found a plugin that calculates panel gaps for extra detail on the robot’s surfaces. We didn’t end up using it, and it might come in handy in the future.
My first robot concept was heavily influenced by construction, which allowed me to give it powerful tools and abilities that could have been required to overcome obstacles in the story.
My second idea was my favourite, which I might have been a bit biased towards because I wanted to make a cool robot. He is a military robot made for war inspired by Bastion from Overwatch, and he has a passive personality, making him ashamed of his intended function. I also thought it could give a slice of insight into a possible backstory of how the apocalypse came to be, without actually going into much detail. His main features are a large tank-type body with efficient hind legs and attached weaponry that he wraps a tarp around to hide.
I had another idea to create an animal-themed robot combining a gorilla’s body with a rhino’s head. I like the sketch I did, but I didn’t go much further with this design because I couldn’t imagine a reason for it to exist in our story.
We had our pre-production presentation, followed by feedback on our pitch and ideas. We received valuable feedback from our lecturers; the most valuable to me concerned my robot and why his concept would be confusing. My military robot had an interesting backstory with his covered weapons, but that backstory would never be explained in the story, so it would confuse the audience, who would be constantly distracted waiting for something to happen.
This led me to design a new robot that had a background in emergency services to link it to Daniel’s robot. I chose ambulance services and designed him as a support robot, large in size, capable of lifting all types of patients like a carer would. I also wanted him to be bubbly in appearance to make him approachable and put people at ease, much like Disney’s Baymax.
I started on the previs, making a rough layout of the garden area using Ellie’s sketch as a guide for the pathways.
I worked on the first half of the previs, working out camera angles and landscapes following the script on our group’s Miro board. When we came back, we had our presentation and got lots of great feedback on things to fix or change. Timings were off: some moments were held too long or too short, and some were completely unnecessary and would save us time and energy if we just cut them. Some camera movements were unnatural; we were told to treat the cameras as if they were real, to keep their movements subtle and less distracting.
Once I was happy with my blocking out, I started converting the pieces into usable meshes bit by bit.
Some of the pieces were complex in shape, like his chest, which made them difficult to make, but I really wanted to have good, optimised topology without cutting corners unless I really had to.
I wanted to minimise the gaps between the moving parts as much as possible to really show the effort I put into designing his parts, so that they belong and fit together like puzzle pieces, not random shapes.
The best example is the gap between his torso and legs, because there is little tolerance. I matched and aligned the contours so that the legs glide along the surface of the torso as far as I needed them to. This does restrict his movement in extreme poses, but he can still pull off most of what you throw at him.
Because I wanted him to look strong, I needed to incorporate hydraulics in his arms instead of the magic servos on the rest of his joints. I looked at heavy machinery like excavators for reference.
The boots and shin guards are inspired by Fallout power armor.
When I got round to unwrapping, I tried to keep each loose part as one piece to help later with texturing and keep everything tidy.
My first attempt at making ambulance-themed textures using colours from the local services.
Most of the rigging was straightforward and went smoothly, and it helped that Mike’s videos also used a robot. It was my first attempt at using damped tracks, and I’m happy with how they turned out, although there’s an issue I could never fully resolve: when my robot moves, the damped tracks lag behind. This mostly disappears when rendered, but the odd time it would happen even in the render. Everything I found online said it was caused by a dependency loop, which I tried to fix by adding more bones, but I still don’t think I’ve got it just yet.
I spent more time on the shed than I should have, time I could have allocated to something else. It was the first prop I made, and I wasn’t thinking about the priorities of what would and wouldn’t be seen. I thought, what if we decide to do a close-up shot inside the shed? This led me to go around bevelling a lot of spots to make the pieces of wood look separate rather than one piece. I was using Ellie’s sketch as a reference, and after discussing it with her, I moved the window to the same side for it to work with the end scene.
I encountered a problem in Substance Painter because of the way I unwrapped the wood pieces: all the horizontal pieces needed to be separated from the vertical ones so that their orientation would align with the wood grain.
The fixed UV map.
I painted green grass residue around the shed in places where grass would build up, like on the floor by the door, as if the little robot had walked through the grass as he came in and out.
Initially, we had a shared Excel sheet to keep track of the props and assets, but as we progressed, it was used less because we adopted a “make it when we need it” approach instead of creating props we thought we might need in the future. I also made Daniel panic when I listed a prop called “hanging rope” for a roll of rope I wanted to hang on the wall.
Some props I made for the Garden shed, and the hat for my robot.
I made these buildings for the background of some scenes by creating one floor and using an array modifier to stack it up as high as I wanted. I then duplicated the building, scaled it down a tiny bit, and applied a wireframe modifier to get the rebar effect, then used a boolean modifier with a random shape to cut into the building, giving it the destroyed look.
I designed this broken road asset as a universal tool that could be duplicated and placed in any scene we needed it in. After showing it to the group, they gave me feedback that it was too realistic, and I agreed. When making new things, whether 3D or 2D, I have a habit of getting tunnel vision and forgetting about the art style, which makes me default to a more realistic look.
As I have a background in welding, when making the metal flower that replaced the magic flower, I thought about how I would go about it if I were making it for real, because the robots would have to be able to make it. I made a washer and bent it, then duplicated it a bunch of times to form the head of the flower before placing it on a simple piece of rebar. I wanted it to have detailed welds, so I customised a brush to paint welds onto a high-poly version, then baked them onto the low-poly. They transferred over quite well, and for the texturing, I added a little blue and purple as if the metal had been heat-treated.
We needed buildings to fill up the scenery, but to stay flexible, I opted to make what I called the “ruins kit”: a collection of building pieces that could be arranged to form different buildings and help avoid repetition. To make them, I first modelled one brick and built up the walls and pillars with array modifiers, but these were not optimised, so I built low-poly shapes around them to the same dimensions and baked the high-poly onto the low-poly, as with the flower.
We had an issue unique to the garden hedges: viewed at an angle, the layered-planes method didn’t work, because the hedges needed to be uniform and the planes couldn’t simply be rotated to face the camera. I remembered Mike mentioning doing this to the tree leaves during our presentation, so I tried it with the hedges by selecting all the planes and randomising only their rotation, making the hedge readable from different angles. Viewed straight on from the side, some of the planes are quite obvious, but in the actual camera shots used in the animation, it didn’t seem to be much of a problem.
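In Blender I did this with Randomize Transform, but the idea itself is tiny: every hedge card just gets its own random rotation. A minimal sketch outside Blender, with illustrative names and values:

```python
import random

def randomise_card_rotations(num_cards, max_angle=180.0, seed=1):
    """Give each hedge card a random per-axis rotation in degrees,
    mimicking Randomize Transform with only rotation enabled.
    The count, range, and seed here are made up for illustration."""
    rng = random.Random(seed)
    return [
        tuple(rng.uniform(-max_angle, max_angle) for _ in range(3))
        for _ in range(num_cards)
    ]

rotations = randomise_card_rotations(50)  # one rotation triple per card
```

Seeding the generator means the same "random" hedge comes back every time the scene is rebuilt, which keeps renders consistent between sessions.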
I got to work on lighting as soon as possible, and I pushed for our group to use Eevee from the start for two main reasons. The first was that over the summer, Daniel worked on a small animation involving a campfire and a UFO. Nothing in it was very complex or intensive, everything was low poly and low res, yet the render times in Cycles were still hours long even with optimised settings. I didn’t want to be held back by long render times when we had to re-render to fix mistakes. The second reason was that newer versions of Eevee started supporting RTX ray tracing, and I kept seeing videos about making it look as good as Cycles, so it seemed like the obvious choice given our deadline and the unknowns of attempting things we’d never made before. Over the summer, I also learned about HDRIs and how easy they make setting up lighting. I searched Poly Haven for an HDRI that was only sky, with no buildings or structures, because I wanted to use it as our skybox, but we later swapped it out because it looked too realistic against our art style.
I tried to make a landscape around the garden; it looked terrible, was poorly optimised, and had many issues. It was very blurry, but if I scaled the resolution up, the tiling would show. I thought of finding a way to remove the tiling, like the video above, which uses a custom node made by the creator to blend and randomise the tiling a little. You then throw colour variation on top and it should look good, but it didn’t; it still looked and ran really poorly. So I switched to staging the scene for just what the camera can see, which I still struggle with, as I constantly catch myself forgetting and trying to build the entire environment. I really wanted what you get when playing a video game, a big open world you can move around in, because it would mean I wouldn’t have to set up backgrounds and props for every camera angle; they would just already be there. It’s made me really curious about how game worlds are built.
I experimented with backgrounds using landscape pictures I found online to show the group that we needed something behind our sets, and it gave an idea of what it could look like. From there, we discussed it; I really liked the forest in the background of one of Daniel’s references and asked Ellie if she could make something similar using her assets, in the form of a PNG for optimisation. Ellie did a great job making a forest that fits in really well as the background.
I kept working on the lighting, testing different objects like my robot and Daniel’s shop until I was happy with it. Then I applied ambient occlusion to one of Daniel’s shelves and showed the group so that we had a reminder to add ambient occlusion to our textures since the newer versions of Blender changed how it functions.
Our work was too realistic and not painterly as we wanted it, so I was experimenting with ways to achieve that. I tried noise and Voronoi textures on the normal map, thinking that if it worked, it could easily be applied to existing textures, but unfortunately, it didn’t look great and negatively affected how it reacted with lighting.
We went back and redid our textures in Substance Painter with an oil-painting filter and increased saturation for brighter colours. It’s really effective, and I think it looks great.
Fixing the dark lighting and swapping the intensive grass hill with PNG variants. It runs much better and looks nicer, but the downside is that only the foreground grass has wind animation.
The first animation I worked on was the walk cycle for my robot. Here, I focused on calculating the walking speed to eliminate foot slip. I had a problem using Rachel’s method at first because it requires foot roll, so the main foot bone can stay flat and you can measure the distance between the two contact points along the axis of travel; my robot didn’t have foot roll, and the bone is angled, so it couldn’t give a definitive measurement. The solution Rachel gave was to measure between the first flat-footed contact points instead, which worked well.
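The arithmetic behind eliminating foot slip is simple: the root has to cover the distance between two successive contact positions in exactly the number of frames between them. A minimal sketch with made-up numbers:

```python
def root_speed_per_frame(contact_distance, frames_between_contacts):
    """Distance the root must move each frame so the planted foot stays
    locked to the ground during the walk cycle (no foot slip)."""
    return contact_distance / frames_between_contacts

# hypothetical values: flat-footed contacts 1.2 units apart, 16 frames apart
speed = root_speed_per_frame(1.2, 16)  # 0.075 units per frame
```

Keying the root at that constant rate (with linear interpolation) while the foot holds its world position is what keeps the contact from sliding.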
This year, I think I’ve gotten a lot better at using the graph editor. I was able to adjust the walk to add a slight foot slam before each contact to emphasise his heavy weight. I built my robot with tight tolerances, but I didn’t want his movement restricted because of it, so just in case, when rigging I added individually moving hip bones for each leg that allow them to extend out, which I did end up using for his walk cycle.
It didn’t cross my mind how difficult it would be to animate paper. It took a lot of effort just to make this folding poster look acceptable. Because of the transition between taking the poster off the wall and folding it, I created two posters that seamlessly swap. This avoided conflicting shape keys and Child Of constraints; it was easier to have separate posters, one for pulling off the wall and the other rigged for folding.
The original forest path scene was quite heavy to run, so I made some optimisations and adjusted it for the camera, which I now understand better. The first thing I did was remove the grass hills and trees from one side, because I planned on sticking to the 180-degree rule for these scenes, so I didn’t imagine I would use them. I crafted the hill and small rock edge using Daniel’s rocks and an adjusted version of Ellie’s ground, then placed my road pieces to make it look like a severely shifted road surface. I removed any grass geometry that wasn’t in frame to make the viewport easier to run. I’m still unsure whether objects and entities outside the camera frame affect render times; I will look into that later.
A reference I shared for street-side building fronts that could go alongside the road in the apartment scene. We were talking about how we didn’t need to worry so much about the backs of the buildings because they wouldn’t be seen.
I experimented with a few ways to make rubble. I thought that if I could make a good enough base layer from a stone texture and scatter bricks on top, it would work, but it wasn’t enough; it all looked too flat and blurry. Maybe I will try again in the future by creating my own height map and seeing if that works.
For the apartment scene, my solution was to just scatter the bricks everywhere. It wasn’t amazing, but it was the best of my attempts at making rubble so far.
Decorating the apartment scene with my road pieces.
I used Photoshop to make the poster. Had we had the time, I would have liked a background that wasn’t so similar to the little robot’s garden, but thankfully it isn’t on screen for very long.
After feedback, we changed the story, removing my robot’s confusing interest in hats as the distraction and replacing it with a cardboard cutout of a brand-new, pre-apocalypse version of himself. I drew the ambulance and added the weathered look in Photoshop.
When it came to placing the skyscrapers, I realised I had forgotten to add overgrowth to them, so I just placed a bunch of small intersecting green planes.
I used the shrinkwrap modifier for the large search pulses, but because I used images for the landscape, I created a hidden mesh for them to stick to, making it look like the pulse travels over the hills.
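Conceptually, the shrinkwrap setup just snaps each vertex of the pulse ring down onto the hidden hill mesh. A toy sketch of that projection, with a made-up heightfield standing in for the hidden mesh:

```python
import math

def project_onto_surface(points, height):
    """Drop each (x, y, z) vertex onto a heightfield, like the Shrinkwrap
    modifier pinning the pulse ring to the hidden hill mesh."""
    return [(x, y, height(x, y)) for x, y, _ in points]

# hypothetical rolling-hills surface
hills = lambda x, y: 0.5 * math.sin(x) * math.cos(y)
ring = [(math.cos(a), math.sin(a), 0.0) for a in (0.0, math.pi / 2, math.pi)]
pinned = project_onto_surface(ring, hills)
```

As the ring scales outward frame by frame, re-snapping the vertices like this is what keeps the pulse hugging the terrain instead of cutting through it.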
I tried the same shrinkwrap method for the search pulses in the garden, but it didn’t work as well; it had issues following the surface of the hidden mesh. I tried increasing the geometry of both the torus rings and the ground, but it changed nothing. In the end, I used a bunch of shape keys to guide the pulses over the hedges.
It needed 33 shape keys.
I used the website Freesound for most of my sounds because it offers different levels of copyright-free sounds. I only used sounds tagged “Approved for Free Cultural Works”, which meant I was allowed to edit and use them without having to credit the creator, which some licences do ask for. I researched ways I could edit the sounds to further shape them into what I needed. I had zero experience with sound editing, so the first thing I tried was a website called Audiomass, hoping it would give me enough tools to tweak the sounds ever so slightly.
I got lost quickly and asked my brother for help because he has lots of experience with music. He uses software called Ableton Live, which is a bit too extreme for a beginner, but watching him use it did help me understand what I needed to change about my sounds. The MP3 below is what he quickly made from a description of what I wanted. I really liked it, but I couldn’t use it because I didn’t make it myself, nor did I have enough knowledge to explain how it was made.
I spent a day learning Audacity, which, from what I can tell, is the most popular free sound-editing software. I quickly grasped enough to do what I needed, opening up my possibilities massively. The sounds below are the before and after of a drill sound I downloaded and turned into the servo noise for my robot’s walk cycle.
I am a bit ashamed of using Clipchamp to put my sounds in, but I wasn’t confident I could learn another piece of software in the short time I had left, and Clipchamp is very user-friendly. I thought the best, most efficient way for us as a group to complete the sound was to switch from sections to objects: instead of each working on our own section of the animation, we allocated objects and did the sounds for that object throughout the entire animation. This way, there was no need to trade files to stay consistent. For example, I did the sound effects for my robot from start to end. I then threw in the starting music, the ambient background sounds, the pulse-wave sound and beep for the scanning search, and the metal-pipe impact noises when he gets high-fived.