Screenshot 2023-03-30 011742.png

Developing the Project

Failed Agisoft Metashape Experiments:
Our next step was figuring out how to get the scans and point clouds of the locations we wanted. From my previous research I had a couple of methods in mind: the first was photogrammetry, the method from the tutorial where you take a video or a large number of photos and process them in Agisoft Metashape; the second was to create a 3D scan and convert it into a point cloud using CloudCompare. To get these 3D scans, I had seen that the EMS stocked 3D scanners, and the main one I thought would work was the Intel RealSense 3D camera (these seem to have been removed from the EMS store recently). I wasn't sure it was the right camera, so I emailed Connor, the person in charge of the EMS, but he took a long time to answer as he was on break.

While we were waiting for a response, we decided to try the photogrammetry method, so James went to the Barbican and filmed some interesting-looking parts. I took that footage into After Effects, conformed it to 2 frames per second, and exported it as JPEGs, then brought those into Agisoft Metashape and followed the tutorial to create a point cloud. This is where issues started arising. Firstly, some scans came out as completely unrecognisable messes of points. The ones that did work were somewhat deformed, which would have been fine and even interesting-looking, but they were also very short: for example, the part with the plants (see below) was a long stretch of road in the video, but the scan cut it short, which meant it would make an extremely short clip. The last issue was that even for the scans without these problems, I could not follow one of the steps of the tutorial, "building a dense cloud". We realised this was because I was trying to do it on a computer with integrated graphics; once we tried it on James's computer it worked a lot better, but it still had issues like before, where some scans would come back completely wrong, and new issues too, as building a dense cloud would sometimes delete the floor. I only had time to visualise one of these (shown in the next part) and animate a camera, as I was working on another project, and by the time I could work on it again we had already talked with Connor and figured out a better way to create the scans: the iPad's built-in LiDAR scanner.
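The frame-rate step is essentially just keeping every Nth frame of the footage; a rough Python sketch of that sampling (my own illustration of the idea, not how After Effects does it internally):

```python
def frame_indices(total_frames, source_fps, target_fps):
    """Which source-frame indices survive when a clip is conformed to target_fps."""
    step = source_fps / target_fps        # source frames per kept frame, e.g. 25/2 = 12.5
    n_out = int(total_frames / step)      # how many stills the clip yields
    return [int(i * step) for i in range(n_out)]

# A 2-second, 25 fps clip (50 frames) conformed to 2 fps yields 4 stills.
print(frame_indices(50, 25, 2))
```

The point of dropping to 2 fps is that photogrammetry wants distinct viewpoints, not near-duplicate frames, so most of the video is thrown away before Metashape ever sees it.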

Below are the renders of the point clouds I managed to make on my computer, which look terrible.

Swapping Workflow to Unity 3D:
After we got the improved scans from James's computer, I decided to use this chance to experiment and change my workflow from visualising in Unreal Engine to using Unity 3D. This was risky, as we were approaching the deadline and I had no clue where to even begin with Unity, but I decided to do it for a few reasons. Firstly, I was still not very happy with the results I was getting in Unreal Engine, and I had read that you can get a better look in Unity; it just requires some coding, though not too much. At the same time, the artists who had inspired us to try this both used Unity. The other reasons were that I had read Unity had better performance, I wanted to do some of the animating at uni, and I wanted to try animating the points so they move.

The first way I tried to use Unity was with the plugin that the Unreal Engine tutorial mentioned, and as a general guide for how it works I used this tutorial: Point Cloud Visualization 01: Unity 2018.4, PCX, FPSController. Whilst this did work, it didn't look like I wanted it to, looking like this:

Screenshot 2023-03-26 180158.png

The next method I tried was following this tutorial: How to make a Point Cloud Renderer using Unity VFX Graph, which used VFX Graph and code to create a shader for the point cloud. Whilst I was looking into how to work in Unity, I found that RubenFro, one of my inspirations for this project, also used a custom shader for his work, so I felt this was the right direction (I have lost where I saw him say this; I just remember he replied to a comment asking him). I followed the tutorial closely, wrote the same code he did, and fixed some issues where I had misspelled things, but in the end it didn't work at all. The main problem was that whilst I could copy the code, I had a very limited understanding of it, as this was the first time I had really tried to use Unity (there were other attempts when I was a kid, but they didn't help, as I gave up quickly back then and retained none of the knowledge). This meant I didn't know how to use the code to import my own point clouds and, more importantly, I could not fix any of the errors that popped up. I even tried simply copying and pasting the tutorial's code, which he had in the description, into my own project, but that also didn't work, which I realised was because we were using different versions of Unity, creating a lot of inconsistencies in how the VFX Graph code worked. I realised the issue was the program version after I tried the first method again: the plug-in from the first method came with some preset projects showing examples of how it can be used to animate a point cloud, but when I opened these projects in my Unity version they wouldn't work. After a lot of troubleshooting I found the version of Unity the plug-in had been made for, opened the presets again, and they started working.
I couldn't really use this, but it was extremely useful to know that this is an issue, and how to fix it, moving forward.

After a lot of searching the internet, at this point thinking of going back to Unreal Engine, I found a workshop by an artist named Yuma Yanagisawa in which he presents how to use a shader he created. [Workshop on Unity VFX Graph - Point Cloud + 2D Texture Sampling] The audio was pretty bad, but I understood how it worked for the most part, and so I was able to use his code and project and make them work for my idea. The video below is the product of a couple of hours of tweaking the settings on the VFX Graph, as he showed in the workshop, and changing his shader to fit my idea: for example, I made the points bigger; I made them all face the camera so they look like square pixels when they are really cubes; I removed the extra particles he had; and I messed with the lighting, the shine of the points, and the movement, until I had something I was happy with. Seeing it rendered, there is a lot that could be improved: the points have a lifespan that is too short for the style we're going for, and the points are too big, as my VFX teacher Zak pointed out that big points look goofier and not what we wanted to achieve. However, this was a lot of progress, and it was very good to have a template I could now work from.

Soho/Photographer's Gallery iPad Scan Success:
After we got a response from Connor, he told us that rather than the Intel RealSense, the best choice was a Structure scanner and an iPad. As we were still not happy with the results from photogrammetry, we decided to book these and see if we could get a better scan. When James went to pick up the equipment, he talked to Connor in person and was able to explain our project better (we had tried to do this multiple times but could never find him in the EMS). Connor recommended we use one of the newer iPad models they had just gotten in the EMS, which were not on the store website yet, and an app called Polycam, which uses the LiDAR sensor that new iPads come built in with.

To test these out we took the iPad to Soho, as we also wanted to visit the Photographer's Gallery for research, and created some scans in and around the gallery. The results were exactly what we wanted; there were still some small issues, like the range of the scanner, but we could work around those.

I then took these scans into Unity, used the shader/method I had found in my previous experimenting to visualise them, and took the chance to experiment with some ideas. Firstly, with the street scan I added some particles that would drift off of the model, as I felt it was too flat; these were also more metallic, so they shined a bit. I also tried an idea I had seen from Benjamin Bardou, where in his scans he has bigger particles around the point cloud that move opposite to the camera, creating a parallax effect, which he uses to cover imperfections in the scan like holes or even the sky. It seems he has his own code for this, and I didn't know how to do it, so the only way I could think of was to duplicate the particle system, give it far fewer particles, flip it 180 degrees so it sits in the sky, and then animate it moving opposite to the camera direction. This ended up not looking great, but it was worth a try, and I feel I could use it with some tweaks. The second thing I tried was on the stairs scene: I used a similar particle idea, where particles would move off the cloud, but this time I gave the movement more intensity, less frequency (upping the frequency on the previous one made the points move sporadically and randomly rather than in a certain direction) and a different turbulence mode (this is how the points move, using an Update Particle node with turbulence), which made them move in a certain flow, almost a spiral; this was inspired by some of Benjamin Bardou's later works. But after talking about it with James, he wasn't a big fan, as he liked a more simplistic and professional look, so moving forward I still used the effect, but in more subtle ways, to find an in-between of what he wanted and what I wanted.
Lastly, I used this experiment to work on my camera animation and improve it by slowing it down a bit (even though it was still very fast), working with the animation speed graph, and trying to create possibilities for transitions, like making the camera spiral at the end, with the idea that I would continue the spiral in another clip and transition between them with a match cut. This still needed improvement, but it was a good amount of development, and I kept improving it later.
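The parallax trick boils down to moving the duplicated sky layer opposite to the camera, scaled down; a minimal Python sketch of that idea (the function and names are mine, the real version was animated in Unity):

```python
def parallax_offset(camera_delta, strength=0.5):
    """Offset for the duplicated 'sky' particle layer: move opposite to the
    camera, scaled down, so the layer appears to sit far away (parallax)."""
    return [-axis * strength for axis in camera_delta]

# Camera moves 2 m right and 1 m forward; the sky layer drifts the other way, halved.
print(parallax_offset([2.0, 0.0, 1.0]))
```

A strength below 1 is what sells the depth: the further away something should feel, the less it moves relative to the camera.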

Barbican Scans & Visualization:
As I was busy, James got the scans of the Barbican alone and sent them to me; I then put them into Unity using the technique I had learned and visualised them. I started by rendering just one of them to send to James and see how he felt, as I was trying to implement some of my ideas from the previous experiment, like the "sky" particles, which I fixed by not animating their location and just making them move around a bit in place. I also felt the point cloud was too static and simple compared to Benjamin Bardou's work, and I fixed that by adding the extra particles that float off of the point cloud, but with a more subtle look, so that there was a moving element. Another reason for this was that in Benjamin Bardou's work the point cloud on the sides is denser, since he films small street shops, but because of how the Barbican is laid out we didn't have the same walls to work with, so I used these techniques to fill the empty voids. James was happy with these additions, so I moved on and did most of the rest. Throughout this project I continuously asked James what he thought of the work I was doing in Unity, as we were equal partners and I wanted a final product that was true to both our visions.

After that I simply repeated the process for each scan, with minor differences for each of them: for the tunnel scene I added my own light sources, because I felt it added a lot to the shot, and the statue corridor needed to be pieced together, as it was two separate scans because Polycam couldn't capture the whole thing in one. Some more general changes were the sizes of the points and how the extra particles moved. For example, I wanted to keep the church scene purer, so I made the movement of the extra particles much less intense. I also had to do this because of the way the church camera was animated: while most clips were one shot, the church was three shots in one go, with the camera moving very fast from each end location to the next start location (something I would simply fix in editing), and because of this the particles would keep rising throughout the whole take and turn into a mess of pixels if the movement was too intense. Animating the camera was the hardest part of the Barbican work, as it was very easy to break something and have the camera start doing things you didn't want, or to make it move too fast, since it was hard to gauge the speed when all I had to go off of was the number of frames a move would take.
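Since the only control was how many frames a move took, one way to sanity-check the pacing is to convert frames into real-world speed; a small helper I could have used (illustrative, not part of my actual Unity project):

```python
def camera_speed(distance_m, frames, fps=30):
    """Metres per second for a camera move covering distance_m in `frames`
    frames. A walking pace is roughly 1.4 m/s, so anything far above that
    will read as too fast for a first-person shot."""
    return distance_m / (frames / fps)

print(camera_speed(7.0, 150))   # 7 m over 5 seconds of footage
```

Working backwards from a target speed to a frame count is the same formula rearranged, which would have made the back-and-forth over pacing much quicker.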

There was a massive issue I had to solve: for some reason the lighting of the scenes would get cut in half, so everything behind the camera would not be lit. This was fine at first, but for scenes like the tunnel, which curve at a certain point, it was visible on camera. After a lot of troubleshooting I figured out it was the directional lights I was using, which worked in a very strange way: they would light the whole scene, but only in one direction. There weren't any better lights to use in Unity, but what I could do was duplicate the light and turn it 180 degrees, which lit the back. However, there was still a line in the middle between the two (quite visible in the church scene), and the way I managed to fix that was to add two more lights at 90 and -90 degrees. This still wasn't perfect, as the line was still slightly visible, and all I could do was try to make it as unnoticeable as possible, which was very hard: it wasn't just a case of duplicating the lights, as the side lights had to be brighter, but not so bright that the line became a bright line rather than a dark one. A lot of my time went into fixing the lighting, but I got it to a place I was happy with, where it was not very noticeable. I feel this was a brute-force fix, but I could not find any information online about anyone having the same issue, and I was running low on time, with the deadline on the horizon.
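The workaround amounts to a ring of four directional lights around the vertical axis with dimmer sides. Sketched in Python (the actual setup was done by duplicating lights in the Unity editor; this only shows the geometry and the intensity balance):

```python
import math

def ring_of_lights(front_intensity=1.0, side_intensity=0.6):
    """Four directional lights rotated around the y axis: front and back at
    full strength, the two side lights dimmer so the seam between quadrants
    reads as neither a dark line nor a bright one."""
    setup = []
    for angle_deg, intensity in [(0, front_intensity), (180, front_intensity),
                                 (90, side_intensity), (-90, side_intensity)]:
        rad = math.radians(angle_deg)
        # Unit direction each light faces, on the horizontal plane.
        direction = (round(math.sin(rad), 6), 0.0, round(math.cos(rad), 6))
        setup.append({"dir": direction, "intensity": intensity})
    return setup
```

The side intensity is the tuning knob described above: too low and the 90-degree seams go dark, too high and they flip to bright lines.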

Rough Cut:

Camera Animation in Unity 3D:
To figure out how to animate the camera I used this tutorial:
How To Animate Your Camera In Unity, which was very straightforward, and I got the hang of it pretty quickly. However, moving from Unreal Engine to Unity initially took some adjusting: in UE you could move the camera easily with the WASD keys, which felt very intuitive, whereas here it had to be positioned manually, which was a bit harder. In addition, I realised that in Unity it is especially important to use the animation curves, which work like the graph editor in After Effects, as animations would otherwise move too slow or too fast, or be smooth or overshoot when they shouldn't, and the curves were the way to fix that. I didn't have a lot of experience with this, but as I was also learning to work with animation graphs for my VFX module, I could use what I learned from that for this project and vice versa. From a back-and-forth of animating and showing James, the main feedback I had to take into account was slowing the camera animation down a lot more; this was also the feedback my VFX tutor gave when I asked him.
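The shape of those curves is what makes a camera move feel natural; the classic ease-in-out that "smooth" keyframe tangents approximate can be written down directly (Python, purely for illustration):

```python
def smoothstep(t):
    """Ease-in-out: starts and ends with zero velocity, so the camera
    accelerates and decelerates gently instead of snapping to linear speed."""
    t = max(0.0, min(1.0, t))            # clamp to the keyframe interval [0, 1]
    return t * t * (3.0 - 2.0 * t)

def eased_position(start, end, t):
    """Interpolate a camera position along the eased curve."""
    s = smoothstep(t)
    return [a + (b - a) * s for a, b in zip(start, end)]
```

Overshoot comes from tangents steeper than this curve allows, and the "too fast in the middle" feel comes from the fact that an eased move has to travel faster at its midpoint than a linear one covering the same distance.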

When animating the cameras for the final animations of the Barbican and the church, I animated them in different styles to create more contrast and use the camera movement to create meaning. In the Barbican parts I took inspiration from the Barbican's brutalist architecture, which, from my research and from how it looks, is built on the idea of modern architecture that puts purpose at the forefront of the design. When looking at brutalist architecture you can see how it is made for a purpose, to house people; as I saw in my research, brutalist architects saw a house as only a 'machine for living in', so the designs aren't decorative. I tried to create the same feeling with my camera movements by having the camera emulate a first-person POV shot walking around the scenes. One of the ways I did this was by animating a head bob, moving the camera up and down on the y axis and repeating the keyframes; for the stair scene this was harder, as I tried to sync it with the steps of the stairs rather than a simple repeating animation. I did this because, at least the way I see it, when you are walking you are walking with the purpose of getting somewhere; at the same time this raises the question that, yes, the video is moving with purpose, but it has no destination to reach and no end. I wanted it to ask whether this modernity is a good thing, or whether it removes the meaning behind the purpose with which we move through it. I didn't want to be fully critical of brutalism; I just wanted to raise these questions and create these emotions/feelings through the video and camera motion.
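The head bob itself is just a repeating vertical offset; the looped keyframes are equivalent to sampling a sine wave (a Python sketch of the maths; in the project this was hand-keyframed in Unity):

```python
import math

def head_bob_y(time_s, steps_per_second=2.0, amplitude=0.05):
    """Vertical camera offset for a walking POV: one up-down cycle per step.
    For the stair scene the timing would instead be matched to each stair,
    rather than this fixed step rate."""
    return amplitude * math.sin(2.0 * math.pi * steps_per_second * time_s)
```

Amplitude controls how pronounced the walk feels, and the step rate is what had to change for the stairs, where the rhythm follows the geometry instead of a constant pace.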

For the church parts I used big, sweeping shots to signify how big the space is and how it is part of a bigger purpose. In addition, churches have very decorative architecture, so I used these shots to show the unique decorations, as opposed to the flatness of the Barbican's architecture. This helps create even more contrast and a more relaxing feeling when viewing it.

Editing in After Effects:
When editing the renders I got from Unity, I used After Effects, as I wanted to create effects and transitions using Pixel Sorter 2 and Displacer Pro, After Effects plug-ins my VFX tutor recommended because Benjamin Bardou also used them for his work. I also did the colour correction in After Effects, which was mainly just adding more contrast. To create the Pixel Sorter transitions I followed a tutorial and modified it so it looked better for what I wanted. I also used the Pixel Sorter effect in the scene with the statues: the corridor had not scanned properly, so I used it to stretch the pixels that had rendered, to make the gap less noticeable and look more intentional. Lastly, for the church parts I used only match-cut transitions and no effects, to differentiate them from the Barbican video.
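At its core, pixel sorting reorders runs of pixels above a brightness threshold along each row, which is what stretches bright streaks across the frame; a toy version of the principle (nothing like Pixel Sorter 2's actual implementation):

```python
def pixel_sort_row(row, threshold=128):
    """Sort each contiguous run of bright pixels; dark pixels stay put and
    act as run boundaries, which is what creates the streaky look."""
    out, run = [], []
    for value in row:
        if value >= threshold:
            run.append(value)            # part of a bright run: buffer it
        else:
            out.extend(sorted(run))      # close the run, emit it sorted
            run = []
            out.append(value)            # dark pixel passes through unsorted
    out.extend(sorted(run))              # flush a run that reaches the row end
    return out

print(pixel_sort_row([200, 150, 90, 250, 130, 60]))
```

Animating the threshold is what turns this into a transition: as it drops, more of the frame joins the sorted runs and the image dissolves into streaks.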

Exhibition and Set-Up:
We were originally planning to have the Barbican and the church in one video that would play one after the other, but at our tutorial before the presentation we had a discussion with Jonny and decided to change it so that we would have two videos playing simultaneously on loop, projected onto two walls with two projectors. James and I also decided to create two separate audio tracks, one per video, which would also play simultaneously and loop, with the idea that as you walked to different areas of the exhibition you would hear a different track. In practice this did work, but it could have been done better, as the space was quite small, so we couldn't place things very far apart. We liked the idea that since the videos and audio looped and were very different lengths, depending on when you saw the exhibition you would hear a different sound and the videos would be at different points, so your experience would differ from someone who saw it at another time. To set up the exhibition, the two projectors were already in place, as a lot of groups wanted to use them, and we used two media players that Jonny had to play the videos on loop. This was a bit confusing, as one remote controlled both players, and since both USBs had both videos on them, selecting different videos was awkward: at first I tried unplugging one of the USBs, setting one player up and then the other, which for some reason only worked half the time; the easier method I found was simply to cover one of the media players and control the other. James was in charge of setting up the audio, which we did by getting some speakers and plugging them into Jonny's laptop, since neither of us had one. The other issue was that we had two tracks but only one set of speakers, so James had to quickly edit the tracks together, with one track playing only from the right speaker and the other only from the left.
This was a big success, exactly how we wanted it, and from the feedback we got after the exhibition we felt we had accomplished exactly what we set out to do; the last-minute decision to present our work in this exhibition style helped a lot.

Set-Up:

image00001.jpeg

Progression from initial point cloud to final look:

It was very satisfying to see how we slowly developed and learned more during this project, and how the quality improved over time.

sJQoAj8s7o0.png
LZpuOwyM2yU.png
o5Dya4-ttYY.png

UwSEqbrnVFU.png
IjeH5lhUQ1E.png
