
ayushajoodhea

Blog about my first animation module.

Reflection.

MY THOUGHTS AND STRUGGLES.

This project is one of my greatest achievements in my studies so far, and it tested me on many fronts. From the first submission, I realised that it was the most demanding module this semester. However, the amount of work was never a problem. Being a science student, I was fascinated by the VFX breakdown that our lecturer, Mr Jirad Jhuboo, displayed on the projector. It included many of my favourite MARVEL and DC films. It was at that particular moment that I became determined to produce something in a similar style. Little did I know how much work and effort the designers and animators of films like Transformers or Avengers put into those productions. I knew it was hard work, but one thing was still uncertain: HOW HARD IS HARD?!

I was soon to find out when the first assignments started to hit the pedestal. At that time, I had zero experience with the software to be used during my studies in Design. Consequently, I was intimidated by the standard of the class. It was my first semester and I was surrounded by classmates who kept talking about layers and renderings, which made no sense to me at the time. While others did montages for their thumbnails, I was sketching (and I am glad that I improved a lot), brainstorming and searching for inspiration on the Internet; one thing was for sure, I was not going to give up. I had many sleepless nights. I sometimes had health issues. Nonetheless, whenever I look back at the semester, I don't regret the time I spent pushing myself to the limit instead of sleeping, because right now I feel a great sense of satisfaction when viewing my final render. It does not match a great Michael Bay movie, but I don't recall ever pressing the replay button on one the way I have been doing with my own render just a few minutes ago.

In addition, the greatest struggle I had was filming on a green screen. I had to watch out for the lighting, the shadows and the camera angles, and make sure the subject did not go out of frame. My video is mainly a fight scene and I tried to make it as fast paced as possible. It was very difficult to film the scenes on a 3.5m x 3.5m green screen. My project would definitely have come out better if it had been shot in a real location, but I asked myself what my fellow classmates would do. I tried not to follow the masses, and it was quite a challenging but enriching experience. I also shared the knowledge I acquired about chroma keying with some friends. One thing I can be sure of is that I will never forget what I learnt, given how many times I had to redo something or correct my own mistakes.

THE THINGS I HAVE LEARNT.

After Effects was the software I hated the most, but once Mr Jirad taught us the basics, everything changed. It has been my favourite software ever since. I remember editing my Animatic for hours, taking everything into consideration, every little detail like a split-second delay in a pan. I can't wait to follow more advanced tutorials and apply them in my future assignments.

Moreover, I learnt the importance of brainstorming. I now make sure to brainstorm for as long as I can and pick at least the 20th idea. The realm of compositing is very fascinating as well. My way of viewing things has greatly changed, especially in the movie theatre. Everything I watch, any piece of motion graphics or even YouTube videos, raises questions. These questions are then answered by research. I have also learnt to look for visual inspiration. I remember playing the Hitman video made by Freddie Wong at least ten times. Although the cinematography and compositing techniques were top class, there was something else that kept me riveted to the video. I finally realised that it was the acting that made all the difference. This is what pushed me to make a sword fight the main focus of my composition, because my brother, who is one of the most reliable people I know, is a great athlete. I avoided the delivery of dialogue, because it is difficult to find someone who can at least stand in front of a camera to act, let alone deliver lines. To be honest, my final composition did not come out as I expected; however, the action is the one thing I really wanted to emphasise, and it finally paid off.

HOW I CONSTRUCTED THE PROJECT.

First of all, I had numerous ideas that came out of brainstorming. However, I had to look at how practical each idea was to realise. Inspiration came from Batman and Iron Man; however, I don't have a Wayne Enterprises or Stark Industries at my service, so I stuck to a 'low' budget sword fight with katanas meant for display and demonstration only. Having a rough image in my mind, I started storyboarding. I had difficulty expressing the visual images in my head, as I was not able to draw well. Thus, I decided to photograph my actors on a green screen (whose availability was not mandatory) and to use the pictures as my submission.

When I received my feedback, Mr Jirad told me that there was the possibility of drawing the backgrounds myself instead of merging two pieces of live footage together. He advised me to vectorise the background. I had trouble with the pen tool at that time, so I tried to do it in Photoshop instead, drawing with a graphics tablet. I was warned that it was not going to be easy, and after A LOT of practice, I was finally able to draw. Once the backgrounds were ready, I dived into filming. I did a few video tests and, in one day, accumulated ten hours of shooting with my actors. Their patience was DIVINE. I was passionate about what I was doing and so were they.

Once all the footage was done, I was advised to add a climax to avoid the typical mainstream fight scene. Mr Jirad gave me the greatest inspiration ever: ending the video with one of the actors waking up, so the whole fight would just be a dream. I gladly took it and meditated on it further. I believed it would work well to show the weaponry ending up with the protagonist, to close the piece in a dramatic manner. Last but not least, the footage was imported into AE and the final composition was obtained after an arduous and painstaking amount of time and effort.

P.S. I would like to thank my lecturer Mr. Jirad Jhuboo for giving such great ideas, advice, encouragement and support. He has been a guide, an inspiration, a mentor, Captain America, and he was with me until the very end. Gratitude, Sir.

VFX Breakdown.

Final Composition: The awakening.

iLecture #10. Typography in Motion.

This iLecture is the final lecture in the compositing series. Jarrad Gittos explains typography in composition and how we can use it to make movie intros and for our final project. The power of typography should not be overlooked. It can really influence the intro of a composition. It is not only words on the screen.

Typography is used in virtually every motion production, or any type of visual production. Usually, many trials are done to find what best suits the type of video being introduced.

Using juxtaposition, music, colour grading and timing alongside typography can make up a really powerful introduction.

One interesting method of introduction is the use of kinetic typography. This involves a lot of motion graphics and the creation of visually stimulating artwork. Basically, it is graphics aided by music to enhance communication.

iLecture #9. Colour Grading and Stylising.

Summary.

This iLecture is about colour grading. This compositing technique can raise the quality of a composition if done correctly. Colour grading is the adjustment of the colours across multiple layers of footage; this includes changing the contrast, brightness and so on. Colour grading can be just small tweaks in colour, but it can also be used to create a whole new style, to put emphasis on emotions, to enhance the footage or to give it a new atmosphere. Colour grading can be done digitally or chemically; nowadays, almost all colour grading is done digitally, to make multiple shots look similar in colour and contrast. Footage that has been colour graded well is definitely more appealing to the eye.

What I have learnt.

 

Colour grading is when the colour of footage or of a composition is corrected to give it a better look. Apart from brightness and contrast, adjustments like white balance or filtering of the RGB channels can also be carried out. Moreover, it is not only used to correct colour differences; it is also used in green screening, especially when there is green spillage on the actor.

Colour grading is also a technique that makes compositing a lot easier when adjustments to footage need to be made, and it saves us the time of re-filming.

Technical.

What makes colour grading possible? The fact that each pixel stores data about its brightness and RGB colour; the number of bits used to store this is called the colour depth. The standard bit number for colour nowadays is 24, along with 8 bits used for opacity.
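
To get my head around those numbers, I wrote a tiny Python sketch. This is purely my own illustration (it has nothing to do with how After Effects stores pixels internally): one 8-bit-per-channel RGBA pixel gives exactly the 24 colour bits plus 8 opacity bits mentioned above.

```python
import numpy as np

# My own illustrative example (not how After Effects stores pixels):
# one RGBA pixel at 8 bits per channel = 24 bits of colour + 8 bits of opacity.
pixel = np.array([200, 40, 40, 255], dtype=np.uint8)  # R, G, B, A

red, green, blue, alpha = pixel
print("colour bits:", 3 * 8)                       # 24
print("alpha bits:", 8)                            # 8
print("opacity:", round(alpha / 255 * 100), "%")   # 255 -> 100 % opaque
```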

Another important thing to consider when colour grading is that when footage is compressed, its bitrate is also reduced, and colour-graded compressed footage can look pixelated. To prevent this from happening, the uncompressed Animation mode can be used, and the video can be compressed afterwards, once the grading has been done at a reasonable colour range.

NOTE ON CORRECTING FOOTAGE:

The colours between shots must match each other, and the white balance tint must match as well, along with the lighting conditions. If a shot is too blue, some orange can be added, and vice versa. A little tweaking of the saturation can also be done.
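
Out of curiosity, I tried expressing the "too blue, add orange" idea as a small Python/numpy sketch. This is only my own rough illustration (a real grading suite does far more than this), assuming the frame is stored as float RGB values between 0 and 1:

```python
import numpy as np

def warm_up(frame, amount=0.08):
    """Rough white-balance tweak: push a little orange into a shot that
    reads too blue, by boosting red and pulling back blue slightly.
    Assumes `frame` is a float RGB image with values in the 0..1 range."""
    graded = frame.copy()
    graded[..., 0] = np.clip(graded[..., 0] * (1 + amount), 0, 1)  # more red
    graded[..., 2] = np.clip(graded[..., 2] * (1 - amount), 0, 1)  # less blue
    return graded

# Example: a flat, cool (blue-ish) test frame gets gently warmed up.
cool_frame = np.ones((10, 10, 3)) * np.array([0.40, 0.45, 0.60])
matched = warm_up(cool_frame)
```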

For grading, there are no specific guidelines, but some aspects must be kept in mind:

  1. Keep as much details as possible.
  2. The skin tones of a subject must be closely looked after. They must look natural.
  3. Use video scopes.

One should bear in mind that when compressing colour-graded footage, some of the detail in the video is lost.

iLecture #8. Special Effects in Compositing.

Summary.

 

This lecture is about VFX and how we can use them in a composition. It is important to note that Jarrad Gittos highlights the importance of making the applied VFX look realistic. I remember watching the VFX breakdown that Jarrad showed at the beginning of the iLecture in the first class we had this semester; I was so impressed and impatient to actually learn how to do it. Obviously, as we are almost through the semester, the VFX breakdown becomes much clearer, since we now have in-depth knowledge of the various compositing techniques. It is easy to identify when a scene is colour graded or when 2.5D layering is being used. Also, at the start of the semester, many of us would not have paid particular attention to the music used in this VFX breakdown; nonetheless, its importance is now known. This iLecture gives us a basis for creating our own effects without depending entirely on computer simulation or 3D.

What I have learnt.

 

VFX, which include particle-based effects, are the visual elements added on top of the original footage, i.e. the scene filmed directly with a camera.

One of the main advantages of using VFX is that it is safe: blowing up a real car in a scene has never been safe. Another is that it reduces production costs; people usually won't blow up a building for three seconds of footage.

How are they created?

Although they all end up being composited digitally, VFX can be produced in different ways:

  1. Matte paintings and stills.
  2. CGI.
  3. Models.
  4. Live action effects.

Paying attention to the small details: if this is done correctly, the special effects will look really realistic. The depth of each element is also very important; some elements can sit in front of the subject and others behind it. Creating a lightsaber is not enough: the subtle tint of light it casts on its surroundings must be included as well.

Further Research.

 

Digital FX: The emergence of digital FX is relatively recent; it surfaced in the 1970s. One of the first films that inspired many great directors to use digital FX is Star Wars, with its briefing scene.

Miniatures: A small model of something is filmed, then repositioned and scaled into a composition. This makes it really easy to create and manipulate effects like flames, fire or floods on a really cheap budget.

Matte Paintings: These are paintings done to create an environment that was not there initially. One way of using them is by painting on glass and placing the glass pane in the shot; this was first done in 1907. It was not as effective as what we have today, though. Matte paintings served much the same purpose that 2.5D layering does now. Nonetheless, the thought process behind actually creating matte paintings in the 1900s is very impressive.

Live Action Effects: These arise from video itself; the effects are generated from footage. Real flames, real explosions or real car crashes are shot. This footage can work marvellously well with clever planning and green/blue screening. Live action is there as an old-school method for those who do not want to generate effects on a computer alone. It is important to note that the effects generated from live action must be separated and isolated on their own layer. To make the task easier, effect elements that have already been shot can be downloaded and used; there is no point in always creating our own.

TECHNIQUES FOR COMPOSITING EFFECTS

 

The goal in creating VFX is for them to merge into live action footage so that everything looks like a single piece of footage. There are two ways of getting VFX onto a transparent layer: the first is Chroma Key and the second is by digital means.

Alpha Matte: An alpha matte is a record of the transparent pixels, or the transparency value of each pixel. Alpha mattes can be created from CG renders, from Chroma Keying or by rotoscoping solid objects, and are then used for cropping elements.
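
The way I picture an alpha matte being used is the classic "over" operation: every pixel of the foreground is mixed with the background according to its transparency value. Here is a minimal Python/numpy sketch of that idea (my own illustration, not how After Effects implements it), assuming float images in the 0..1 range:

```python
import numpy as np

def composite_over(fg, alpha, bg):
    """Mix a foreground over a background using an alpha matte.
    alpha is 1.0 where the foreground is opaque and 0.0 where it is
    transparent; all images are floats in the 0..1 range."""
    if alpha.ndim == 2:              # allow a single-channel matte
        alpha = alpha[..., None]     # broadcast it across R, G and B
    return fg * alpha + bg * (1.0 - alpha)
```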

Luma Matte: These are generated from the contrast between black and white. For example, if a picture contains yellow dots on a black background, and you increase the brightness and contrast so that the particles are no longer yellow but white while the black background becomes even darker, this information can be used to make the white bits visible and the black parts invisible. Thus, effects can be separated into different elements. Luma mattes are used mainly for VFX like sparks or smoke shot against a plain background; basically, anything that has some sort of contrast.
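
In code terms, I imagine a luma matte roughly like this: compute the brightness of each pixel and keep only the bright parts. Again, this is just my own sketch under simple assumptions (float RGB in the 0..1 range), not a production keyer:

```python
import numpy as np

def luma_matte(frame, threshold=0.6):
    """Build a matte from brightness: bright pixels (sparks, smoke on a
    dark plate) become visible, dark pixels become transparent."""
    # Rec. 709-style luminance weights, for a float RGB frame in 0..1
    luma = 0.2126 * frame[..., 0] + 0.7152 * frame[..., 1] + 0.0722 * frame[..., 2]
    return (luma > threshold).astype(np.float32)  # 1 = keep, 0 = discard
```

The matte this produces can then be dropped into an "over"-style mix like the one sketched above.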

Garbage Matte: In simple terms, it is a rough outline drawn around a subject that was shot on a green screen, so that the unwanted elements outside it are removed. They are not that precise but are very simple to use. When using garbage mattes, the foreground must be clear, and the actor must stay strictly within the area covered by the green screen from the camera's perspective.

iLecture #7. Tracking for Compositing.

This iLecture is about 2D and 3D tracking. As stated before, tracking is one of the five fundamental compositing techniques. Tracking is used almost every time and is complementary to the other compositing techniques, such as Chroma Key and Rotoscoping. This technique is much more recent than Chroma Key or Rotoscoping, which date back almost a century. Jarrad explains that tracking was initially used for military purposes.

What I have learnt.

 

Tracking is done when the digital compositing software uses an algorithm to follow a point from one frame to the next. Because of this, tracking markers need to hold a degree of uniqueness so that there is no problem following a specific point through the footage.
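
To convince myself I understood the idea, I sketched a very naive tracker in Python with OpenCV: grab the patch around the marker in one frame and search for the closest match in the next frame. This is only my own toy illustration of the concept, not the algorithm After Effects actually uses:

```python
import cv2

def track_marker(prev_frame, next_frame, box):
    """Follow a distinctive marker from one frame to the next.
    `box` is (x, y, w, h) around the marker in the previous frame."""
    x, y, w, h = box
    template = prev_frame[y:y + h, x:x + w]                    # the marker patch
    scores = cv2.matchTemplate(next_frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(scores)                      # best match position
    return best[0], best[1], w, h                              # new (x, y, w, h)
```

This also shows why the markers need to be unique: if several patches of the frame look alike, the best match can easily jump to the wrong spot.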

Types of tracking:

-2D tracking.

-3D tracking.

Both types of tracking are similar, except that 3D tracking requires an extra dimension.

 

2D tracking can be used to composite backgrounds into green screen footage, to replace flat, plane-oriented 2D objects, to point-track text, to make rotoscoping tasks a lot easier and, finally, to stabilise camera footage.

3D tracking is used to put 3D objects and animations into live action, to generate VFX, to attach 3D objects to objects that are already in motion, or to create 3D environments.

iLecture #6. Green Screening and Rotoscoping.

Summary.

 

This iLecture covers two fundamental techniques and skill sets that are important to know for creating a good composition. These are the first two of the five fundamental compositing techniques. According to the very first iLecture, compositing was defined as taking multiple elements and merging them together to create something with a new appearance. This is called layering. Moreover, to create effective layering, it is important that the layers are isolated from one another, as mentioned by Jarrad. Chroma Keying and Rotoscoping can do this isolation.

What I have learned.

  1. Chroma Key.

It is more commonly known as the Green or Blue Screen. Using After Effects, the green or blue colour is removed and something new, such as a background, is inserted as a second layer. Chroma keying looks quite simple, but there are many important things to take into consideration when filming on a green screen.

The quality of the camera can decide the quality of the final product; low-quality cameras make the procedure involved in Chroma Key a bit more difficult.

Although there have been great developments in technology, the Chroma Key technique itself remains the same. As mentioned before, better camera technology gives better keying quality, and software like Photoshop, After Effects and Premiere Pro have Chroma Key options built in.
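
To make the idea concrete for myself, here is a deliberately simplified Python/numpy sketch of a green-screen key: flag a pixel as "screen" when its green channel clearly dominates red and blue, and show the new background there. Real keyers are far more sophisticated; this is only my own illustration, assuming float RGB images in the 0..1 range:

```python
import numpy as np

def chroma_key(frame, background):
    """Replace strongly green pixels with the new background.
    Both images are float RGB in the 0..1 range and the same size."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    screen = (g > 0.35) & (g > r * 1.3) & (g > b * 1.3)   # "this pixel is screen"
    alpha = (~screen).astype(np.float32)[..., None]       # 1 = keep the subject
    return frame * alpha + background * (1.0 - alpha)
```

It also made the points below much more concrete for me: any pixel that picks up enough green gets keyed, whether you want it to or not.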

Some things to watch out for when Chroma Keying:

-Reflective items on the subject will reflect the green colour of the green screen. Jarrad explains that light tends to pick up the colour of the surface it hits; consequently, light that has bounced off the green screen and onto the subject transmits some of that green to the latter. Thus, there will be unwanted spillage on the subject.

-The subjects must not wear green, as it will also be keyed out.

-Use a blue screen when there is outdoor shooting that needs to be done.

-The screen must be evenly lit so as not to have any shadows; otherwise, there are different colour intensities on the screen, which makes keying difficult.

-The backgrounds must be filmed first so that it becomes easier to place the subject in the scene.

-The lighting for the actors and for the green screen must be set up independently.

  2. Rotoscoping.

In simple terms, rotoscoping is tracing over footage. It is used to create more realistic animation and is also used to hide or mask unwanted things in footage. The rotoscoping technique is achieved by tracing over live action, for example with the pen tool.

Rotoscoping can be achieved in After Effects by creating an empty mask and tracing over the footage. This traced material then becomes a matte: something that can be used as a mask, or used to apply certain effects to restricted areas.

iLecture #5. Video Rendering.

Summary.

 

This iLecture is about video rendering. Jarrad Gittos explains the format and settings that we should use for our final project. All video settings can be controlled in the software we are using.

What I have learned.

 

The first thing Jarrad highlights is that when editing videos, the files should be uncompressed. This applies to videos involving VFX, SFX or any normal video editing. The problem with compressing footage is that quality is consequently lost.

I have also learnt that whenever footage needs to be filmed, the highest resolution should always be used and worked with in the editing software. Only when the final product is finished are the settings adjusted as per the requirements specified in the brief. As a result, when the footage is downsized, the perceived quality is actually enhanced, instead of the usual notion of quality being lost.

However, one way to lose footage quality is by scaling it up, and this must not be done.

There are two different filming modes: PAL (25fps) and NTSC (29.97fps).

NOTE: A null object can be used to alter all the layers of footage a composition has at once, which eliminates the problem of having to adjust everything manually.

Audio levels are very important to consider as people are often misled when using earphones or speakers.

It is very crucial that the camera is set up with the correct settings first, because this makes the video editing process easier to carry out. The highest resolution the camera can offer must be used. If the composition has a slow-motion scene, it can be quite advantageous to film at 60fps.
