Wednesday 16 March 2011

More Data

Over the last couple of weeks, I have had to divide my attention between multiple projects.

The 'Going Live' project has ramped up into production, with modelling and rigging now complete. Animation has started, and texturing is currently underway. My role as CG Supervisor has been demanding, as these stages have followed one another in quick succession. On top of this, I was also responsible for implementing a customised nCloth dynamics system for our 'character'. This was created alongside the rigging process to ensure that the two would work together happily, and after resolving a few technical problems, the system is now working nicely. My next task was to create and organise the appropriate render layers in Maya, ready for rendering and then compositing to take place (hopefully late this week or early next). Although I had worked with render layers before, this project requires more variants than I am used to, so it has taken a bit of time to configure and set up properly. Despite all this work taking more time than initially expected, the project has made good progress, and continues to do so.
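For the curious, the layer creation itself is simple even when the number of variants is not. A minimal MEL sketch (with placeholder layer and object names, not our production ones) looks something like this:

    // Minimal sketch with placeholder names, not the production setup.
    string $layers[] = {"char_beauty", "char_shadow", "env_beauty"};
    for ($layer in $layers)
    {
        // Create each layer empty, then assign geometry to it explicitly.
        createRenderLayer -name $layer -empty;
    }
    // Hypothetical example: add a character group to its beauty layer.
    editRenderLayerMembers "char_beauty" "characterGeo_grp";

The time goes into deciding which variants are needed and configuring each one, rather than the commands themselves.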

After my meeting with the mathematics division last week (and several more since), my cell visualisation workload has also increased. I have received new data from both students, and I am currently writing scripts that will translate these data sets into 3D scenes inside Maya.

The first new data set contains fibres (to be added to cells), described by xyz locations and xy rotations. I had not scripted rotation values before, so this was a good opportunity to expand my knowledge of MEL. I am still awaiting the full data set for this part of the visualisation, so will continue to work on it as the data arrives.
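Since the rotation scripting was new to me, here is the general idea as a minimal sketch, assuming one 'x y z rotX rotY' record per fibre; the values, the cylinder stand-in, and the names are illustrative rather than taken from the real data:

    // Sketch only: one assumed "x y z rotX rotY" record, with made-up values.
    string $record = "1.2 0.5 3.4 45.0 90.0";
    string $tokens[];
    tokenize $record " " $tokens;

    float $tx = (float)$tokens[0];
    float $ty = (float)$tokens[1];
    float $tz = (float)$tokens[2];
    float $rx = (float)$tokens[3];
    float $ry = (float)$tokens[4];

    // Stand-in fibre geometry; the real scenes may use something else.
    string $fibre[] = `polyCylinder -radius 0.05 -height 1.0`;

    // Position from the xyz columns, rotation from the xy columns.
    move -absolute $tx $ty $tz $fibre[0];
    rotate -absolute $rx $ry 0 $fibre[0];

The full script simply repeats this for every record in the file.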

The other data set adds oxygen density to a scene containing cancer cells and blood vessels. The file contains over 30 million lines of information and weighs in at around 800 MB, making it rather difficult to work with. I have tried different approaches to visualising this data efficiently, such as adjusting the transparency of cubes based on the density value, or scaling particle clouds. Unfortunately, with around 10,200 points per frame, these methods take far too long to calculate. I am currently testing a new method, which creates a single polygonal plane with the required number of vertices. The script then runs through each vertex and moves it along the y-axis based on its density value (between 0 and 1). A ramp shader then adjusts the transparency of the plane based on height, so that zero density is fully transparent. This creates white, cloudy patches where oxygen density is high. Although this still takes a long time to process, it is considerably faster than the other methods.
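The core of this method is only a few lines of MEL. Below is a simplified sketch under stated assumptions: one density value per line, a placeholder file path, and a guessed grid layout for the 10,200 points; the ramp shader itself is set up separately:

    // Simplified sketch: the path and grid dimensions are placeholders.
    int $cols = 102; int $rows = 100;  // assumed layout of the 10,200 points
    string $plane[] = `polyPlane -width 10 -height 10 -sx ($cols - 1) -sy ($rows - 1)`;

    int $fileId = `fopen "/path/to/density_frame.txt" "r"`;
    int $i = 0;
    string $line = `fgetline $fileId`;
    while (size($line) > 0 && $i < ($cols * $rows))
    {
        float $d = (float)$line;  // density value between 0 and 1
        // Raise the vertex along y in proportion to its density; the ramp
        // shader then maps this height to transparency (0 = fully transparent).
        move -relative 0 $d 0 ($plane[0] + ".vtx[" + $i + "]");
        $i++;
        $line = `fgetline $fileId`;
    }
    fclose $fileId;

Even in this simplified form, it is easy to see where the time goes: one move command per vertex, for every frame.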

Most of this work is still ongoing, and it has all 'arrived' at the same time, making it difficult to balance. Fortunately, I have been able to set aside extra time for these projects, so hopefully the worst of it is over now...

On a more exciting note, three of my videos were used at an event in Dundee on Saturday 12th March. The videos are 3D visualisations of mathematical models which are being used to predict cancer growth and development, and were developed in collaboration with a PhD student in the University's mathematics division. They were shown at an event called "Sensational Women in Science" as part of the Women in Science Festival 2011.

Also, some of the other data I am currently working with will be presented at a large conference later this year (in June), so I have a deadline which I can work towards.
