The First Conference I Have Ever Attended – Experience at SIGGRAPH (Part 1)

HPC Support member Jack Chen reviews his first conference, SIGGRAPH, and offers insights into large-scale projects.

Jack Chen

Last week I attended the first conference of my life – SIGGRAPH (Special Interest Group on Computer Graphics and Interactive Techniques). It is THE conference for computer graphics (CG), a broad academic field that includes special effects, 2D/3D animation, VR, computer vision, and more. I have always been interested in the field, and I also want to find a specific path of interest to start a research project. Luckily, it took place in Los Angeles again this year, so I took the Metrolink to Downtown and attended the conference!

SIGGRAPH 2019

It was such a great experience and exceeded my expectations for a conference. I like SIGGRAPH mostly for its richness in content. When I first took a glance at the schedule, I was overwhelmed by how many activities there were. For 5 days in a row, there were often 3 to 5 things happening at the same time. Below are two screenshots of the Sunday schedule. You can see they all took place starting at 9:00 am. Of course, I was only able to be at one place at a time, so I had to choose carefully which ones to attend.

I was totally blown away when I entered the LA Convention Center, and I was really impressed that an academic conference could feel this way. They played super exciting music at the entrance, which made me feel as if the center of the world was inside the building.

So what kinds of programs does the conference include? In what ways is SIGGRAPH different from other academic events? Below is a picture of all the types of activities. You can see some familiar ones like courses, a keynote, talks, and technical papers. But there are also some peculiar ones.

Real-Time Live was my favorite event throughout SIGGRAPH. Instead of presenting their research with slides, the teams showcased their technology in real time and on stage. The projects chosen were the most graphical and mind-blowing of all. For instance, one team developed software called GauGAN that could transform a doodle resembling a six-year-old's drawing into a realistic image.

A huge area of the exhibition hall was devoted to VR. Each project had its own zone, and visitors lined up to give it a try. It was a bit surprising that they were using the same brand of VR headset found at the In the Know Lab in Pomona, except that some projects required wireless ones.

VR Theater was a huge circular area enclosed by red curtains. A big circular screen on the wall showed surreal starlight accompanied by music. Viewers wore VR headsets, sat in a circle, and engaged with five art pieces in a row.

Though I had tried a few VR short movies and games before, the projects at the VR Theater still expanded my understanding of storytelling in a visual format. However, I really wished there had been some interaction between viewers. It seems that we still have a lot to explore in VR art.

While at the conference, two projects caught my attention. One was an AR zombie-shooting game developed by a Japanese team: you wear an AR lens to see the zombies, and your back shakes a bit each time you fire or get hit.

SIGGRAPH also offered hands-on workshops in its exhibition hall, in an area cordoned off for around 30 computers. The sessions were mostly for companies to sell their software by showing how easily you can do a cool-looking project with it. I partly followed a UE4 camera-movement project and a professor's lecture about troubleshooting 3D printing. They were informative, for sure, but usually too long; I had too many activities to sit there for two hours.

All tickets came with access to the Computer Animation Festival. It took place in the Microsoft Theater, which has 7,100 seats, and on that day the theater was full. This year, the festival featured nominated works from both companies and students. The entries were hugely diverse in style and equally well made. In the end, the audience was asked to vote for their favorite on the SIGGRAPH mobile app. I was glad to see the Chinese music video win first place.

After the great show, the crowd was led to Xbox Plaza and served some nice food and drinks. I walked around and talked to people. Unfortunately, I was not bold enough to approach senior-looking tech workers from the companies, so I confined my discussions to student volunteers and college researchers. The plaza was a great sight when filled with brilliant minds.

Not only was the artistic side of SIGGRAPH fantastic, but the academic side was exciting as well. I will share my insights into the latter in the next blog post.

By Jack Chen

Experimenting with Nanome

HPC Support Staff and Pomona College student Ekeka Abazie reviews the research he’s conducted while using the Virtual Reality environment Nanome, focusing on how the program can visualize and render molecules.

Screenshot of me using Nanome on the tethered Oculus headset

I've made a poster presenting my summer computational chemistry research with the Chemistry Department, which I will present on Stover Walk at Pomona College as well as at the SACNAS conference in Honolulu, Hawaii. However, I'm also interested in presenting the work I did with Nanome at HPC during the Pomona College summer presentations on Stover Walk. After getting permission from Jorgensen, who is coordinating the poster presentation on Stover Walk, I am now hoping to present two posters: one based on my work with Nanome and another based on my SURP research.

It was difficult at first to decide what I would present on my Nanome poster. I originally wanted to use Nanome as a research tool, with the main focus on competitive inhibition of the enzyme complex Succinate Dehydrogenase in the Citric Acid Cycle. I had previously worked on a similar, more experimentally focused project with resources provided by the Biology Department, but this was going to be a different examination, based more on distances and modeling. After talking to Director Shklyar, I decided against this and opted for a poster that looked more at the capabilities of Nanome; in other words, focusing more on the VR software itself. However, after internalizing my conversation with Director Shklyar even more, I've realized that the blog posts I write about Nanome are sufficient to display my work on the project. We already have three posters that will represent HPC, and there will definitely be other opportunities for me to demonstrate my love of and commitment to my work at HPC. Nonetheless, I still learned a lot more about Nanome by getting into it with the intent to make a presentation, so I think that deserves some discussion.

Normal Glutamic Acid positioning (healthy patient)

I modeled the mutation behind Sickle Cell Anemia using Nanome. Sickle Cell Anemia is caused by a single point mutation at the 6th codon of the β-globin gene, which changes GAG, coding for Glutamic Acid, into GUG, coding for Valine. I thought this was a cool undertaking because, for me at the time, it represented a valuable use of VR in education. I'm imagining a genetics class that allows students to mutate sections of DNA and see how that would affect an organism. A whole class in VR working on mutating DNA molecules sounds like Ender's Game to me, which is super exciting.

Mutated Valine positioning (sick patient)
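As a quick illustration of the mutation described above, here is a minimal Python sketch. The two-entry codon table is, of course, just a tiny subset of the standard genetic code, included only to show the single-base change:

```python
# Minimal sketch: the single point mutation behind Sickle Cell Anemia,
# using a two-entry subset of the standard RNA codon table.
CODON_TABLE = {"GAG": "Glutamic Acid", "GUG": "Valine"}

normal_codon = "GAG"   # 6th codon of beta-globin (healthy)
mutant_codon = "GUG"   # single-base change at the second position (sickle cell)

# Find the positions where the two codons differ:
diff = [i for i, (a, b) in enumerate(zip(normal_codon, mutant_codon)) if a != b]

print(diff)                       # [1] -- exactly one position differs
print(CODON_TABLE[normal_codon])  # Glutamic Acid
print(CODON_TABLE[mutant_codon])  # Valine
```

A single changed base swaps a charged, hydrophilic amino acid for a hydrophobic one, which is what alters hemoglobin's behavior.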

I also modeled the binding of carbon monoxide to myoglobin, which causes carbon monoxide poisoning. Upon binding to the central Fe2+ ion of myoglobin, carbon monoxide turns the complex into carbonmonoxymyoglobin, which can't perform the normal respiratory functions of the standard oxymyoglobin complex; this leads to respiratory difficulty and, possibly, death. A pretty cool health-education application in my eyes. It is also worth mentioning the attention to detail in the visualization that Nanome provides for these molecules. It's astounding.

Normal Oxymyoglobin Fe Complex

Lastly, I learned that Nanome has a lot of functionality I didn't expect, such as running a multitude of commands or accepting different extensions and attachments to make things easier for the user. I find the extensions and attachments the most promising, because they extend the boundaries of what Nanome could be used for, similar to how games that accept mods can be used for so much more.

Aberrant Carbonmonoxymyoglobin Fe Complex

I also learned how to render uniquely modified molecules using Nanome. For example, I rendered a modified human Lactate Dehydrogenase in which the first 20 amino acids of each chain had been replaced with GFP (Green Fluorescent Protein) from the Aequorea victoria jellyfish. I wonder what else I thought Nanome couldn't do that it actually can.

– By Ekeka Abazie

Impressions of Nanome Curie

Ekeka Abazie uses Nanome to see molecules in a Virtual Reality environment. Here, he discusses his impressions of using Nanome Curie.

Right off the bat, I noticed a reduction in quality going from Nanome on the tethered Oculus S headset to Nanome Curie on the Oculus Quest headset. The best comparison I can make is between the original version of a game and the lite version you would find on a lower-tier console, which leads me to question whether the platform is partially to blame.

The graphics on Nanome Curie were very pixelated and blocky, making analysis very difficult. Further, there were slow reaction times and added vibrations, making the kind of careful analysis I was able to perform with Nanome on the Oculus S challenging. There were also significant delays, and at times the molecule would be pushed far into the distance, and I would have to reel it back toward myself as if I were fishing.

I wasn't even able to analyze the same molecules as I had with the Oculus S, because many wouldn't load properly. For example, when I tried to load graphene from the "Featured" menu, a text box with a green checkmark would appear indicating that the molecule had loaded, and then I would get an error message. For other molecules that I searched for in the database, I would often just get the error message right away.

I also noticed that I couldn't focus on just one molecule to enlarge or work with it. Any change in size I made to one molecule carried over to the others, which, yes, keeps everything to scale, but I had imagined that selecting a molecule would let me make edits specific to that molecule rather than to the entire workspace. This was particularly annoying when I had 2HBS, a rather large structure that takes up a lot of space, in the same workspace as a small hexacarbonyl ring: I would have to move 2HBS quite far away if I wanted to enlarge the hexacarbonyl and focus on it.

I spoke with Professor O'Leary in the Chemistry Department at Pomona College (I originally came to inquire about any past assignments I could use to test the VR software) about the feasibility of VR for molecule visualization compared to already dominant software on the market like SPARTAN, ChemDraw, or GAUSSIAN. He told me that he didn't see a need for VR in the market, because the present software seemed much better suited to the task and easier to work with. However, he said that he could see how VR would aid in the comprehension of molecule manipulation and in education, especially if you could input specific molecule files using a variety of different file extensions. This is something I mentioned in my review of Nanome on the tethered Oculus S, where I easily understood how I could upload files for molecules of interest and work with them on that platform. However, on Nanome Curie with the untethered Oculus Quest headset, I don't see how this same file upload process would work.

By Ekeka Abazie

Virtual Reality Tools at HPC Support Team’s Weekly Meeting

During HPC's weekly Friday meeting, Director of HPC Asya Shklyar transformed the entire meeting room into an interactive project space for HPC Support members. "Let's move the chairs to make room," Asya said, "that way we can use the board to document all the programs associated with different fields of study in VR." The students began moving chairs and working together to get the space functional. Soon, the meeting room became a VR workspace of its own.

Configuring Alienware laptops
Students interacting with Oculus Go, a wireless VR headset.


Students actively learned how to connect all the equipment properly and how to start engaging with each individual program. Asya then asked the students to organize the programs by field of study, ranging from Biology to Economics to Linguistics. The idea was to curate various VR experiences in preparation for a demonstration to faculty and students. Throughout the process, students learned how to troubleshoot individual equipment issues while also engaging with the programs associated with various fields of study.

Engaging with the VR experience.


Through the interaction, students had their questions answered about how to link the VR equipment properly and how to run the programs. In the end, it was a highly engaging Friday that saw participation from multiple students, including some who had never used VR before (like me).


Ekeka setting up the HTC Vive.
Asya overseeing the process.