Seyoung Kim

CLASS 13 (12/15)

RealSound from Seyoung Kim on Vimeo.




CLASS 11 (12/1)



CLASS 10 (11/17)

The main object in my app will look like this (particles, shader objects, cubes, etc.).

At a concert hall or a venue like Terminal 5, people gather to enjoy the show.

Everyone is holding their phones up to take pictures and videos, blocking your view.

The user takes out her phone and starts the SoundViz app.

Screen-recording shot: everyone dances, and the object moves with the sound too.

The scene changes to show other situations where the app could be used, like people busking in the subway.




CLASS 9 (11/10)


CLASS 8 (11/3)

Troubleshooting
This week, I focused on finishing my MVP (Minimum Viable Product). In my case, making sound visible with a simple object is the MVP, so I tried to make a sphere grow and shrink depending on the volume. However, the beat detection I used last week had a problem: it only detects the "beat" (which is pretty obvious from its name). It worked like a drum, so the output was only energy, kick, snare, and hi-hat. I tried to modify the code a little, but the math around frequencies was too complicated and I failed to get the frequency range of each beat. I was also struggling to make a single cube react to all four kinds of beat and to transform its scale rather than its color.
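To make the goal concrete, this is roughly the behavior I was trying to get out of a single cube. The OnBeat() hook below is a hypothetical stand-in for however the beat-detection script reports its four beat types; it is a sketch of the idea, not that asset's actual API.

```csharp
using UnityEngine;

// Minimal sketch (hypothetical): scale one cube in response to any beat type,
// instead of changing its color. OnBeat() is a placeholder entry point that
// the beat detector would call for energy / kick / snare / hi-hat hits.
public class BeatScaler : MonoBehaviour
{
    public float punch = 1.5f;      // how much the cube grows on a beat
    public float relaxSpeed = 4f;   // how fast it settles back to normal

    private Vector3 baseScale;
    private float current = 1f;

    void Start()
    {
        baseScale = transform.localScale;
    }

    // Call this from the beat detector for every beat type.
    public void OnBeat()
    {
        current = punch;
    }

    void Update()
    {
        // Ease the scale back toward the resting size between beats.
        current = Mathf.Lerp(current, 1f, Time.deltaTime * relaxSpeed);
        transform.localScale = baseScale * current;
    }
}
```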






Shaders
Meanwhile, I decided to start studying shaders in Unity, since the visual part of my project is crucial, and I spent most of my week on this. After some research, shaders seemed like a nice way to add fancy effects to my objects, because I wanted the whole scene to be a little wobbly and convey a fluid feeling. I watched a lot of tutorials about shaders and practiced a bit. There is a lot more to study, but it was quite interesting. I decided to use a wiggly effect on the objects.
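I haven't written the final shader yet, so as a placeholder here is the same wobble idea done as a plain C# script that pushes mesh vertices along their normals with a sine wave; a vertex shader would do the same displacement on the GPU. The amplitude and frequency values are just guesses.

```csharp
using UnityEngine;

// Script-based stand-in for the wiggly effect: displace each mesh vertex along
// its normal with a sine wave. A vertex shader would do this same work per frame.
[RequireComponent(typeof(MeshFilter))]
public class WobbleMesh : MonoBehaviour
{
    public float amplitude = 0.05f;  // how far vertices move
    public float frequency = 2f;     // how fast the wobble oscillates

    private Mesh mesh;
    private Vector3[] baseVertices;
    private Vector3[] baseNormals;

    void Start()
    {
        // Work on the instance mesh so the shared asset isn't modified.
        mesh = GetComponent<MeshFilter>().mesh;
        baseVertices = mesh.vertices;
        baseNormals = mesh.normals;
    }

    void Update()
    {
        var vertices = new Vector3[baseVertices.Length];
        for (int i = 0; i < baseVertices.Length; i++)
        {
            // Offset the phase by the vertex position so the surface ripples
            // instead of breathing uniformly.
            float phase = baseVertices[i].x + baseVertices[i].y + baseVertices[i].z;
            float offset = Mathf.Sin(Time.time * frequency + phase * 5f) * amplitude;
            vertices[i] = baseVertices[i] + baseNormals[i] * offset;
        }
        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}
```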










* how my object looks without any input

Microphone Volume Input
As I said earlier, beat detection isn't the right solution for my MVP. So I found another code example that uses the microphone volume and applied it to my scene. Unlike the "beat", I could use the whole volume range and catch all the sound from 0 up to the maximum volume. I mapped the volume to the object's scale, and each volume range got a different texture. I used mostly watercolor textures for the material.
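Roughly, the mapping works like this; the thresholds, material slots, and names below are placeholders rather than the exact code from the example I found. An AudioSource on the same object is expected to be playing the microphone clip.

```csharp
using UnityEngine;

// Rough sketch of the volume-to-scale mapping (values are placeholders).
[RequireComponent(typeof(AudioSource))]
public class VolumeScaler : MonoBehaviour
{
    public Material[] rangeMaterials;   // e.g. watercolor textures, one per loudness range
    public float minScale = 0.5f;
    public float maxScale = 3f;
    public float sensitivity = 20f;

    private AudioSource source;
    private Renderer rend;
    private readonly float[] samples = new float[256];

    void Start()
    {
        source = GetComponent<AudioSource>();
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        // Read the most recent output samples and compute an RMS loudness value.
        source.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // Map loudness to scale...
        float level = Mathf.Clamp01(rms * sensitivity);
        transform.localScale = Vector3.one * Mathf.Lerp(minScale, maxScale, level);

        // ...and pick a texture for the current loudness range.
        if (rend != null && rangeMaterials != null && rangeMaterials.Length > 0)
        {
            int index = Mathf.Min((int)(level * rangeMaterials.Length), rangeMaterials.Length - 1);
            rend.material = rangeMaterials[index];
        }
    }
}
```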

     

And here is this week's result:





CLASS 7 (10/27)






I started working on getting the phone's microphone input and making simple interactions with a simple cube. I found a way to do it on my laptop without the ARKit plugin, but I am still figuring out how to make it work on the phone.
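For reference, this is more or less how the laptop version reads the microphone in Unity; it's a minimal sketch rather than my actual script.

```csharp
using UnityEngine;

// Minimal sketch: route the default microphone into an AudioSource so other
// scripts can read its output. Works in the editor on a laptop; no ARKit needed.
[RequireComponent(typeof(AudioSource))]
public class MicInput : MonoBehaviour
{
    private AudioSource source;

    void Start()
    {
        if (Microphone.devices.Length == 0)
        {
            Debug.LogWarning("No microphone found.");
            return;
        }

        source = GetComponent<AudioSource>();
        // Record a 10-second looping clip from the default device at 44.1 kHz.
        source.clip = Microphone.Start(null, true, 10, 44100);
        source.loop = true;

        // Wait until the mic actually starts delivering samples, then play.
        while (Microphone.GetPosition(null) <= 0) { }
        source.Play();
    }
}
```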

CLASS 5 (10/13)






















CLASS 4 (10/6)

Homework


AR Kit











CLASS 3 (9/22)

Homework


Load Unity scene on public space






I was walking down the street and realized there are lots of NYU flags all over the place. I thought it would be interesting to use this flag as a marker. It worked for the flat graphic logo and a black-and-white printed logo, and most importantly, it worked with a picture of the flag on the street.


First, I used this flat graphic image to work in Unity. For the overlay, I used a human model made with Adobe Fuse and Mixamo.
Unfortunately, the texture mesh (skin) wasn't imported into Unity. I couldn't fix the problem but kept working.





I made the center white box of the logo into a dance stage and made the avatar dance on it. I wanted to add more animation to the scene, but since I am not that familiar with Unity yet, it was hard and I ran into a lot of issues while working on this. Still, I liked using my avatar in AR, so I would like to expand this idea and see how far I can go.




CLASS 2 (9/15)

Homework

Printed Image AR




















This week, I worked on an AR project using a printed image as a trigger. I came up with the idea of using a brand's identity because I wanted an image that people are already familiar with and can see easily. I picked Bombay Sapphire gin's logo.

I wanted to explore how AR could be used in brand identity design, so I decided to extend the brand's impression to users and give them a deeper experience.

I focused on emphasizing jewels, because the name of the drink is Bombay "Sapphire". I added some sparkles and shining effects. Then I moved the letters from side to side, just like how words look when people read them drunk. Finally, the blue liquid (gin) fills up the bottle and makes the letters and jewels float around.


I used Aurasma to create the AR. I had some trouble adjusting the trigger image, and matching the marker with the overlay video took some time. But overall it was easy and fast.







CLASS 1 (9/8)

Homework

1. Experiencing an AR app


I downloaded a mobile app called "WallaMe". Users of WallaMe can draw or write anything on top of a picture of any environment around them. The drawings are only visible through the app. You can make them either public or private so that only your friends can see them.


I really enjoyed using this app and had so much fun playing with it. I like the idea of using AR for private messages with my friends. Not only do I have to send the invitation to my friend, but we also have to share the same memory: whoever wants to find my message must recognize the location from the picture. I think this could be a great way to use AR in social media.

I am assuming this app uses marker-based AR; I tried to take a picture of a plain white wall and it didn't work. Also, the drawing function should be improved, since drawing with fingers on the screen is quite difficult.












2. Memory







I created a scene from a memory of my last trip to Puerto Rico during spring break. My friend and I went night kayaking in tropical waters.

