Background:
So, as per my bathroom post (link), I’m working with VR/360 stereoscopic solves in different rooms in my house. I wanted to get the details of this project posted, so that we can talk turkey about how it’s coming together. This post is kind of a retcon to catch up with the activity in the house as I work out the details of this newfound ability to solve scenes with my single Ricoh Theta S camera.
Intent:
- Capture images and video of another, larger space (in this case, my living room, dining room, kitchen, and office) and recreate it in Stereoscopic Virtual Reality.
- Create a little CG scene and render it out with stereoscopy.
- Solve my Lat-long dilemma.
Yep, I know it’s like three projects and a retcon, but it’s how this past week or two transpired. I’m feeling a bit chagrined by how many ideas I’m getting all at once for VR projects, but loving riding the wave (snake?) and seeing where it goes.
Tools:
- Hardware:
- Ricoh Theta S 360 camera for the overall capture of the space in 360
- iPhone 7 Plus for activating the Ricoh remotely. One thing I discovered on the bathroom project is that there's no need for tracking.
- My desktop computer, with its NVidia GTX 1080 Founder’s Edition & HTC Vive
- My laptop, a MacBook Pro (Retina, 15-inch, Late 2013) w/ 16GB RAM
- Software:
- Ricoh Theta S stitching software
- The Foundry’s Nuke
- The Cara VR toolset from The Foundry, a plugin for Nuke
- HDR 360, a super cool app that allows you to capture 360 HDRs with the Ricoh
- Maxwell Render Personal Learning Edition 3.2.1
Side Note: it really sucks that Next Limit, the folks who make Maxwell Render, no longer offer a Personal Learning Edition or any other non-commercial edition for folks who don't have the money to invest in the full version of the software. I mean, the Personal Learning Edition cost me $250 USD; it's not a free download. To Next Limit's credit, they offer updated plugins for use with the latest programs and have a special download area for folks with the old license, but I don't know how long that'll last as new software versions come out.
That said, I LOVE Maxwell Render. No fiddling with esoteric settings; it's all physics- and real-world-unit based, right down to f-stops and film speed. Their algorithms model how light actually works: you set it up and you wait. I like the no-fuss approach, but I probably need to learn how to optimize my workflow a bit. I actually got one of my images (which I think I'll sneak into the CG project) featured in their gallery… but it's no longer up since they're launching their new version. Le sigh.
Regardless, another of the amazing things that Maxwell Render can do, as of v3 and above, is render out stereo lat-longs in equirectangular projection. In short, VR straight out of a render engine. Cool.
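For anyone fuzzy on what a "lat-long" actually is: it's just an equirectangular image where pixel position encodes longitude and latitude on a sphere (one such image per eye when it's stereo). Here's a tiny Python sketch of that mapping; the axis conventions are my own assumption and they vary between packages, so don't take this as exactly what Maxwell or Nuke do internally.

```python
import numpy as np

def latlong_pixel_to_direction(u, v, width, height):
    """Map an equirectangular (lat-long) pixel to a unit direction on the sphere.

    Longitude spans -pi..pi across the width, latitude spans pi/2..-pi/2
    down the height. Axis convention here is an assumption, not gospel.
    """
    lon = (u / width) * 2.0 * np.pi - np.pi      # -pi .. pi
    lat = np.pi / 2.0 - (v / height) * np.pi     # pi/2 .. -pi/2
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.array([x, y, z])

# Example: the center pixel of a 4096x2048 lat-long looks straight ahead (+Z).
print(latlong_pixel_to_direction(2048, 1024, 4096, 2048))
```

The nice thing about this projection is that every direction around you gets a pixel, which is why both the Ricoh's stitched stills and Maxwell's stereo renders can use the same format.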
Method:
- I’ll follow the same setup that I did for the bathroom, but push things a bit further: clean up the ceiling and floor, get a better stitch, and see if I can get rid of the seams.
- I started fiddling with a sample scene in Maxwell. I’ll tweak it and see if I can get a decent-looking space that feels real enough for VR, rendered out in stereo and composited in Nuke.
- Figure out how to use a number of stitched lat-longs with Nuke and Cara VR to “solve” and stitch, so that HDRs can be used instead of the lower-res video the Ricoh shoots; the video can come out as two fisheyes, but the photos come out prestitched (see the rough sketch after this list).
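To give a rough idea of what that last bullet means in practice, here's a hedged Nuke script-editor sketch of the kind of graph I'm imagining: Read nodes feeding Cara VR's solver and stitcher. The file paths are made up, it only runs inside Nuke (not standalone Python), and I honestly don't know yet whether the solver even wants prestitched lat-longs rather than the raw fisheyes… which is part of the dilemma.

```python
# Sketch only: wire a handful of HDR lat-long stills into Cara VR
# and let it attempt a solve and a stitch. Paths are hypothetical.
import nuke

# Hypothetical exported HDRs from the Ricoh / HDR 360 app.
paths = [
    "/projects/livingroom/hdr/latlong_01.exr",
    "/projects/livingroom/hdr/latlong_02.exr",
    "/projects/livingroom/hdr/latlong_03.exr",
]

reads = []
for p in paths:
    r = nuke.createNode("Read", inpanel=False)
    r["file"].setValue(p)
    reads.append(r)

# Cara VR's camera solver estimates the relative camera setup from overlap...
solver = nuke.createNode("C_CameraSolver", inpanel=False)
for i, r in enumerate(reads):
    solver.setInput(i, r)

# ...and the stitcher blends the solved views back into a single lat-long.
stitcher = nuke.createNode("C_Stitcher", inpanel=False)
stitcher.setInput(0, solver)
```

If that basic graph works with stills, the idea is that the same solve could drive a higher-quality HDR stitch than what the Ricoh's own software spits out.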
Deliverables:
(At least) Two VR movies. I’ll post them to YouTube and embed them on the blog. If the third portion of the project gets any traction, I’ll post more of the bathroom and perhaps the downstairs. I have some other Ricoh footage to work with, but I need to get these things worked out first, if I can.
Deadline:
I’m giving myself until next week to work on the projects. I’ve actually been working on all of this concurrently as I segued from the bathroom project; I just wanted to post to keep up the habit. Work is good!