EF22 – 360 Videos!

TL;DR – LINKS:

Fursuit Parade: https://www.youtube.com/watch?v=E71qcETUu10

80s Dance: https://www.youtube.com/watch?v=6qdiVpRRxak

 

Post Content:

EF22 is over *sad-face*; epic fun times were had. Now I'm processing and (slowly, the processing takes time) releasing the videos I recorded there (I was the goofball walking around with six GoPros mounted to a selfie stick, at the dances, around the con, at the firepit, the parade, etc.)

The first full one, of the “fursuit parade”, is available (as a rough cut) here: https://www.youtube.com/watch?v=E71qcETUu10

I say “rough cut” because there is still some processing to do to make it perfect: minor stitching errors are visible in the video, and the “virtual horizon” has not been set correctly. Both of these need to be manually reconfigured each time the camera rig makes a large movement in the vertical plane, and, well, that happens a lot! For now I hope you’ll be forgiving and put up with the slightly weird view/orientation/minor motion sickness until it’s fixed!
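For the curious: “setting the virtual horizon” basically means rotating the sphere behind the equirectangular frame until level is level again. Here’s a minimal NumPy + OpenCV sketch of that idea (this is not what AVP does internally; the function name, angles and file names are just placeholders):

```python
# Minimal sketch: re-level the "virtual horizon" of an equirectangular frame by
# rotating the underlying sphere (pitch/roll) and resampling with cv2.remap.
import cv2
import numpy as np

def relevel_equirect(frame, pitch_deg=0.0, roll_deg=0.0):
    """Rotate an equirectangular frame so the horizon sits level again."""
    h, w = frame.shape[:2]

    # Longitude/latitude for every destination pixel.
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    lon = (xs / w) * 2 * np.pi - np.pi          # -pi .. pi
    lat = np.pi / 2 - (ys / h) * np.pi          # pi/2 .. -pi/2

    # Unit direction vectors on the sphere.
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)     # shape (h, w, 3)

    # Correction rotation: pitch about the y-axis, roll about the x-axis.
    p, r = np.radians(pitch_deg), np.radians(roll_deg)
    ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    rot = rx @ ry

    # Where does each destination direction come from in the source frame?
    src = dirs @ rot.T
    src_lon = np.arctan2(src[..., 1], src[..., 0])
    src_lat = np.arcsin(np.clip(src[..., 2], -1.0, 1.0))

    # Back to source pixel coordinates (wrap horizontally, clamp vertically).
    map_x = ((src_lon + np.pi) / (2 * np.pi) * w) % w
    map_y = np.clip((np.pi / 2 - src_lat) / np.pi * h, 0, h - 1)

    return cv2.remap(frame, map_x.astype(np.float32), map_y.astype(np.float32),
                     interpolation=cv2.INTER_LINEAR)

# Placeholder usage on a single exported frame:
img = cv2.imread("equirect_frame.jpg")
cv2.imwrite("levelled.jpg", relevel_equirect(img, pitch_deg=10, roll_deg=-3))
```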

The 80s Dance video is being processed by YouTube now (the first version lacked the correct metadata and so wasn’t rendered in the 360 viewport; instead it remains an equirectangular projection, sort of like a world map, with the sphere flattened to a rectangle. It’s here if you want to see: https://www.youtube.com/watch?v=6qdiVpRRxak )
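If you hit the same problem: YouTube only switches to the 360 viewport when the file carries spherical-video metadata. Below is a quick, hedged sketch of fixing that by calling Google’s Spatial Media Metadata Injector from Python; the module name and the “-i” flag are from memory, so double-check against the injector’s README for your version:

```python
# Hedged sketch: re-tag a stitched file as spherical video so YouTube renders the
# 360 viewport. Assumes Google's Spatial Media Metadata Injector is checked out
# locally (github.com/google/spatial-media); flag names may differ by version.
import subprocess

def inject_360_metadata(src: str, dst: str) -> None:
    """Write a copy of `src` with spherical-video metadata into `dst`."""
    subprocess.run(
        ["python", "spatialmedia", "-i", src, dst],  # "-i" = inject (assumed flag)
        check=True,
    )

inject_360_metadata("80s_dance_stitched.mp4", "80s_dance_360.mp4")  # placeholder names
```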

I’ll make a post with details on how these were made and the software used. I’m being a little lazy at the moment and using AVP (Autopano Video Pro, commercial software from Kolor, now a subsidiary of GoPro); it has the benefit of being quick and semi-automated, and the drawback of being too automated at times. PTGui and VideoStitch are beastly (and geeky, more my style 😉 ), but with a backlog of 400 GB of video, it’s quick first, then proper! I will note that the PTGui / VideoStitch workflow allows some truly funky things to be done during the frame-by-frame processing, e.g. calling OpenCV from Python to identify and track objects *grins* (I’ll post on that at some point too; see the sketch below)
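As a taste of what that per-frame OpenCV pass can look like, here’s a minimal Python sketch: pull frames from a stitched video, pick out moving blobs with background subtraction, and draw boxes around them. The file name and thresholds are placeholders, not part of my actual pipeline:

```python
# Minimal sketch: per-frame moving-object detection on a stitched 360 video using
# OpenCV background subtraction. Not my production workflow, just the general idea.
import cv2

cap = cv2.VideoCapture("fursuit_parade_equirect.mp4")   # placeholder file name
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=32)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Foreground mask -> clean it up -> contour the remaining blobs.
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]  # works on OpenCV 3.x/4.x

    for c in contours:
        if cv2.contourArea(c) < 500:     # ignore tiny noise blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("detected", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```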

 

Meaw for now,

Meawmix