FAQ

Can I embed PanoMoments on my own website?

Yes, but only Pro members can embed PanoMoments. It's simply a matter of copying an iFrame code snippet (just like embedding a YouTube video) and pasting it into your website. You can find this iFrame code snippet on your user gallery page (it's the little arrow icon). You may also want to add this small JS library, which provides a workaround for iOS gyro issues inside iFrames. Please feel free to reach out via email if you have any issues with embedding.

Why shoot PanoMoments vs. regular 360 video / photos?

This is a great question, especially with the influx of easy-to-use 360 degree cameras coming onto the market (Ricoh Theta S, Nikon KeyMission 360, Gear 360, Vuze, Sphericam 2, etc.). PanoMoments are not meant to replace 360 video / photos; each has its own strengths and weaknesses. We see the primary difference being their creative uses, and that’s why we’re bringing PanoMoments to the world - we want to see what your imagination brings.

Can I use a smartphone to capture PanoMoments?

Yes, but you’ll need to find a compatible fisheye lens attachment and an app that allows you to shoot video with the exposure locked. Unfortunately, most fisheye lens attachments for smartphones aren’t full circular fisheyes, and they are often designed more as low-quality toy lenses. Over the next few months we will be talking with lens manufacturers about the possibility of releasing a high quality circular fisheye for popular phones.

Why would I want to shoot with parallax?

One of the biggest hurdles of standard 360 degree photography is the requirement for specialized camera equipment (e.g. Ricoh Theta S, GoPro Omni, etc.) and/or software to stitch the panoramas together. In fact, the only way to perfectly stitch a panorama without visible parallax artifacts/errors is to rotate a single camera around a special point known as the No Parallax Point (NPP). Otherwise, you will always end up with parallax stitching errors, and even then you will have to use masking to deal with temporal stitching errors (i.e. when subjects move).

Stitching error example

Most importantly, no matter how good you or the software/camera is at stitching, the resulting stitched panorama will always be at a single fixed perspective (or 2 fixed perspectives in the case of stereo 3D stitched panoramas). This fixed perspective (i.e. lack of motion parallax) means you can’t "see around" objects as you pan left and right (or up and down), and it often feels very unnatural. While PanoMoments only capture parallax along the horizontal axis, the experience is much more natural to view compared to stitched panoramas and can actually provide a sense of depth/immersion without stereoscopy. That being said, PanoMoments captured in 3D are going to be even more of an incredible experience. See below.

How do you capture PanoMoments in stereo 3D?

The underlying principle behind PanoMoments is in fact very well suited for stereoscopic viewing due to the natural horizontal parallax, large number of virtual camera viewpoints, and lack of stitching errors. To capture in 3D you’ll need two cameras that are triggered simultaneously while they are rotated. And for completely static scenes, it will be possible to capture 3D PanoMoments with just a single camera and one rotation. Stereo 3D PanoMoments should be available sometime over the summer of 2017.

Can you create a PanoMoment from photos captured without parallax (i.e. shot on the NPP)?

Yes... but we don’t really recommend this as it negates one of the big advantages of PanoMoments - the visualization of depth as you pan left and right. That being said, it’s entirely up to you, and one advantage of shooting a PanoMoment without parallax is the lower frame requirement. For example, you can effectively shoot a PanoMoment without parallax in as few as 12 frames using a full circular fisheye lens. Keep in mind that the fewer frames you use, the more likely angular deviation errors in the capture will be visible (i.e. a visible jump when frames transition).
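To make the frame-count trade-off concrete, here's a quick sketch (just illustrative arithmetic, not part of our tooling): when frames are spaced evenly over a full rotation, the angular step between adjacent frames is 360 divided by the frame count, and that step is the jump a viewer crosses at each frame transition.

```python
# Illustrative only: angular step between adjacent frames for a given frame count,
# assuming frames are evenly spaced over a full 360 degree rotation.

def angular_step_degrees(frame_count: int) -> float:
    """Rotation between adjacent frames for an evenly spaced 360 degree capture."""
    return 360.0 / frame_count

for frames in (12, 36, 120, 360):
    print(f"{frames:4d} frames -> {angular_step_degrees(frames):6.2f} degrees per transition")
```

At 12 frames that's a 30 degree step per transition; at 360 frames it drops to 1 degree, which is why higher frame counts make small capture deviations far less visible.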

Can I convert stitched 360 video/timelapse to the PanoMoments format?

Yes, and in fact Koen Hufkens, an ecologist at Harvard, has already written a script available on GitHub to help with the conversion. However, there are downsides and some specific requirements to consider: there will be no motion parallax, since 360 cameras can’t capture parallax the way a rotating camera does, and any stitching errors will be visible. You'll also need to ensure that the pre-stitched 360 video content was captured while stationary - content captured on a moving platform (i.e. handheld, attached to a bike, etc.) will not convert to the PanoMoments format. That being said, perhaps PanoMoments’ most defining feature is how they couple time with space, so we do understand that this conversion could become a great way to make video/timelapse content coming from 360 cameras such as the Gear 360 much more consumable and engaging.
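Koen Hufkens' script handles the actual conversion; purely to illustrate the idea, here is a minimal sketch of our own (hypothetical function and file names, not his script), assuming a stationary, pre-stitched equirectangular video: sample frames evenly across the clip and treat frame i as the view captured at heading i * 360/N.

```python
import cv2  # pip install opencv-python

def extract_panomoment_frames(video_path: str, frame_count: int, out_dir: str) -> None:
    """Sample evenly spaced frames from a stationary, pre-stitched equirectangular video.

    Frame i is treated as the image captured at heading i * 360/frame_count, which
    couples playback time to rotation angle much like a rotating-camera capture
    (but, as noted above, without any motion parallax).
    """
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    for i in range(frame_count):
        # Seek to the source frame corresponding to this slice of the rotation.
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * total / frame_count))
        ok, frame = cap.read()
        if not ok:
            break
        heading = i * 360.0 / frame_count
        cv2.imwrite(f"{out_dir}/frame_{i:04d}_heading_{heading:05.1f}.jpg", frame)
    cap.release()
```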

What about the Zenith/Nadir (top and bottom) regions?

PanoMoments aren't stitched like traditional 360 content, which means they don't contain the full 360 degree image in each frame. Most likely you'll be capturing around 180x180 degrees compared to a full spherical 360x180. This does mean that when you look up/down on a mobile phone, you'll notice regions where there is no image data. This is simply part of the trade-off PanoMoments make to gain their ability to show movement and motion parallax. The truth is that most people don't spend much time viewing these regions; it's just not a comfortable head position. However, we will likely be adding a mirror+blur option for this region to reduce the contrast, which will make it easier on the eyes and also open up compatibility with photo sets that contain less than 180 degrees of vertical coverage.
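For the curious, the mirror+blur idea mentioned above could look something like the sketch below. This is only our rough illustration of one possible approach (the feature doesn't exist yet), assuming the captured 180 degree region sits in the left half of a full 360x180 equirectangular canvas:

```python
import cv2
import numpy as np

def mirror_blur_fill(equi: np.ndarray, blur_ksize: int = 51) -> np.ndarray:
    """Hypothetical fill for the uncaptured half of a 180x180 degree capture.

    The right (uncaptured) half of the canvas is filled with a mirrored, heavily
    blurred copy of the captured left half, so the zenith/nadir regions show
    low-contrast content instead of empty space. A sketch only.
    """
    h, w = equi.shape[:2]
    captured = equi[:, : w // 2]
    mirrored = cv2.flip(captured, 1)  # mirror left-to-right
    blurred = cv2.GaussianBlur(mirrored, (blur_ksize, blur_ksize), 0)
    out = equi.copy()
    out[:, w // 2:] = blurred
    return out
```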

How does a robotic panorama head control the camera's shutter?

Most robotic panorama heads come with a 2.5mm camera connector (some have two or more) that allows the camera to be remotely triggered. However, not all cameras have this feature, so make sure to check. If your camera doesn't have wired trigger compatibility you can still use a robotic panorama head, but you will need to find another way to control the shutter. This can be as simple as a rubber band holding down the shutter button, or, if you're using a video camera, simply starting the recording before the rotation begins.

Can I rotate a 360 camera like a Samsung Gear 360 or Ricoh Theta to capture PanoMoments?

While you’d only be using half of the 360 camera (i.e. one lens), it should theoretically be a reasonable solution for people looking to get into capturing PanoMoments in addition to regular 360 photos/videos. This isn't something we can fully recommend at the moment due to the lack of manual exposure controls on current 360 cameras, but it is certainly possible to capture using the auto-exposure modes if you have relatively even lighting around the full 360. You'll also be limited to a maximum of 4K (around 2K by 2K usable resolution), and the footage will be much more compressed compared to a still camera capturing JPEG or RAW photos.

Can I capture PanoMoments with greater / less than 360 degree rotation?

We’ve done captures like this and it’s something we are strongly considering offering. Let us know if you’d like to see this supported.

What about fixed perspective and linear “Moments” rather than 360 panoramas?

We've actually played around with a few tests like this. Depending on feedback, we may decide to build it into the platform.

Can a PanoMoment be used as an alternative to software like Hugin, Autopano, or PTGui?

Yes and no. While PanoMoments don’t rely on stitching, they currently do require the photos to be uploaded in Equirectangular projection. To accomplish that, you’ll need to use the same software used for stitching (Hugin, PTGui, Autopano, etc.), but the workflow is significantly easier. You can easily build a template file that makes it a four-click process (load images, apply template, adjust crop, export). It’s important to understand that unlike a stitched panorama, a PanoMoment can’t be printed or shared in a way that shows all 360 degrees at once. PanoMoments are only viewable on digital devices, so they can’t fully replace panoramic stitching software. We do hope to one day provide support for native fisheye and rectilinear photos. At that point, you’d be able to upload straight from your camera, bypassing the Equirectangular conversion step.
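For those wondering what the Equirectangular conversion step actually does, here is a minimal sketch of the underlying remap. This is our own illustration, not the Hugin/PTGui workflow, and it assumes an ideal equidistant circular fisheye whose image circle fills the frame; real lenses need calibrated projection and distortion parameters, which the stitching tools handle for you.

```python
import cv2
import numpy as np

def fisheye_to_equirect(fisheye: np.ndarray, out_w: int = 2048, out_h: int = 2048,
                        fov_deg: float = 180.0) -> np.ndarray:
    """Remap an ideal equidistant circular fisheye photo onto a 180x180 degree
    equirectangular canvas. Illustration only."""
    h, w = fisheye.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    radius = min(cx, cy)  # assumes the image circle fills the shorter dimension

    # Longitude/latitude for each output pixel: +/- 90 degrees each way.
    lon = np.linspace(-np.pi / 2, np.pi / 2, out_w)
    lat = np.linspace(-np.pi / 2, np.pi / 2, out_h)
    lon, lat = np.meshgrid(lon, lat)

    # Direction vectors; the camera looks along +z and y points down,
    # matching image coordinates.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant fisheye model: distance from the image centre is
    # proportional to the angle from the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = theta / np.radians(fov_deg / 2.0) * radius
    phi = np.arctan2(y, x)

    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy + r * np.sin(phi)).astype(np.float32)
    return cv2.remap(fisheye, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)
```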

What if there are errors / angular deviations during the capture rotation?

PanoMoments rely on frames being equally spaced around the 360 degree rotation. Any deviations will result in alignment errors during viewing. That’s why working with a robust and accurate rotating panorama head is extremely important. Minor alignment errors aren’t usually much of an issue due to the large number of photos captured. Down the road, we’re looking to add software stabilization to the viewer so that any alignment errors captured can be corrected during the rendering process.
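As a rough illustration of what that kind of correction could look like (this is an assumption about a possible approach, not how the viewer works today): if each frame's actual heading is known, the per-frame correction is simply the difference between the ideal evenly spaced heading and the measured one.

```python
def heading_corrections(measured_headings_deg):
    """Hypothetical per-frame yaw offsets to realign an imperfectly spaced capture.

    The ideal heading of frame i is i * 360/N, so each frame would be rotated by
    (ideal - measured) during rendering to hide small deviations from the head.
    """
    n = len(measured_headings_deg)
    return [i * 360.0 / n - m for i, m in enumerate(measured_headings_deg)]

# Example: a 6-frame capture where the third frame lagged by 1.5 degrees.
print(heading_corrections([0.0, 60.0, 118.5, 180.0, 240.0, 300.0]))
# -> [0.0, 0.0, 1.5, 0.0, 0.0, 0.0]
```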

As the photos are captured over a period of time, isn’t there a jump in time at the beginning / end?

You’re right, and you can see that in this example (pan around to the clouds on the street where the man is sitting). This temporal disparity / cut-point does result in a "jump" between the last and first frames as you rotate across this point, and it’s part of the trade-off PanoMoments make compared to stitched panoramas. However, it’s only visible if there is motion located at this point, and it can be used in creative ways, allowing you to play tricks on the viewer. We may end up offering features for content creators to fade/blur the transition if they choose, as well as potentially offering a way to create PanoMoments with more or less than 360 degrees captured.

How big are PanoMoment files?

It depends on the number of frames and the quality selected at playback, but typically it ranges from 30MB to 150MB for the web viewer (future native mobile apps will initially be optimized for performance, resulting in larger files). On the desktop viewer we start the experience by downloading a small but complete 360 degree set of images, and then download additional frames. This allows you to start interacting with PanoMoments quickly. The mobile web viewer doesn't have this feature yet, so it must download the entire file before viewing.

I’m having playback/performance issues. How can I fix this?

We calculate your device’s capabilities by first running a benchmark, and then we configure the playback rate accordingly. However, sometimes this benchmark guesses wrong and you might see the player “jitter” when you pan. If this happens, or if the playback is sluggish, first try reloading the page. If that doesn’t help, try selecting a lower quality. If that still doesn’t work, try another device. We’re working on improving performance and compatibility, but not all devices (especially older ones) will be compatible with PanoMoments. The best viewing experience is currently on a modern Android or iOS 10 device. The Android PanoMoments web-based viewer actually runs a little bit differently than iOS and Desktop Chrome. It renders the virtual camera at 60fps while the frame updates happen at a lower rate. This results in a much smoother viewing experience but means that you can “overrun” the frame if you pan too quickly. Viewing a PanoMoment rendered at 60fps is a special experience, but not all computers will be able to achieve this, especially as rendering in a browser adds a bit of overhead.

This is why we’re so excited to get PanoMoments running in native applications. Most modern mobile/desktop devices will be able to render both the virtual camera and frame updates at 60fps - the threshold where you can achieve “presence”, which is crucial for a good VR experience. If you’d like to test 60fps on your desktop computer, you can try selecting one of the lower qualities and hitting the 1 key on the keyboard (the top number keys, not the right-side numpad) after the download is finished. This will enable 60fps playback, but depending on your computer’s capabilities and the size of the browser window, it might break the experience (you’ll see lots of “jittering”). Reload the page to go back to the automatically calculated playback rate.

It says my computer doesn't support WebGL even though I know it should. Can I fix this?

In order to view PanoMoments you'll need a modern computer that supports WebGL. Most computers do; however, there was a recent change in Chrome that blacklisted the Intel HD3000 GPU found in many older Macbooks (and other computers). Check out this Google support ticket for some additional background information on the issue. If this issue is affecting you, it is possible to override the blacklist: type chrome://flags/ into the address bar, enable "Override software rendering list", and restart Chrome. This should allow WebGL apps, including PanoMoments, to work.

Do you have any more details on desktop computer compatibility?

Almost any computer built in the past 5-7 years should be capable of viewing PanoMoments in either HD or UHD resolutions. The most important factor is whether your computer is capable of GPU h.264 video decoding. Most consumer GPUs such as Intel HD Graphics, Nvidia GeForce, and AMD Radeon include h.264 video decoders. Some professional Nvidia Quadro cards currently do not have GPU video decoding enabled in Google Chrome, and AMD Radeon cards only support GPU decoding for videos <1080p - see here for the Chrome bug. We are working with the Google Chrome engineers and hopefully these issues will be resolved in the near future.

How can I auto-rotate / play PanoMoments as shown in the Kickstarter video?

You can! Sorry for not making this clearer. Auto-rotation is currently only possible in the Desktop viewer, and only on computers that can render PanoMoments at greater than 15fps (as configured by an automatic benchmark algorithm). To try it out, wait until the download is fully complete, then hit the left/right arrow keys to auto-rotate. The more times you tap the key, the faster it will rotate. Hit the spacebar to stop rotation. If you’re feeling like living on the edge, you can try forcing the player to 60fps while auto-rotating on either the SD/HD qualities by hitting the 1 key on the keyboard (the top number keys, not the right-side numpad). Be aware this is not fully supported and there’s a very good chance it will break the viewer (you’ll see it “jitter”), so be prepared to select a higher frame skip (hit 2, 3, 4, etc.) or reload the page. We’re working on ways to allow 60fps auto-rotation on all devices.

Would you consider open sourcing the core PanoMoments viewer?

This is something we’d love to do, but we first want to get the medium off the ground and we believe that our current approach gives it the best chance for success. We’ll revisit this very important question in the coming months.

What about a virtual tour using PanoMoments?

We think that could be really neat and it’s something we are considering. However, it’s important to note that regular 360 photos require much less data and may be more appropriate for some types of tours. A hybrid approach, where you integrate PanoMoments into an existing 360 photo virtual tour, may be the smartest option. We're hoping to build an SDK / library that developers can use to integrate PanoMoments into their own applications.

Could you add sound to PanoMoments?

Absolutely! We are thinking about ways to add both ambient and captured sound to the experience.

What about partnering with a stereo 3D camera maker like LucidCam?

We’d love to! We are actively exploring several such opportunities.

Wouldn’t lightfield technology such as Lytro be a better solution compared to a rotating camera?

Lightfield technology is truly incredible; however, we are still a long way from a consumer lightfield camera that can capture 360 degree video. The Lytro Immerge doesn’t quite count as “consumer” :) The beauty of the rotating camera method is that you can use any camera and a whole range of lenses, and the rig doesn't need to be physically as large as a 360 lightfield camera array. It is possible that one day we’ll be able to support the data formats that lightfield cameras output.