1. #16 (Senior Member, Ventura, CA USA; joined Apr 2003)

    3D monitors are overkill, really.

    If you have a Direct3D signal, any NVidia card can (at a loss of framerate) move a duplicate camera a few virtual inches and display the stereo views in alternating or interlaced patterns for use with a $30 pair of LCD glasses. And there are similar drivers for OpenGL.
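Purely as an illustration (none of this is from an actual driver, and every name is hypothetical), the camera-duplication trick those drivers use can be sketched like this:

```python
# Illustration only: how a stereo driver can derive two views from one.
# Duplicate the camera and slide each copy along the camera's right
# vector by half the eye separation. All names here are hypothetical.

def stereo_eye_positions(cam_pos, right_vec, eye_separation):
    """Return (left_eye, right_eye) positions for a parallel stereo rig."""
    half = eye_separation / 2.0
    left = tuple(p - half * r for p, r in zip(cam_pos, right_vec))
    right = tuple(p + half * r for p, r in zip(cam_pos, right_vec))
    return left, right

# Camera at the origin, +X as its right vector, eyes ~6.5 cm apart:
left, right = stereo_eye_positions((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.065)
print(left, right)  # the two eyes straddle the original camera position
```

Render the scene once from each returned position and you have the two views the glasses alternate between; the framerate cost is exactly that second render.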

    Most 3D display solutions leverage those same drivers, so again it becomes a question of whether the self-contained presentation is worth such an increase in cost. Hint: it's not. Just go with the LCD glasses.

    That said, those drivers don't work in a windowed environment, which is something every sculpting platform I'm aware of requires. You'd need to find one that is truly full-screen, with menus drawn in 3D space rather than overlaid by the operating system, or a custom solution built from scratch by the makers of your sculpting platform.

    Sadly, neither of those are ever going to happen.
    --Aaron Levitz
    Web / FB / G+ / Twitter / YouTube / Vimeo / Indaba / HitRecord

2. #17 (Senior Member, Ventura, CA USA; joined Apr 2003)

    One other possibility...

    Take a pair of cheap sunglasses. Poke out one of the lenses. Wear and enjoy.

    It's not a perfect solution, but it works more often than you'd expect. Try orbiting around your model. If the effect doesn't kick in, switch lenses, or orbit the opposite direction.

    This is also good for turning normal broadcast television into 3D (the technologies that claim to convert 2D to 3D in realtime work on pretty much the same principle).

    The idea is that you're introducing a delay before light reaches one of your eyes, such that each eye is looking at a different frame of the same realtime animation. When an object rotates in Z, or moves in X, you already get depth cues through parallax distortion, but the forced time disparity takes it a step further, simulating what you might have seen from a dual-camera setup.
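The time-disparity mechanism can be sketched numerically (a toy model of my own, not anything from the post): the darkened eye effectively sees an older frame, so a horizontally moving object presents a horizontal offset between the eyes, which the brain reads as binocular disparity.

```python
# Toy model of the one-eye-delay effect: one eye sees the current frame,
# the other an older one, so a moving object appears horizontally
# shifted between the eyes, just like real binocular disparity.

def apparent_disparity(x_positions, frame, delay_frames):
    """Horizontal offset between what the two eyes see on a given frame."""
    undelayed = x_positions[frame]
    delayed = x_positions[max(frame - delay_frames, 0)]
    return undelayed - delayed

# An object moving right at 2 units per frame, darkened eye one frame behind:
xs = [2 * t for t in range(10)]
print(apparent_disparity(xs, frame=5, delay_frames=1))  # prints 2
```

Reverse the motion and the disparity flips sign, which is why switching lenses or orbiting the other way makes the effect kick in.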

    Of course, when it isn't working, you'll get headaches and eye fatigue, and you'll look like an idiot. But for some of us, that's not much of a change.
    --Aaron Levitz

3. #18 (Senior Member, Canada; joined Jul 2002)

    hmmmmm...

    Seems I do not get automatic notification of replies...

    Original poster here. The best solution I've heard of, but have not seen in action, is the SpatialView plugin for C4D. Apparently everything on screen is properly displayed in stereo 3D. Ya need an autostereoscopic monitor, but I think the plugin also does anaglyph (red/green) and polarized. IZ3D is one of the autostereoscopic monitors; it will work with Direct3D. SpatialView offers one of their own. Way outta my league for total cost. I am still asking around about Chromadepth. I did find a post filter for Max, but post work is not what I want.

    Thanks for the input. I was surprised by the number of replies. Usually, stereo 3D for a 3D app as a forum subject gets a poo-poo, won't work, a yawn, or no replies period. I just think it's such a cool concept. I still think it'll happen one day and headaches won't be an issue.
    Last edited by shokan; 05-03-08 at 07:15 PM.

4. #19 (Senior Member, Ventura, CA USA; joined Apr 2003)

    Quote Originally Posted by Ctrl-Z
    One other possibility...

    Take a pair of cheap sunglasses. Poke out one of the lenses. Wear and enjoy.
    I swear, I didn't make that up. It's okay to try it.

    Side note: I just stumbled across something I wrote about this back in 2003.

    "This is called the Pulfrich effect, because German physicist Carl Pulfrich discovered it back in 1922. Interesting bit of trivia: Carl was blind in one eye, and could thus never experience the effect named for him."

    Wikipedia has a lot more to say about it.
    --Aaron Levitz

5. #20 (Senior Member, Canada; joined Jul 2002)

    Chromadepth/ZBrush: explain, please

    Original poster here:

    I am exploring the Chromadepth possibility first because I have the company's prism glasses coming soon.

    Can someone explain in some detail what would be needed (if it's possible) to apply a preset color gradient to the active viewport, so that distance from the viewpoint (the center of the picture plane) is indicated by color change? In the case of Chromadepth: pure red at the point where the line of sight from the viewport camera first intersects an object or plane, shifting through orange and green to pure blue for the farthest object in the viewport, along any radial line of sight from the middle of the picture plane. This is for active editing, not post work.
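A minimal sketch of the gradient being described, assuming a linear mapping of normalized depth onto the HSV hue range from red (0°) to blue (240°); the function name and ranges are made up for illustration, not any real ZBrush facility:

```python
import colorsys

# Map depth to a ChromaDepth-style color: nearest surface pure red,
# farthest pure blue, passing through orange and green in between.
# Hypothetical helper, not part of any sculpting package.

def chromadepth_color(depth, near, far):
    """Return an (r, g, b) tuple in 0..1 for a depth between near and far."""
    t = min(max((depth - near) / (far - near), 0.0), 1.0)  # normalize depth
    hue = t * 240.0 / 360.0  # 0 = red, 1/3 = green, 2/3 = blue
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

print(chromadepth_color(0.0, 0.0, 10.0))   # nearest point: pure red
print(chromadepth_color(10.0, 0.0, 10.0))  # farthest point: pure blue
```

Applied per pixel against the depth buffer, this would produce exactly the red-through-blue ramp the prism glasses decode.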

    See again the Chromadepth sample attached.

    Thanks for all the various comments and suggestions.
    Attached: Teapot3.jpg (43.5 KB)

6. #21 (Senior Member; joined Jun 2007)

    I actually tried that busted-out sunglasses trick a few years ago. It was a lot of fun, but when my eyes started crossing without them on I got a little freaked out and stopped doing it. It made me think of all the times mom said "keep doing that and you'll get stuck that way!"

7. #22 (Senior Member, Canada; joined Jul 2002)

    chromadepth

    I tried the Chromadepth glasses today; you can read about it at:

    http://209.132.96.165/zbc/showthread...032#post456032

8. #23 (Senior Member, Hamm; joined Jun 2005)

    What about the movie function?
    Is that possible?

9. #24 (Senior Member, Hampton, Virginia, USA; joined Jan 2006)

    Yes, if you're recording a movie and you have your shading set up properly, then it will play back with it.

    I have a pair of old ChromaDepth glasses.

    My best guess is that a rainbow MatCap shader plus a proper background gradient should get it done.
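If it helps anyone experiment, here is my own rough guess at the background-gradient half of that idea (not a ZBrush feature): generate a vertical red-to-blue rainbow strip as a plain-text PPM image, which could then be loaded as a backdrop; the same ramp would drive a rainbow MatCap.

```python
import colorsys

# Build a vertical rainbow strip (red at top, blue at bottom) as a P3
# PPM string. Illustrative sketch only; a real backdrop would be
# exported at screen resolution.

def rainbow_ppm(width, height):
    """Return a plain-text PPM image fading red (top) to blue (bottom)."""
    lines = ["P3", f"{width} {height}", "255"]
    for y in range(height):
        hue = (y / max(height - 1, 1)) * 240.0 / 360.0
        r, g, b = (int(c * 255) for c in colorsys.hsv_to_rgb(hue, 1.0, 1.0))
        lines.append(" ".join(f"{r} {g} {b}" for _ in range(width)))
    return "\n".join(lines)

print(rainbow_ppm(4, 3).splitlines()[3])  # top row is pure red pixels
```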

10. #25 (Senior Member, Hamm; joined Jun 2005)

    ahh, a button would be great :>
    thx though...

11. #26 (Senior Member, Canada; joined Jul 2002)

    Original poster here: nice to see there is some interest in Chromadepth for a 3DCC app such as ZBrush. Chromadepth is the only means I could think of to possibly create depth while editing. Other methods would be stereo pairs or offset red/green pairs, neither of which ZBrush can do. The bust above didn't work with the Chromadepth glasses, BTW.
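For what it's worth, the offset red/green-pair method itself is simple to sketch (red/cyan shown here; plain nested lists of grayscale values rather than any real image API):

```python
# Pack a left view into the red channel and a right view into the
# green/blue channels, so colored glasses route one image to each eye.
# Toy sketch on nested lists of 0-255 grayscale values.

def anaglyph(left_gray, right_gray):
    """Combine two same-sized grayscale images into RGB anaglyph pixels."""
    return [
        [(l, r, r) for l, r in zip(lrow, rrow)]  # red = left, cyan = right
        for lrow, rrow in zip(left_gray, right_gray)
    ]

left = [[255, 0], [0, 255]]
right = [[0, 255], [255, 0]]
print(anaglyph(left, right)[0])  # first row of combined pixels
```

The hard part, as noted, is not the channel packing but getting the app to render the two offset views in the first place.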

12. #27 (New Member; joined Apr 2007)

    3D stereo with head tracking

    Here's to all ZBrush users, programmers, and plugin devs:
    http://leonar3do.com/
    Some info for your inspiration.
    By: Moe.

13. #28 (New Member, Charlottetown, PE; joined Feb 2013)

    Quote Originally Posted by marcus_civis
    You've hit the nail on the head when you say:

    I think it unlikely that there will be stereo view in ZBrush any time soon - it would be too much of a drain on resources. ZBrush makes the best possible use of your system resources and that's why it can do what it does.

    I think you would have to have a camera function that just collects and projects the pixels the camera sees, rather than rendering two full screens of the same model (I don't know if ZBrush currently renders from all sides within the software, or if it has a render-what-you-see kind of mode).

