zSpace, Inc.

United States of America


1-83 of 83 for zSpace, Inc.
Aggregations
IP Type
        Patent 82
        Trademark 1
Jurisdiction
        United States 79
        World 4
Date
        New (last 4 weeks) 1
        2025 April (MTD) 1
        2025 March 1
        2025 February 4
        2025 (YTD) 6
IPC Class
        G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 28
        H04N 13/04 - Picture reproducers 27
        G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance 20
        G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks 18
        G06T 15/00 - 3D [Three Dimensional] image rendering 17
Status
        Pending 5
        Registered / In Force 78

1.

Screen Position Estimation

      
Application Number 18492963
Status Pending
Filing Date 2023-10-24
First Publication Date 2025-04-24
Owner zSpace, Inc. (USA)
Inventor Ovechkin, Oleg

Abstract

Systems and methods for display position estimation, e.g., in a three-dimensional (3D) display system rendering interactive augmented reality (AR) and/or virtual reality (VR) experiences. A portable computer system may receive one or more images of a portion of the portable computer system captured via at least one camera, e.g., via at least one camera of a portable device. The one or more images may be received via wireless communications with the portable device and/or via an input/output bus and/or peripheral bus between the portable device and the portable computer system. The portable computer system may compare the one or more images to a set of cached images of the portable computer system and determine, e.g., based on the comparison, the angle of the display relative to the base of the portable computer system.
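As an illustrative sketch only (the function name, inputs, and mean-squared-error metric are assumptions, not the claimed method), the compare-against-cached-images step in this abstract could look like:

```python
def estimate_display_angle(captured, cached):
    """Pick the angle label of the cached image most similar to the capture.

    captured: flat list of pixel intensities from the portable device's camera
    cached:   list of (angle_degrees, flat pixel list) pairs, one per known
              display-to-base angle, each the same length as `captured`
    """
    def mse(a, b):  # mean squared error as a crude similarity measure
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

    # The display angle is taken from the closest-matching cached image.
    return min(cached, key=lambda item: mse(captured, item[1]))[0]
```

A production system would likely compare robust image features rather than raw pixels, but the lookup structure is the same.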

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

2.

ZSPACE

      
Serial Number 99094615
Status Pending
Filing Date 2025-03-20
Owner zSpace, Inc.
NICE Classes 09 - Scientific and electric apparatus and instruments

Goods & Services

Computer technology, namely, computer hardware, computer styluses, sensors for determining position, and recorded and downloadable computer software for enabling two-dimensional and three-dimensional images; computer hardware; computer styluses; sensors for determining position; cameras; two-dimensional and three-dimensional audio recordings for use with two-dimensional and three-dimensional images; two-dimensional and three-dimensional graphical user interface hardware and downloadable and recorded graphical user interface software; recorded and downloadable interactive multimedia computer programs for enabling direct hands-on interaction with two-dimensional and three-dimensional images and audio recordings in the science, technology, engineering and math (STEM) and career and technical education (CTE) fields

3.

SIX-DEGREE OF FREEDOM POSE ESTIMATION OF A STYLUS

      
Application Number US2024041199
Publication Number 2025/038344
Status In Force
Filing Date 2024-08-07
Publication Date 2025-02-20
Owner ZSPACE, INC. (USA)
Inventor Wilson, Mark

Abstract

Systems and methods for six-degree of freedom (6-DoF) pose estimation of a user input device, e.g., in a three-dimensional (3D) display system rendering interactive augmented reality (AR) and/or virtual reality (VR) experiences include the user input device capturing, via a camera disposed at a forward-facing tip of the user input device, images in a direction that the user input device is directed and determining, via an inertial measurement unit (IMU), motion of the user input device in three-dimensional (3D) space. The user input device may then determine pose information associated with the user input device based on the images and motion of the user input device. The determination of the pose information may be via usage of at least one of a neural network model, estimation model trained on a set of unique and identifiable patterns, and/or an estimation model trained on a dataset of images.
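The camera/IMU combination described here is a standard sensor-fusion pattern. A minimal sketch follows; the function name and blend weight are illustrative assumptions, and only position is blended (a full 6-DoF version would also blend orientation, e.g. via quaternion interpolation):

```python
def fuse_pose(imu_pose, camera_pose, camera_weight=0.5):
    """Complementary-filter blend of an IMU dead-reckoned position with a
    camera-derived position. Each pose is an (x, y, z) tuple; the camera
    correction counters IMU drift, while the IMU fills in between frames.
    """
    return tuple(
        (1.0 - camera_weight) * imu + camera_weight * cam
        for imu, cam in zip(imu_pose, camera_pose)
    )
```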

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/033 - Pointing devices displaced or positioned by the user; Accessories therefor
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

4.

SIX-DEGREE OF FREEDOM POSE ESTIMATION OF A STYLUS

      
Application Number US2024041251
Publication Number 2025/038350
Status In Force
Filing Date 2024-08-07
Publication Date 2025-02-20
Owner ZSPACE, INC. (USA)
Inventor Wilson, Mark

Abstract

Systems and methods for six-degree of freedom (6-DoF) pose estimation of a user input device, e.g., in a three-dimensional (3D) system rendering interactive augmented reality (AR) and/or virtual reality (VR) experiences include the user input device capturing, via a camera disposed at a forward-facing tip of the user input device, images in a direction the user input device is directed and providing the images to a computer system. The user input device provides inertial measurement unit (IMU) data to the computer system as well. The computer system may then determine pose information associated with the user input device based on the images and IMU data of the user input device. The determination of the pose information may be via usage of at least one of a neural network model, estimation model trained on a set of unique and identifiable patterns, and/or an estimation model trained on a dataset of images.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
  • G06T 7/20 - Analysis of motion
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 17/00 - 3D modelling for computer graphics
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

5.

Six-degree of Freedom Pose Estimation of a Stylus

      
Application Number 18233126
Status Pending
Filing Date 2023-08-11
First Publication Date 2025-02-13
Owner zSpace, Inc. (USA)
Inventor Wilson, Mark

Abstract

Systems and methods for six-degree of freedom (6-DoF) pose estimation of a user input device, e.g., in a three-dimensional (3D) display system rendering interactive augmented reality (AR) and/or virtual reality (VR) experiences include the user input device capturing, via a camera disposed at a forward-facing tip of the user input device, images in a direction that the user input device is directed and determining, via an inertial measurement unit (IMU), motion of the user input device in three-dimensional (3D) space. The user input device may then determine pose information associated with the user input device based on the images and motion of the user input device. The determination of the pose information may be via usage of at least one of a neural network model, estimation model trained on a set of unique and identifiable patterns, and/or an estimation model trained on a dataset of images.

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06T 7/70 - Determining position or orientation of objects or cameras

6.

Six-degree of Freedom Pose Estimation of a Stylus

      
Application Number 18233152
Status Pending
Filing Date 2023-08-11
First Publication Date 2025-02-13
Owner zSpace, Inc. (USA)
Inventor Wilson, Mark

Abstract

Systems and methods for six-degree of freedom (6-DoF) pose estimation of a user input device, e.g., in a three-dimensional (3D) system rendering interactive augmented reality (AR) and/or virtual reality (VR) experiences include the user input device capturing, via a camera disposed at a forward-facing tip of the user input device, images in a direction the user input device is directed and providing the images to a computer system. The user input device provides inertial measurement unit (IMU) data to the computer system as well. The computer system may then determine pose information associated with the user input device based on the images and IMU data of the user input device. The determination of the pose information may be via usage of at least one of a neural network model, estimation model trained on a set of unique and identifiable patterns, and/or an estimation model trained on a dataset of images.

IPC Classes

  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

7.

Cloud-based rendering of interactive augmented/virtual reality experiences

      
Application Number 18207998
Grant Number 12256052
Status In Force
Filing Date 2023-06-09
First Publication Date 2023-10-19
Grant Date 2025-03-18
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Lu, Baifang
  • Shorey, Alex
  • Kalnins, Robert D.

Abstract

Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate to sustain a comfortable viewing environment of the AR/VR scene by a user of the client device.
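The client-side sequence in this abstract (launch the application remotely, initialize local systems while waiting, then exchange local state and scene data periodically) can be sketched as follows; the server object and every method name here are hypothetical, not zSpace's API:

```python
class CloudRenderClient:
    """Toy client for the launch/init/exchange loop described above."""

    def __init__(self, server):
        self.server = server
        self.local_state = None

    def start(self, app_id, app_info):
        # 1. Ask the server to launch the content application.
        self.server.launch(app_id, app_info)
        # 2. Initialize local systems (tracking, display) while waiting.
        self.local_state = {"head_pose": (0, 0, 0), "stylus_pose": (0, 0, 0)}
        self.server.wait_ready()
        # 3. Hand the server the local-system information.
        self.server.send(self.local_state)

    def frame(self):
        # 4. Periodic exchange: upload fresh local state, download scene
        #    data, and render it -- at a rate that keeps viewing comfortable
        #    (e.g. once per display refresh).
        self.server.send(self.local_state)
        return self.server.receive()
```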

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • H04N 13/194 - Transmission of image signals

8.

Cloud-based rendering of interactive augmented/virtual reality experiences

      
Application Number 18208014
Grant Number 12256054
Status In Force
Filing Date 2023-06-09
First Publication Date 2023-10-12
Grant Date 2025-03-18
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Lu, Baifang
  • Shorey, Alex
  • Kalnins, Robert D.

Abstract

Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate to sustain a comfortable viewing environment of the AR/VR scene by a user of the client device.

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • H04N 13/194 - Transmission of image signals

9.

Cloud-based rendering of interactive augmented/virtual reality experiences

      
Application Number 18208007
Grant Number 12256053
Status In Force
Filing Date 2023-06-09
First Publication Date 2023-10-05
Grant Date 2025-03-18
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Lu, Baifang
  • Shorey, Alex
  • Kalnins, Robert D.

Abstract

Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate to sustain a comfortable viewing environment of the AR/VR scene by a user of the client device.

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • H04N 13/194 - Transmission of image signals

10.

Cloud-based Rendering of Interactive Augmented/Virtual Reality Experiences

      
Application Number 18197983
Status Pending
Filing Date 2023-05-16
First Publication Date 2023-09-28
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Lu, Baifang
  • Shorey, Alex
  • Kalnins, Robert D.

Abstract

Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate to sustain a comfortable viewing environment of the AR/VR scene by a user of the client device.

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/194 - Transmission of image signals
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

11.

Cloud-based rendering of interactive augmented/virtual reality experiences

      
Application Number 18197999
Grant Number 12192434
Status In Force
Filing Date 2023-05-16
First Publication Date 2023-09-14
Grant Date 2025-01-07
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Lu, Baifang
  • Shorey, Alex
  • Kalnins, Robert D.

Abstract

Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate to sustain a comfortable viewing environment of the AR/VR scene by a user of the client device.

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • H04N 13/194 - Transmission of image signals

12.

Cloud-based rendering of interactive augmented/virtual reality experiences

      
Application Number 17340901
Grant Number 11843755
Status In Force
Filing Date 2021-06-07
First Publication Date 2022-12-08
Grant Date 2023-12-12
Owner ZSPACE, INC. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Lu, Baifang
  • Shorey, Alex
  • Kalnins, Robert D.

Abstract

Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate to sustain a comfortable viewing environment of the AR/VR scene by a user of the client device.

IPC Classes

  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/194 - Transmission of image signals
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

13.

Pi cell drive waveform

      
Application Number 17888804
Grant Number 12099264
Status In Force
Filing Date 2022-08-16
First Publication Date 2022-12-08
Grant Date 2024-09-24
Owner zSpace, Inc. (USA)
Inventor
  • Cheng, Hsienhui
  • Huang, Jianhong

Abstract

Systems and methods for providing an electrical waveform to a pi-cell polarization switch. The electrical waveform may reduce or limit ion accumulation in, and/or light leakage associated with, the polarization switch. The electrical waveform may include multiple segments. For example, a first segment may drive the polarization switch to a first polarization state and may be defined by a first portion having a first voltage level and a first polarity and a second portion having the first voltage level and a second polarity opposite the first polarity. A second segment, occurring after the first segment, may drive the polarization switch to a second polarization state and may be defined by a second voltage level having the first polarity. An absolute value of the first voltage level may be greater than an absolute value of the second voltage level.
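The segment structure described in this abstract can be generated as sampled voltage levels. The sketch below only illustrates the waveform's shape; the sample counts and voltages are arbitrary assumptions:

```python
def pi_cell_waveform(v1, v2, samples_per_portion=4):
    """Sample the two-segment drive waveform: segment one is +v1 then -v1
    (equal portions, opposite polarity, driving the first polarization
    state); segment two holds +v2 at the first polarity (second state),
    with |v1| > |v2| as the abstract requires."""
    assert abs(v1) > abs(v2), "first voltage level must exceed the second"
    segment_one = [v1] * samples_per_portion + [-v1] * samples_per_portion
    segment_two = [v2] * (2 * samples_per_portion)
    return segment_one + segment_two
```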

IPC Classes

  • G02B 30/25 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type using polarisation techniques
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • H05B 47/10 - Controlling the light source

14.

Trackability enhancement of a passive stylus

      
Application Number 17190212
Grant Number 11287905
Status In Force
Filing Date 2021-03-02
First Publication Date 2021-07-08
Grant Date 2022-03-29
Owner ZSPACE, INC. (USA)
Inventor
  • Yamada, Kevin S.
  • Hosenpud, Jonathan J.
  • Larsen, Christian R.
  • Chavez, David A.
  • Berman, Arthur L.
  • Champion, Clifford S.

Abstract

Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.
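One way to lay out the spiral retroreflector pattern mentioned above is as points on a helix around the stylus's longitudinal axis; the function name, turn count, and dimensions below are illustrative assumptions:

```python
import math

def spiral_retroreflector_positions(count, length, turns=2.0, radius=0.005):
    """Return (x, y, z) centers for `count` retroreflectors placed on a
    helix of the given number of turns along the z (longitudinal) axis.
    Units are arbitrary; a real stylus would use its physical dimensions."""
    points = []
    for i in range(count):
        t = i / (count - 1) if count > 1 else 0.0        # 0..1 along the axis
        theta = 2.0 * math.pi * turns * t                # angle around the axis
        points.append((radius * math.cos(theta),
                       radius * math.sin(theta),
                       length * t))
    return points
```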

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

15.

Intelligent stylus beam and assisted probabilistic input to element mapping in 2D and 3D graphical user interfaces

      
Application Number 17190265
Grant Number 11645809
Status In Force
Filing Date 2021-03-02
First Publication Date 2021-06-17
Grant Date 2023-05-09
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Chavez, David A.
  • Yamada, Kevin S.
  • Lelievre, Alexandre R.

Abstract

Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.
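A toy version of combining the ray-cast, sphere-cast, and probability model into one selection decision follows; the weights and data structures are assumptions for illustration, not the patented procedure:

```python
def select_virtual_object(ray_hits, sphere_hits, priors):
    """Combine a ray-cast, a wider sphere-cast, and per-object selection
    probabilities into a single choice. ray_hits / sphere_hits are sets of
    object ids hit by each cast; priors maps object id -> probability that
    the user intended to select that object."""
    candidates = ray_hits | sphere_hits
    if not candidates:
        return None

    def score(obj):
        s = priors.get(obj, 0.0)
        if obj in ray_hits:     # a direct ray hit outweighs a near miss
            s += 1.0
        if obj in sphere_hits:  # the sphere-cast catches near misses
            s += 0.5
        return s

    return max(candidates, key=score)
```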

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06N 20/00 - Machine learning
  • G06T 15/06 - Ray-tracing
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06F 9/38 - Concurrent instruction execution, e.g. pipeline or look ahead
  • G06T 15/10 - Geometric effects
  • G06N 7/01 - Probabilistic graphical models, e.g. probabilistic networks

16.

PI cell drive waveform

      
Application Number 16712011
Grant Number 11435602
Status In Force
Filing Date 2019-12-12
First Publication Date 2021-06-17
Grant Date 2022-09-06
Owner ZSPACE, INC. (USA)
Inventor
  • Cheng, Hsienhui
  • Huang, Jianhong

Abstract

Systems and methods for providing an electrical waveform to a pi-cell polarization switch. The electrical waveform may reduce or limit ion accumulation in, and/or light leakage associated with, the polarization switch. The electrical waveform may include multiple segments. For example, a first segment may drive the polarization switch to a first polarization state and may be defined by a first portion having a first voltage level and a first polarity and a second portion having the first voltage level and a second polarity opposite the first polarity. A second segment, occurring after the first segment, may drive the polarization switch to a second polarization state and may be defined by a second voltage level having the first polarity. An absolute value of the first voltage level may be greater than an absolute value of the second voltage level.

IPC Classes

  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • H05B 47/10 - Controlling the light source
  • G02B 30/25 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type using polarisation techniques

17.

INTELLIGENT STYLUS BEAM AND ASSISTED PROBABILISTIC INPUT TO ELEMENT MAPPING IN 2D AND 3D GRAPHICAL USER INTERFACES

      
Application Number US2020048964
Publication Number 2021/046065
Status In Force
Filing Date 2020-09-01
Publication Date 2021-03-11
Owner ZSPACE, INC. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Chavez, David A.
  • Yamada, Kevin S.
  • Lelievre, Alexandre R.

Abstract

Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.

IPC Classes

  • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
  • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
  • H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  • H04N 13/366 - Image reproducers using viewer tracking
  • G06T 15/06 - Ray-tracing
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

18.

Intelligent stylus beam and assisted probabilistic input to element mapping in 2D and 3D graphical user interfaces

      
Application Number 16562944
Grant Number 10943388
Status In Force
Filing Date 2019-09-06
First Publication Date 2021-03-09
Grant Date 2021-03-09
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Chavez, David A.
  • Yamada, Kevin S.
  • Lelievre, Alexandre R.

Abstract

Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.

IPC Classes

  • G06T 15/06 - Ray-tracing
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06F 9/38 - Concurrent instruction execution, e.g. pipeline or look ahead
  • G06T 15/10 - Geometric effects
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06N 7/00 - Computing arrangements based on specific mathematical models

19.

TRACKABILITY ENHANCEMENT OF A PASSIVE STYLUS

      
Application Number US2020043038
Publication Number 2021/016334
Status In Force
Filing Date 2020-07-22
Publication Date 2021-01-28
Owner ZSPACE, INC. (USA)
Inventor
  • Yamada, Kevin S.
  • Hosenpud, Jonathan J.
  • Larsen, Christian R.
  • Chavez, David A.
  • Berman, Arthur L.
  • Champion, Clifford S.

Abstract

Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.
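The spiral retroreflector pattern mentioned in the abstract can be illustrated by generating marker positions wound around the stylus's longitudinal axis. The marker count, stylus length, and winding count below are hypothetical parameters chosen for the sketch, not values from the patent.

```python
import math

def spiral_pattern(n_markers, stylus_length, turns, radius):
    """Positions of retroreflective markers in a spiral along the stylus axis.
    Each marker gets a unique (z, angle) pair, which helps a camera-based
    tracker disambiguate roll orientation when only some markers are visible."""
    points = []
    for i in range(n_markers):
        f = i / (n_markers - 1)
        z = f * stylus_length                  # position along the axis
        theta = f * turns * 2.0 * math.pi      # winding angle around the axis
        points.append((radius * math.cos(theta), radius * math.sin(theta), z))
    return points
```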

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G02B 5/13 - Reflex reflectors including curved refracting surface plural curved refracting elements forming part of a unitary body

20.

Trackability enhancement of a passive stylus

      
Application Number 16683717
Grant Number 10942585
Status In Force
Filing Date 2019-11-14
First Publication Date 2021-01-28
Grant Date 2021-03-09
Owner ZSPACE, INC. (USA)
Inventor
  • Yamada, Kevin S.
  • Hosenpud, Jonathan J.
  • Larsen, Christian R.
  • Chavez, David A.
  • Berman, Arthur L.
  • Champion, Clifford S.

Abstract

Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

21.

Identifying replacement 3D images for 2D images via ranking criteria

      
Application Number 16722269
Grant Number 10701347
Status In Force
Filing Date 2019-12-20
First Publication Date 2020-04-23
Grant Date 2020-06-30
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Lelievre, Alexandre R.
  • Berman, Arthur L.
  • Yamada, Kevin S.

Abstract

Systems and methods for replacing a 2D image with an equivalent 3D image within a web page. Content of a 2D image displayed within a web page may be identified and 3D images may be identified as possible replacements of the 2D image. The 3D images may be ranked based on sets of ranking criteria. A 3D image with a highest-ranking value may be selected based on a ranking of the 3D images. The selected 3D image may be integrated into the web page, thereby replacing the 2D image with the selected 3D image. Further, a user input manipulating the 3D image within the web page may be received. The user input may include movement of a view point of a user relative to a display displaying the web page and/or detection of a beam projected from an end of a user input device intersecting with the 3D image.
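The ranking step described above — scoring candidate 3D images against the identified 2D content and picking the highest-ranking one — can be sketched as follows. The criteria (tag overlap, resolution, popularity) and their weights are illustrative assumptions; the patent does not specify this particular scoring.

```python
def rank_candidates(query_tags, candidates):
    """candidates: list of dicts with 'name', 'tags', 'resolution', 'popularity'
    (the last two normalized to [0, 1]). Returns candidates sorted by a weighted
    ranking value; the top entry would replace the 2D image in the web page."""
    def score(c):
        # Content match dominates; image quality and usage statistics break ties.
        overlap = len(set(query_tags) & set(c["tags"])) / max(len(query_tags), 1)
        return 0.6 * overlap + 0.25 * c["resolution"] + 0.15 * c["popularity"]
    return sorted(candidates, key=score, reverse=True)
```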

IPC Classes

  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 15/20 - Perspective computation

22.

Replacing 2D images with 3D images

      
Application Number 16722185
Grant Number 10701346
Status In Force
Filing Date 2019-12-20
First Publication Date 2020-04-23
Grant Date 2020-06-30
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Lelievre, Alexandre R.
  • Berman, Arthur L.
  • Yamada, Kevin S.

Abstract

Systems and methods for replacing a 2D image with an equivalent 3D image within a web page. The 2D image displayed within a web page may be identified and a 3D image with substantially equivalent content may also be identified. The 3D image may be integrated into the web page as a replacement to the 2D image. Further, at least one user input manipulating the 3D image within the web page may be received. The at least one user input may include movement of a view point (or point of view) of a user relative to a display displaying the web page and/or detection of a beam projected from an end of the user input device intersecting with the 3D image.

IPC Classes

  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • G06F 16/9535 - Search customisation based on user profiles and personalisation
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 15/20 - Perspective computation

23.

3D user interface—360-degree visualization of 2D webpage content

      
Application Number 16698119
Grant Number 10863168
Status In Force
Filing Date 2019-11-27
First Publication Date 2020-03-26
Grant Date 2020-12-08
Owner ZSPACE, INC. (USA)
Inventor
  • Champion, Clifford S.
  • Baraf, Eduardo
  • Lelievre, Alexandre R.
  • Hosenpud, Jonathan J.

Abstract

Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace; for browsing the internet in a 3D/virtual reality workspace; and for transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages into 3D objects and/or stereoscopic output for display in the 3D workspace.

IPC Classes

  • H04N 13/398 - Synchronisation thereof; Control thereof
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 1/16 - Constructional details or arrangements
  • G06F 16/957 - Browsing optimisation, e.g. caching or content distillation
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

24.

Replacing 2D images with 3D images

      
Application Number 15947140
Grant Number 10523921
Status In Force
Filing Date 2018-04-06
First Publication Date 2019-10-10
Grant Date 2019-12-31
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Lelievre, Alexandre R.
  • Berman, Arthur L.
  • Yamada, Kevin S.

Abstract

Systems and methods for replacing a 2D image with an equivalent 3D image within a web page. The 2D image displayed within a web page may be identified and a 3D image with substantially equivalent content may also be identified. The 3D image may be integrated into the web page as a replacement to the 2D image. Further, at least one user input manipulating the 3D image within the web page may be received. The at least one user input may include movement of a view point (or point of view) of a user relative to a display displaying the web page and/or detection of a beam projected from an end of the user input device intersecting with the 3D image.

IPC Classes

  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 15/20 - Perspective computation
  • G06F 16/9535 - Search customisation based on user profiles and personalisation

25.

Identifying replacement 3D images for 2D images via ranking criteria

      
Application Number 15947180
Grant Number 10523922
Status In Force
Filing Date 2018-04-06
First Publication Date 2019-10-10
Grant Date 2019-12-31
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.
  • Lelievre, Alexandre R.
  • Berman, Arthur L.
  • Yamada, Kevin S.

Abstract

Systems and methods for replacing a 2D image with an equivalent 3D image within a web page. Content of a 2D image displayed within a web page may be identified and 3D images may be identified as possible replacements of the 2D image. The 3D images may be ranked based on sets of ranking criteria. A 3D image with a highest-ranking value may be selected based on a ranking of the 3D images. The selected 3D image may be integrated into the web page, thereby replacing the 2D image with the selected 3D image. Further, a user input manipulating the 3D image within the web page may be received. The user input may include movement of a view point of a user relative to a display displaying the web page and/or detection of a beam projected from an end of a user input device intersecting with the 3D image.

IPC Classes

  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 15/20 - Perspective computation
  • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

26.

User input device camera

      
Application Number 16394406
Grant Number 11284061
Status In Force
Filing Date 2019-04-25
First Publication Date 2019-08-15
Grant Date 2022-03-22
Owner ZSPACE, INC. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Berman, Arthur L.
  • Tu, Jerome C.
  • Morishige, Kevin D.
  • Chavez, David A.

Abstract

Systems and methods for capturing a two dimensional (2D) image of a portion of a three dimensional (3D) scene may include a computer rendering a 3D scene on a display from a user's point of view (POV). A camera mode may be activated in response to user input and a POV of a camera may be determined. The POV of the camera may be specified by position and orientation of a user input device coupled to the computer, and may be independent of the user's POV. A 2D frame of the 3D scene based on the POV of the camera may be determined and the 2D image based on the 2D frame may be captured in response to user input. The 2D image may be stored locally or on a server of a network.
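Determining the 2D frame of the 3D scene from the camera's point of view is, at its core, a perspective projection. A minimal pinhole-camera sketch is below; it assumes the camera looks down +z from its tracked position with no rotation, which keeps the example short — a real stylus-mounted camera would also apply its tracked orientation as a rotation before projecting. All names here are illustrative.

```python
def project_point(point, cam_pos, focal):
    """Pinhole projection of a 3D scene point into the camera's 2D frame.
    Returns (u, v) image-plane coordinates, or None for points behind
    the camera, which fall outside the capturable 2D frame."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        return None  # behind the camera plane
    return (focal * x / z, focal * y / z)
```

Capturing the 2D image then amounts to projecting (or rasterizing) the visible scene through this camera, independently of the user's head-tracked point of view.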

IPC Classes

  • H04N 13/398 - Synchronisation thereof; Control thereof
  • H04N 13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
  • H04N 13/334 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
  • H04N 13/366 - Image reproducers using viewer tracking

27.

Transitioning between 2D and stereoscopic 3D webpage presentation

      
Application Number 16393450
Grant Number 10866820
Status In Force
Filing Date 2019-04-24
First Publication Date 2019-08-15
Grant Date 2020-12-15
Owner ZSPACE (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.

Abstract

Systems and methods for displaying a stereoscopic three-dimensional (3D) webpage overlay. User input may be received from a user input device, and in response to determining that the user input device is interacting with the 3D content, at least one of a plurality of render properties associated with the 3D content may be modified. The at least one render property may be incrementally modified over a specified period of time, thereby animating modification of the at least one render property.
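The incremental modification over a specified period — animating the render-property change rather than applying it in one jump — can be sketched with simple linear interpolation. The function name, linear easing, and timestep are assumptions for illustration.

```python
def animate_property(start, end, duration, dt):
    """Yield incrementally interpolated values of a render property
    (e.g. scale or opacity) over `duration` seconds at timestep `dt`,
    so the transition between 2D and stereoscopic 3D presentation
    appears as a smooth animation."""
    steps = max(1, round(duration / dt))
    for i in range(1, steps + 1):
        t = i / steps  # normalized progress in (0, 1]
        yield start + (end - start) * t
```

Each yielded value would be applied on a successive render frame; a non-linear easing curve could replace the linear `t` without changing the structure.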

IPC Classes

  • G06F 9/451 - Execution arrangements for user interfaces
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • H04N 13/286 - Image signal generators having separate monoscopic and stereoscopic modes
  • G06T 13/20 - 3D [Three Dimensional] animation
  • G06F 3/147 - Digital output to display device using display panels
  • G09G 5/08 - Cursor circuits
  • H04N 13/356 - Image reproducers having separate monoscopic and stereoscopic modes
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
  • H04N 13/366 - Image reproducers using viewer tracking
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G06T 15/50 - Lighting effects
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G09G 5/34 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators for rolling or scrolling
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

28.

3D User Interface—360-degree visualization of 2D webpage content

      
Application Number 16374100
Grant Number 10587871
Status In Force
Filing Date 2019-04-03
First Publication Date 2019-07-25
Grant Date 2020-03-10
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Baraf, Eduardo
  • Lelievre, Alexandre R.
  • Hosenpud, Jonathan J.

Abstract

Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace; for browsing the internet in a 3D/virtual reality workspace; and for transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages into 3D objects and/or stereoscopic output for display in the 3D workspace.

IPC Classes

  • H04N 13/398 - Synchronisation thereof; Control thereof
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 16/957 - Browsing optimisation, e.g. caching or content distillation
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

29.

Pi-cell polarization switch for a three dimensional display system

      
Application Number 16203048
Grant Number 10613405
Status In Force
Filing Date 2018-11-28
First Publication Date 2019-03-28
Grant Date 2020-04-07
Owner zSpace, Inc. (USA)
Inventor
  • Cheng, Hsienhui
  • Nguyen, Thanh-Son

Abstract

Techniques are disclosed relating to the transmission of data based on a polarization of a light signal. In some embodiments, data may include 3D video data for viewing by a user. Systems for transmitting data may include a display device and a device for switching the polarization of a video source. Systems for receiving data may include eyewear configured to present images with orthogonal polarization to each eye. In some embodiments, the rate of switching of the polarization switcher may introduce a distortion to the optical data. A Pi-cell device may be used in some embodiments to reduce distortion based on switching speed. In some embodiments, polarization switchers may introduce a distortion based on the frequency of transmitted light. In some embodiments, optical elements included in the transmitting or receiving devices may be configured to reduce distortions based on frequency.

IPC Classes

  • G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/26 - Other optical systems; Other optical apparatus for producing stereoscopic or other three-dimensional effects involving polarising means
  • H04N 13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
  • H04N 13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
  • H04N 13/366 - Image reproducers using viewer tracking
  • G02B 27/01 - Head-up displays
  • G02B 27/22 - Other optical systems; Other optical apparatus for producing stereoscopic or other three-dimensional effects
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

30.

Segmented backlight for dynamic contrast

      
Application Number 16159853
Grant Number 10338303
Status In Force
Filing Date 2018-10-15
First Publication Date 2019-02-14
Grant Date 2019-07-02
Owner zSpace, Inc. (USA)
Inventor
  • Nguyen, Thanh-Son
  • Cheng, Hsienhui

Abstract

Systems and methods for increasing dynamic contrast in a liquid crystal display (LCD) may include a segmented backlight that may include one or more segments and one or more sets of light emitting diodes (LEDs). Each set of LEDs may be configured to illuminate a corresponding segment, and each segment may include a notch(es) configured as a light barrier to reduce light leakage to non-adjacent segments. The notch(es) may be of variable length, depth, and width and may be three-dimensional, having a width that varies along the depth and length of the notch and a depth that varies along the width and length of the notch. In some embodiments, the notch(es) may be reflective, at least partially opaque, and/or blackened.
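The dynamic-contrast benefit of a segmented backlight comes from driving each segment according to the image content in front of it (local dimming): dark bands get a dim backlight while bright bands stay fully lit, and the notches limit leakage between segments. A minimal sketch of the per-segment drive computation follows; the peak-brightness policy and row-band layout are common local-dimming assumptions, not details from the patent.

```python
def segment_levels(frame_rows, n_segments):
    """Per-segment LED drive levels from image content: each backlight
    segment is driven to the brightest pixel (0..1) in its band of rows,
    so dark regions get a dimmer backlight and effective contrast rises."""
    band = max(1, len(frame_rows) // n_segments)
    levels = []
    for s in range(n_segments):
        rows = frame_rows[s * band:(s + 1) * band]
        peak = max((max(r) for r in rows), default=0.0)
        levels.append(peak)
    return levels
```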

IPC Classes

  • F21V 7/04 - Optical design
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • H04N 13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings

31.

3D user interface—non-native stereoscopic image conversion

      
Application Number 16157305
Grant Number 10623713
Status In Force
Filing Date 2018-10-11
First Publication Date 2019-02-07
Grant Date 2020-04-14
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Baraf, Eduardo
  • Lelievre, Alexandre R.
  • Hosenpud, Jonathan J.

Abstract

Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace; for browsing the internet in a 3D/virtual reality workspace; and for transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages into 3D objects and/or stereoscopic output for display in the 3D workspace.

IPC Classes

  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/293 - Generating mixed stereoscopic images; Generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/106 - Processing image signals
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 1/16 - Constructional details or arrangements
  • G02B 27/01 - Head-up displays
  • H04N 13/324 - Colour aspects
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

32.

Zero parallax drawing within a three dimensional display

      
Application Number 16003165
Grant Number 10739936
Status In Force
Filing Date 2018-06-08
First Publication Date 2018-10-11
Grant Date 2020-08-11
Owner zSpace, Inc. (USA)
Inventor
  • Ullmann, Peter F.
  • Champion, Clifford S.

Abstract

Systems and methods for digitally drawing on virtual 3D object surfaces using a 3D display system. A 3D drawing mode may be enabled and a display screen of the system may correspond to a zero parallax plane of a 3D scene that may present a plurality of surfaces at non-zero parallax planes. User input may be received at a location on the display screen, and in response, a surface may be specified, rendered, and displayed at the zero parallax plane. Further, additional user input on the display screen may be received specifying drawing motion across the rendered and displayed surface. The drawing motion may start at the location and continue across a boundary between the surface and another contiguous surface. Accordingly, in response to the drawing motion crossing the boundary, the contiguous surface may be rendered and displayed at the zero parallax plane along with results of the drawing motion.
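The role of the zero parallax plane above follows from standard stereo geometry: a point rendered at the display plane has identical left- and right-eye screen positions, so stylus input on the glass lands exactly on it. A worked sketch of the on-screen parallax by similar triangles (eye separation `e`, viewer distance `v`, point depth `d` behind the screen gives separation `e·d/(v+d)`) is below; the numeric values in the test are illustrative.

```python
def screen_parallax(eye_sep, viewer_dist, depth_behind_screen):
    """On-screen left/right image separation (same units as eye_sep) for a
    point at the given depth behind the display plane (negative = in front
    of it). Depth 0 gives zero parallax — which is why a surface is
    re-rendered at the display plane before drawing on it."""
    d = depth_behind_screen
    return eye_sep * d / (viewer_dist + d)
```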

IPC Classes

  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/366 - Image reproducers using viewer tracking
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • H04N 13/305 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
  • H04N 13/334 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
  • H04N 13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing

33.

Stereoscopic 3D webpage overlay

      
Application Number 15406390
Grant Number 10257500
Status In Force
Filing Date 2017-01-13
First Publication Date 2018-07-19
Grant Date 2019-04-09
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.

Abstract

Systems and methods for displaying a stereoscopic three-dimensional (3D) webpage overlay. In some embodiments, user input may be received from a user input device and, in response to determining that the user input device is not substantially concurrently interacting with the 3D content, the user input may be interpreted based on a 2D mode of interaction. In addition, the user input may be interpreted based on a 3D mode of interaction in response to determining that the user input device is substantially concurrently interacting with the 3D content. The 2D mode of interaction corresponds to a first visual cursor, such as a mouse cursor, and the 3D mode of interaction corresponds to a second visual cursor, such as a virtual beam rendered to extend from a tip of the user input device.

IPC Classes

  • G06F 3/033 - Pointing devices displaced or positioned by the user; Accessories therefor
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
  • G06T 11/60 - Editing figures and text; Combining figures or text
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

34.

Transitioning between 2D and stereoscopic 3D webpage presentation

      
Application Number 15406440
Grant Number 10324736
Status In Force
Filing Date 2017-01-13
First Publication Date 2018-07-19
Grant Date 2019-06-18
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Champion, Clifford S.

Abstract

Systems and methods for displaying a stereoscopic three-dimensional (3D) webpage overlay. In some embodiments, user input may be received from a user input device, and in response to determining that the user input device is substantially concurrently interacting with the 3D content, at least one of a plurality of render properties associated with the 3D content may be modified. In some embodiments, the at least one render property may be incrementally modified over a specified period of time, thereby animating modification of the at least one render property.

IPC Classes

  • G06F 9/44 - Arrangements for executing specific programs
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G06F 9/451 - Execution arrangements for user interfaces
  • H04N 13/286 - Image signal generators having separate monoscopic and stereoscopic modes
  • G06T 13/20 - 3D [Three Dimensional] animation
  • G06T 15/50 - Lighting effects
  • G06T 15/00 - 3D [Three Dimensional] image rendering

35.

Segmented backlight for dynamic contrast

      
Application Number 15872188
Grant Number 10146004
Status In Force
Filing Date 2018-01-16
First Publication Date 2018-07-19
Grant Date 2018-12-04
Owner zSpace, Inc. (USA)
Inventor
  • Nguyen, Thanh-Son
  • Cheng, Hsienhui

Abstract

Systems and methods for increasing dynamic contrast in a liquid crystal display (LCD) may include a segmented backlight that may include one or more segments and one or more sets of light emitting diodes (LEDs). Each set of LEDs may be configured to illuminate a corresponding segment, and each segment may include a notch(es) configured as a light barrier to reduce light leakage to non-adjacent segments. The notch(es) may be of variable length, depth, and width and may be three-dimensional, having a width that varies along the depth and length of the notch and a depth that varies along the width and length of the notch. In some embodiments, the notch(es) may be reflective, partially opaque, and/or blackened.

IPC Classes

  • F21V 7/04 - Optical design
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • H04N 13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings

36.

3D user interface—non-native stereoscopic image conversion

      
Application Number 15355333
Grant Number 10127715
Status In Force
Filing Date 2016-11-18
First Publication Date 2018-05-24
Grant Date 2018-11-13
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Baraf, Eduardo
  • Lelievre, Alexandre R.
  • Hosenpud, Jonathan J.

Abstract

Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace and for browsing the internet in a 3D/virtual reality workspace and transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages to the 3D workspace as 3D objects and/or stereoscopic output for display in the 3D workspace.

IPC Classes

  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 15/20 - Perspective computation
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • H04N 13/106 - Processing image signals
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/293 - Generating mixed stereoscopic images; Generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • H04N 13/324 - Colour aspects
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

37.

3D user interface

      
Application Number 15355240
Grant Number 11003305
Status In Force
Filing Date 2016-11-18
First Publication Date 2018-05-24
Grant Date 2021-05-11
Owner ZSPACE, INC. (USA)
Inventor
  • Champion, Clifford S.
  • Baraf, Eduardo
  • Lelievre, Alexandre R.
  • Hosenpud, Jonathan J.

Abstract

Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace and for browsing the internet in a 3D/virtual reality workspace and transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages to the 3D workspace as 3D objects and/or stereoscopic output for display in the 3D workspace.

IPC Classes

  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • H04N 1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
  • G06Q 30/02 - Marketing; Price estimation or determination; Fundraising
  • G06F 17/22 - Manipulating or registering by use of codes, e.g. in sequence of text characters
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • H04N 13/04 - Picture reproducers
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/356 - Image reproducers having separate monoscopic and stereoscopic modes
  • H04N 13/293 - Generating mixed stereoscopic images; Generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • G06F 16/957 - Browsing optimisation, e.g. caching or content distillation

38.

3D user interface—360-degree visualization of 2D webpage content

      
Application Number 15355298
Grant Number 10271043
Status In Force
Filing Date 2016-11-18
First Publication Date 2018-05-24
Grant Date 2019-04-23
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Baraf, Eduardo
  • Lelievre, Alexandre R.
  • Hosenpud, Jonathan J.

Abstract

Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace and for browsing the internet in a 3D/virtual reality workspace and transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages to the 3D workspace as 3D objects and/or stereoscopic output for display in the 3D workspace.

IPC Classes

  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G06F 17/30 - Information retrieval; Database structures therefor
  • H04N 13/275 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

39.

Integrating real world conditions into virtual imagery

      
Application Number 15298956
Grant Number 10019831
Status In Force
Filing Date 2016-10-20
First Publication Date 2018-04-26
Grant Date 2018-07-10
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Tu, Jerome C.

Abstract

Systems and methods for incorporating real world conditions into a three-dimensional (3D) graphics object are described herein. In some embodiments, images of a physical location of a user of a three-dimensional (3D) display system may be received from at least one camera and a data imagery map of the physical location may be determined based at least in part on the received images. The data imagery map may capture real world conditions associated with the physical location of the user. Instructions to render a 3D graphics object may be generated and the data imagery map may be incorporated into a virtual 3D scene comprising the 3D graphics object, thereby incorporating the real world conditions into virtual world imagery. In some embodiments, the data imagery map may include a light map, a sparse light field, and/or a depth map of the physical location.

IPC Classes

  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/04 - Picture reproducers
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 15/10 - Geometric effects
  • G06T 17/00 - 3D modelling for computer graphics

40.

Personal electronic device with a display system

      
Application Number 15223952
Grant Number 10019849
Status In Force
Filing Date 2016-07-29
First Publication Date 2018-02-01
Grant Date 2018-07-10
Owner zSpace, Inc. (USA)
Inventor
  • Berman, Arthur L.
  • Champion, Clifford S.
  • Chavez, David A.
  • Lopez-Fresquet, Francisco
  • Hosenpud, Jonathan J.
  • Kalnins, Robert D.
  • Lelievre, Alexandre R.
  • Sherman, Christopher W.
  • Tu, Jerome C.
  • Yamada, Kevin S.
  • Yeung, Chun Wun

Abstract

Systems and methods for interacting with a display system using a personal electronic device (PED). The display system may establish communication with and receive user input from the PED. The display system may use the received user input to generate and/or update content displayed on a display of the display system.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • H04N 13/04 - Picture reproducers
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/16 - Sound input; Sound output
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

41.

Pi-cell polarization switch for a three dimensional display system

      
Application Number 15650117
Grant Number 10180614
Status In Force
Filing Date 2017-07-14
First Publication Date 2018-01-18
Grant Date 2019-01-15
Owner zSpace, Inc. (USA)
Inventor
  • Cheng, Hsienhui
  • Nguyen, Thanh-Son

Abstract

Techniques are disclosed relating to the transmission of data based on a polarization of a light signal. In some embodiments, data may include 3D video data for viewing by a user. Systems for transmitting data may include a display device and a device for switching the polarization of a video source. Systems for receiving data may include eyewear configured to present images with orthogonal polarization to each eye. In some embodiments, the rate of switching of the polarization switcher may introduce a distortion to the optical data. A Pi-cell device may be used in some embodiments to reduce distortion based on switching speed. In some embodiments, polarization switchers may introduce a distortion based on the frequency of transmitted light. In some embodiments, optical elements included in the transmitting or receiving devices may be configured to reduce distortions based on frequency.

IPC Classes

  • G02F 1/139 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulatingNon-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering based on orientation effects in which the liquid crystal remains transparent
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/26 - Other optical systems; Other optical apparatus for producing stereoscopic or other three-dimensional effects involving polarising means
  • H04N 13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
  • H04N 13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/366 - Image reproducers using viewer tracking
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

42.

Stereoscopic display system using light field type data

      
Application Number 15407028
Grant Number 09848184
Status In Force
Filing Date 2017-01-16
First Publication Date 2017-05-04
Grant Date 2017-12-19
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Bell, Bruce J.
  • Lelievre, Alexandre R.
  • Tu, Jerome C.
  • Sherman, Christopher W.
  • Kalnins, Robert D.
  • Hosenpud, Jonathan J.
  • Lopez-Fresquet, Francisco
  • Champion, Clifford S.
  • Berman, Arthur L.

Abstract

Systems and methods for a head tracked stereoscopic display system that uses light field type data may include receiving light field type data corresponding to a scene. The stereoscopic display system may track a user's head. Using the received light field type data and the head tracking, the system may generate three dimensional (3D) virtual content that corresponds to a virtual representation of the scene. The stereoscopic display system may then present the 3D virtual content to a user. The stereoscopic display system may present a left eye perspective image and a right eye perspective image of the scene to the user based on the position and orientation of the user's head. The images presented to the user may be updated based on a change in the position or the orientation of the user's head or based on receiving user input.
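As an illustration of the per-eye viewpoints such a head-tracked system must derive before generating the left and right perspective images, here is a minimal sketch (the `IPD` constant and the flat-yaw head model are assumptions, not taken from the patent):

```python
import math

IPD = 0.063  # typical interpupillary distance in metres (assumed)

def eye_positions(head_pos, head_yaw_rad):
    """Left/right eye centres from a tracked head position and yaw.

    The display would render (or select light-field samples for) one
    perspective image per eye; this sketch only derives the two
    viewpoints from the tracked head pose.
    """
    # Eyes sit half an IPD to either side along the head's local x-axis.
    dx = math.cos(head_yaw_rad) * IPD / 2
    dz = math.sin(head_yaw_rad) * IPD / 2
    x, y, z = head_pos
    left = (x - dx, y, z - dz)
    right = (x + dx, y, z + dz)
    return left, right
```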

IPC Classes

  • H04N 13/04 - Picture reproducers
  • H04N 5/247 - Arrangement of television cameras
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

43.

Virtual plane in a stylus based stereoscopic display system

      
Application Number 14880007
Grant Number 09703400
Status In Force
Filing Date 2015-10-09
First Publication Date 2017-04-13
Grant Date 2017-07-11
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Berman, Arthur L.
  • Champion, Clifford S.
  • Chavez, David A.
  • Lopez-Fresquet, Francisco
  • Kalnins, Robert D.
  • Lelievre, Alexandre R.
  • Sherman, Christopher W.
  • Tu, Jerome C.
  • Venkat, Murugappan R.

Abstract

Virtual plane and use in a stylus based three dimensional (3D) stereoscopic display system. A virtual plane may be displayed in a virtual 3D space on a display of the 3D stereoscopic display system. The virtual plane may extend from a stylus of the 3D stereoscopic display system. Content may be generated in response to a geometric relationship of the virtual plane with at least one virtual object in the virtual 3D space. The generated content may indicate one or more attributes of the at least one virtual object. The content may be presented via the 3D stereoscopic display system.

IPC Classes

  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • H04N 13/04 - Picture reproducers

44.

Head tracked stereoscopic display system that uses light field type data

      
Application Number 14882989
Grant Number 09549174
Status In Force
Filing Date 2015-10-14
First Publication Date 2017-01-17
Grant Date 2017-01-17
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Bell, Bruce J.
  • Lelievre, Alexandre R.
  • Tu, Jerome C.
  • Sherman, Christopher W.
  • Kalnins, Robert D.
  • Hosenpud, Jonathan J.
  • Lopez-Fresquet, Francisco
  • Champion, Clifford S.
  • Berman, Arthur L.

Abstract

Systems and methods for a head tracked stereoscopic display system that uses light field type data may include receiving light field type data corresponding to a scene. The stereoscopic display system may track a user's head. Using the received light field type data and the head tracking, the system may generate three dimensional (3D) virtual content that corresponds to a virtual representation of the scene. The stereoscopic display system may then present the 3D virtual content to a user. The stereoscopic display system may present a left eye perspective image and a right eye perspective image of the scene to the user based on the position and orientation of the user's head. The images presented to the user may be updated based on a change in the position or the orientation of the user's head or based on receiving user input.

IPC Classes

45.

Head tracking eyewear system

      
Application Number 15187164
Grant Number 09706191
Status In Force
Filing Date 2016-06-20
First Publication Date 2016-10-06
Grant Date 2017-07-11
Owner zSpace, Inc. (USA)
Inventor
  • Tu, Jerome C
  • Chavez, David A.

Abstract

In some embodiments, a system for tracking with reference to a three-dimensional display system may include a display device, an image processor, a surface including at least three emitters, at least two sensors, and a processor. The display device may image, during use, a first stereo three-dimensional image. The surface may be positionable, during use, with reference to the display device. At least two of the sensors may detect, during use, light received from at least three of the emitters as light blobs. The processor may correlate, during use, the assessed referenced position of the detected light blobs such that a first position/orientation of the surface is assessed. The image processor may generate, during use, the first stereo three-dimensional image using the assessed first position/orientation of the surface with reference to the display. The image processor may generate, during use, a second stereo three-dimensional image using an assessed second position/orientation of the surface with reference to the display.

IPC Classes

46.

Non-linear navigation of a three dimensional stereoscopic display

      
Application Number 15172732
Grant Number 09554126
Status In Force
Filing Date 2016-06-03
First Publication Date 2016-09-29
Grant Date 2017-01-24
Owner zSpace, Inc. (USA)
Inventor
  • Dolim, Scott M.
  • Lopez-Fresquet, Cisco

Abstract

Systems and methods for navigating a 3D stereoscopic scene displayed via a 3D stereoscopic display system using user head tracking. A reference POV including a reference user head position and a reference user head orientation may be established. The user head POV may be tracked, including monitoring user head positional displacements and user head angular rotations relative to the reference POV. In response to the tracking, a camera POV used to render the 3D stereoscopic scene may be adjusted based on a non-linear mapping between changes in the camera POV and the user head positional displacements and user head angular rotations relative to the reference POV. The non-linear mapping may include a mapping of user head positional displacements relative to the reference POV to translational movements in the camera POV and a mapping of user head angular rotations relative to the reference POV to rotations in the camera POV.
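One simple non-linear mapping of the kind the abstract describes is a power-law curve: small head displacements track nearly 1:1 while larger ones are amplified, letting the user survey a scene wider than their physical range of motion. This is an illustrative sketch only; the `gain` and `exponent` parameters are assumptions, not values from the patent:

```python
def camera_offset(head_displacement, gain=2.0, exponent=1.5):
    """Map a head positional displacement (metres, relative to the
    reference POV) to a camera translation along the same axis.

    Sign-preserving power law: zero stays zero, and the magnification
    grows with the size of the displacement.
    """
    sign = 1.0 if head_displacement >= 0 else -1.0
    return sign * gain * abs(head_displacement) ** exponent
```

The same shape of curve could be applied per-axis to head angular rotations to obtain the rotational half of the mapping.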

IPC Classes

  • H04N 13/04 - Picture reproducers
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

47.

Modifying perspective of stereoscopic images based on changes in user viewpoint

      
Application Number 15074233
Grant Number 09684994
Status In Force
Filing Date 2016-03-18
First Publication Date 2016-09-15
Grant Date 2017-06-20
Owner zSpace, Inc. (USA)
Inventor
  • Vesely, Michael A.
  • Clemens, Nancy L.
  • Gray, Alan S.

Abstract

Modifying perspective of stereoscopic images provided by one or more displays based on changes in user viewpoint. The one or more displays may include a first display oriented substantially horizontally for displaying 3D horizontal perspective images and/or a second display oriented substantially vertically for displaying text or conventional images, such as 2D images or 3D vertical perspective images. The horizontal display surface may typically be positioned directly in front of the user, at about the height of a desktop surface, so that the user has roughly a 45° viewing angle. The vertical display surface may be positioned in front of the user, preferably behind and above the horizontal display surface.

IPC Classes

48.

Detection of partially obscured objects in three dimensional stereoscopic scenes

      
Application Number 15078326
Grant Number 09704285
Status In Force
Filing Date 2016-03-23
First Publication Date 2016-07-14
Grant Date 2017-07-11
Owner zSpace, Inc. (USA)
Inventor Vesely, Michael A.

Abstract

Systems and methods for user interface elements for use within a 3D scene. The 3D scene may be presented by at least one display, which includes displaying at least one stereoscopic image of the 3D scene by the display(s). The 3D scene may be presented according to a first viewpoint. One or more user interface elements may be used. The 3D scene may be updated in response to the use of the user interface elements.

IPC Classes

  • G06T 15/08 - Volume rendering
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/0485 - Scrolling or panning
  • H04N 13/04 - Picture reproducers

49.

Indirect 3D scene positioning control

      
Application Number 15075725
Grant Number 09864495
Status In Force
Filing Date 2016-03-21
First Publication Date 2016-07-14
Grant Date 2018-01-09
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Vesely, Michael A.

Abstract

Embodiments of the present invention generally relate to interacting with a virtual scene from a perspective that is independent of the perspective of the user. Methods and systems can include tracking and defining a perspective of the user based on the position and orientation of the user in the physical space, projecting a virtual scene for the user perspective to a virtual plane, tracking and defining a perspective of a freehand user input device based on the position and orientation of the freehand user input device, identifying a mark in the virtual scene that corresponds to the position and orientation of the device in the physical space, creating a virtual segment from the mark, and interacting with virtual objects in the virtual scene at the end point of the virtual segment, as controlled using the device.
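The virtual segment construction can be sketched as a point-along-ray computation (illustrative only; the unit-length `direction` vector derived from the tracked stylus orientation is an assumption):

```python
def segment_endpoint(mark, direction, length):
    """End point of the virtual segment extending from the mark along
    the device's pointing direction; per the abstract, interactions
    with virtual objects occur at this end point.

    `mark` and `direction` are 3-tuples; `direction` is assumed to be
    a unit vector in the virtual scene's coordinate frame.
    """
    return tuple(m + d * length for m, d in zip(mark, direction))
```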

IPC Classes

  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

50.

Presenting a view within a three dimensional scene

      
Application Number 14952397
Grant Number 09824485
Status In Force
Filing Date 2015-11-25
First Publication Date 2016-03-24
Grant Date 2017-11-21
Owner zSpace, Inc. (USA)
Inventor
  • Vesely, Michael A.
  • Gray, Alan S.

Abstract

Presenting a view based on a virtual viewpoint in a three dimensional (3D) scene. The 3D scene may be presented by at least one display, which includes displaying at least one stereoscopic image of the 3D scene by the display(s). The 3D scene may be presented according to a first viewpoint. A virtual viewpoint may be determined within the 3D scene that is different than the first viewpoint. The view of the 3D scene may be presented on the display(s) according to the virtual viewpoint and/or the first viewpoint. The presentation of the view of the 3D scene is performed concurrently with presenting the 3D scene.

IPC Classes

  • G06T 15/20 - Perspective computation
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/04 - Picture reproducers
  • H04N 13/02 - Picture signal generators

51.

Operations in a three dimensional display system

      
Application Number 13926200
Grant Number 09829996
Status In Force
Filing Date 2013-06-25
First Publication Date 2016-02-11
Grant Date 2017-11-28
Owner zSpace, Inc. (USA)
Inventor Hosenpud, Jonathan J.

Abstract

System and method for invoking 2D and 3D operational modes of a 3D pointing device in a 3D presentation system. A 3D stereoscopic scene and a two-dimensional (2D) scene are displayed concurrently via at least one stereoscopic display device. A current cursor position is determined based on a 6-degree-of-freedom 3D pointing device. The cursor is displayed concurrently with the 3D stereoscopic scene and the 2D scene, where the cursor operates in a 2D mode in response to being inside a specified volume, in which case the cursor is usable to interact with the 2D scene, and operates in a 3D mode in response to being outside the specified volume, in which case the cursor is usable to interact with the 3D stereoscopic scene.
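The volume test that switches the cursor between modes can be sketched as follows (an axis-aligned box is assumed purely for illustration; the patent does not fix the volume's shape):

```python
def cursor_mode(cursor_pos, volume_min, volume_max):
    """Return '2D' when the tracked cursor position lies inside the
    specified volume, else '3D', following the abstract above.

    `cursor_pos`, `volume_min`, `volume_max` are (x, y, z) tuples
    defining the point and the corners of an axis-aligned box.
    """
    inside = all(lo <= c <= hi
                 for c, lo, hi in zip(cursor_pos, volume_min, volume_max))
    return "2D" if inside else "3D"
```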

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

52.

Enhancing the coupled zone of a stereoscopic display

      
Application Number 14838189
Grant Number 09467685
Status In Force
Filing Date 2015-08-27
First Publication Date 2016-01-21
Grant Date 2016-10-11
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Berman, Arthur L.
  • Tu, Jerome C.
  • Morishige, Kevin D.

Abstract

Systems and methods for calibrating a three dimensional (3D) stereoscopic display system may include rendering a virtual model on a display of a 3D stereoscopic display system that may include a substantially horizontal display. The virtual model may be geometrically similar to a physical object placed at a location on the display. A vertex of the virtual model may be adjusted in response to user input. The adjustment may be such that the vertex of the virtual model is substantially coincident with a corresponding vertex of the physical object.
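The per-vertex adjustment loop can be hedged into a few lines: the user nudges a virtual-model vertex until it lies within some tolerance of the physical object's corresponding vertex. The tolerance, units, and function names below are illustrative assumptions, not taken from the patent:

```python
def adjust_vertex(virtual_vertex, delta):
    """Apply one user adjustment (dx, dy, dz) to a virtual-model vertex."""
    return tuple(v + d for v, d in zip(virtual_vertex, delta))

def is_coincident(virtual_vertex, physical_vertex, tol=0.5):
    """'Substantially coincident' modeled as within tol units on every axis."""
    return all(abs(a - b) <= tol for a, b in zip(virtual_vertex, physical_vertex))
```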

IPC Classes

  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • H04N 13/04 - Picture reproducers
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting

53.

User input device camera

      
Application Number 14792844
Grant Number 10321126
Status In Force
Filing Date 2015-07-07
First Publication Date 2016-01-14
Grant Date 2019-06-11
Owner zSpace, Inc. (USA)
Inventor
  • Hosenpud, Jonathan J.
  • Berman, Arthur L.
  • Tu, Jerome C.
  • Morishige, Kevin D.
  • Chavez, David A.

Abstract

Systems and methods for capturing a two dimensional (2D) image of a portion of a three dimensional (3D) scene may include a computer rendering a 3D scene on a display from a user's point of view (POV). A camera mode may be activated in response to user input and a POV of a camera may be determined. The POV of the camera may be specified by position and orientation of a user input device coupled to the computer, and may be independent of the user's POV. A 2D frame of the 3D scene based on the POV of the camera may be determined and the 2D image based on the 2D frame may be captured in response to user input. The 2D image may be stored locally or on a server of a network.
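Decoupling the capture camera's POV from the user's POV amounts to projecting the scene through the tracked pose of the input device. A minimal pinhole-projection sketch, assuming the device pose is already expressed as a position plus orthonormal camera axes (this is illustrative math, not zSpace's actual pipeline):

```python
def project_to_frame(point, cam_pos, cam_right, cam_up, cam_forward, focal=1.0):
    """Project a scene point into the 2D frame of a camera whose pose comes
    from the tracked input device. Returns (x, y) in the image plane, or
    None when the point is behind the camera."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    rel = tuple(p - c for p, c in zip(point, cam_pos))
    depth = dot(rel, cam_forward)          # distance along the view direction
    if depth <= 0:
        return None
    return (focal * dot(rel, cam_right) / depth,
            focal * dot(rel, cam_up) / depth)
```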

IPC Classes

  • H04N 13/398 - Synchronisation thereof; Control thereof
  • H04N 13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
  • H04N 13/334 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
  • H04N 13/366 - Image reproducers using viewer tracking

54.

Three dimensional display system and use

      
Application Number 14854458
Grant Number 09886102
Status In Force
Filing Date 2015-09-15
First Publication Date 2016-01-07
Grant Date 2018-02-06
Owner zSpace, Inc. (USA)
Inventor Hosenpud, Jonathan J.

Abstract

System and method for invoking 2D and 3D operational modes of a 3D pointing device in a 3D presentation system. A 3D stereoscopic scene and a 2-dimensional (2D) scene are displayed concurrently via at least one stereoscopic display device. A current cursor position is determined based on a 6-degree-of-freedom 3D pointing device. The cursor is displayed concurrently with the 3D stereoscopic scene and the 2D scene, where the cursor operates in a 2D mode in response to being inside a specified volume, where, in the 2D mode, the cursor is usable to interact with the 2D scene, and where the cursor operates in a 3D mode in response to being outside the specified volume, where, in the 3D mode, the cursor is usable to interact with the 3D stereoscopic scene.

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

55.

Liquid crystal variable drive voltage

      
Application Number 14838248
Grant Number 09958712
Status In Force
Filing Date 2015-08-27
First Publication Date 2015-12-24
Grant Date 2018-05-01
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Cheponis, Michael A.
  • Flynn, Mark F.

Abstract

A voltage may be provided to a liquid crystal addressable element as part of a liquid crystal device. The provided voltage may be reduced from a driven state to a relaxed state in a time period greater than 1 μs. The reduction may further be performed in less than 20 ms. The liquid crystal device may be a polarization switch, which in some embodiments may be a multi-segment polarization switch. In one embodiment, pulses of limited duration of a light source may be provided to the polarization switch. The manner of voltage reduction may reduce optical bounce of the liquid crystal device and may allow one or more of the pulses of the light source to be shifted later in time.
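The stated bounds (fall time above 1 μs but under 20 ms) can be sketched as a bounded linear ramp of the drive voltage; the step count and linear waveform shape are illustrative assumptions, not details from the patent:

```python
def ramp_down(v_driven, v_relaxed, fall_time_s, steps=8):
    """Reduce the drive voltage from the driven to the relaxed level over a
    fall time constrained to the abstract's window: > 1 us and < 20 ms."""
    if not 1e-6 < fall_time_s < 20e-3:
        raise ValueError("fall time must lie between 1 us and 20 ms")
    # Evenly spaced voltage samples from the driven level down to the relaxed level.
    return [v_driven + (v_relaxed - v_driven) * i / steps for i in range(steps + 1)]
```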

IPC Classes

  • G02F 1/133 - Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/34 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source
  • H04N 13/04 - Picture reproducers

56.

Network based 3D design and collaboration

      
Application Number 14837477
Grant Number 09342917
Status In Force
Filing Date 2015-08-27
First Publication Date 2015-12-24
Grant Date 2016-05-17
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Tu, Jerome C.
  • Thompson, Carola F.
  • Flynn, Mark F.
  • Twilleager, Douglas C.
  • Morishige, Kevin D.
  • Ullmann, Peter F.
  • Berman, Arthur L.

Abstract

In some embodiments, a system and/or method may include accessing three-dimensional (3D) imaging software on a remote server. The method may include accessing over a network a 3D imaging software package on a remote server using a first system. The method may include assessing, using the remote server, a capability of the first system to execute the 3D imaging software package. The method may include displaying an output of the 3D imaging software using the first system based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a first portion of the 3D imaging software using the remote server based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a second portion of the 3D imaging software using the first system based upon the assessed capabilities of the first system.
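The capability assessment can be read as a placement decision per portion of the software package: run a portion on the first system when its assessed capabilities suffice, otherwise on the remote server. The dictionary shapes and capability keys below are hypothetical:

```python
def plan_execution(client_caps, requirements):
    """For each portion of the 3D imaging package, choose 'client' when the
    assessed capabilities meet that portion's needs, else 'server'."""
    plan = {}
    for portion, needed in requirements.items():
        meets = all(client_caps.get(key, 0) >= minimum
                    for key, minimum in needed.items())
        plan[portion] = "client" if meets else "server"
    return plan
```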

IPC Classes

  • G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure
  • G06T 15/20 - Perspective computation
  • G06T 15/08 - Volume rendering
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04L 12/26 - Monitoring arrangements; Testing arrangements
  • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures

57.

3D design and collaboration over a network

      
Application Number 14837669
Grant Number 09286713
Status In Force
Filing Date 2015-08-27
First Publication Date 2015-12-17
Grant Date 2016-03-15
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Tu, Jerome C.
  • Thompson, Carola F.
  • Flynn, Mark F.
  • Twilleager, Douglas C.
  • Morishige, Kevin D.
  • Ullmann, Peter F.
  • Berman, Arthur L.

Abstract

In some embodiments, a system and/or method may include accessing three-dimensional (3D) imaging software on a remote server. The method may include accessing over a network a 3D imaging software package on a remote server using a first system. The method may include assessing, using the remote server, a capability of the first system to execute the 3D imaging software package. The method may include displaying an output of the 3D imaging software using the first system based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a first portion of the 3D imaging software using the remote server based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a second portion of the 3D imaging software using the first system based upon the assessed capabilities of the first system.

IPC Classes

  • G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
  • G06T 15/08 - Volume rendering
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure
  • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures

58.

Head tracking eyewear system

      
Application Number 14822384
Grant Number 09473763
Status In Force
Filing Date 2015-08-10
First Publication Date 2015-12-03
Grant Date 2016-10-18
Owner zSpace, Inc. (USA)
Inventor
  • Tu, Jerome C.
  • Chavez, David A.

Abstract

In some embodiments, a system for tracking with reference to a three-dimensional display system may include a display device, an image processor, a surface including at least three emitters, at least two sensors, and a processor. The display device may image, during use, a first stereo three-dimensional image. The surface may be positionable, during use, with reference to the display device. At least two of the sensors may detect, during use, light received from at least three of the emitters as light blobs. The processor may correlate, during use, the assessed referenced position of the detected light blobs such that a first position/orientation of the surface is assessed. The image processor may generate, during use, the first stereo three-dimensional image using the assessed first position/orientation of the surface with reference to the display. The image processor may generate, during use, a second stereo three-dimensional image using an assessed second position/orientation of the surface with reference to the display.
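Correlating a light blob seen by two sensors reduces, in the simplest reading, to triangulating two rays. A minimal closest-approach midpoint sketch (assumes calibrated sensors and non-parallel rays; the helper names are hypothetical):

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _along(origin, direction, t):
    return tuple(o + t * d for o, d in zip(origin, direction))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest approach between two sensor rays (origin +
    direction), a minimal stand-in for correlating one detected light blob."""
    w = _sub(o1, o2)
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w), _dot(d2, w)
    denom = a * c - b * b            # zero only when the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1, p2 = _along(o1, d1, t1), _along(o2, d2, t2)
    return tuple((x + y) / 2.0 for x, y in zip(p1, p2))
```

With at least three emitters triangulated this way, a rigid-body fit would then yield the surface's position and orientation.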

IPC Classes

59.

Modifying displayed images in the coupled zone of a stereoscopic display based on user comfort

      
Application Number 14257105
Grant Number 09681122
Status In Force
Filing Date 2014-04-21
First Publication Date 2015-10-22
Grant Date 2017-06-13
Owner zSpace, Inc. (USA)
Inventor
  • Wilson, Mark P.
  • Twilleager, Douglas C.
  • Borel, David J.

Abstract

Systems and methods for enhancement of a coupled zone of a 3D stereoscopic display. The method may include determining a size and a shape of the coupled zone. The coupled zone may include a physical volume specified by the user's visual depth of field with respect to screen position of the 3D stereoscopic display and the user's point of view. Content may be displayed at a first position with a virtual 3D space and the first position may correspond to a position within the coupled zone. It may be determined that the content is not contained in the coupled zone or is within a specified distance from a boundary of the coupled zone and, in response, display of the content may be adjusted such that the content has a second position in the virtual 3D space that corresponds to another position within the coupled zone.
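Modeling the coupled zone as a box makes the adjustment step a clamp: content that leaves the zone, or comes within a margin of its boundary, is moved back inside. The box shape and margin are simplifying assumptions; the abstract's zone depends on the user's depth of field and point of view:

```python
def keep_in_coupled_zone(pos, zone_min, zone_max, margin=0.0):
    """Clamp a content position back inside the (box-approximated) coupled
    zone, keeping at least `margin` away from each boundary."""
    return tuple(min(max(p, lo + margin), hi - margin)
                 for p, lo, hi in zip(pos, zone_min, zone_max))
```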

IPC Classes

  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • H04N 13/04 - Picture reproducers
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof

60.

Enhancing the coupled zone of a stereoscopic display

      
Application Number 14335455
Grant Number 09123171
Status In Force
Filing Date 2014-07-18
First Publication Date 2015-09-01
Grant Date 2015-09-01
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Berman, Arthur L.
  • Tu, Jerome C.
  • Morishige, Kevin D.

Abstract

Systems and methods for calibrating a three dimensional (3D) stereoscopic display system may include rendering a virtual object on a display of a 3D stereoscopic display system that may include a substantially horizontal display. The virtual object may be geometrically similar to a physical object placed at a location on the display. At least one dimension of the virtual object may be adjusted in response to user input. The adjustment may be such that the at least one dimension of the virtual object is approximately the same as a corresponding at least one dimension of the physical object.
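Unlike the vertex-matching variant above, this calibration adjusts a dimension; the matching step can be hedged as computing a scale factor plus an "approximately equal" check (tolerance illustrative):

```python
def scale_to_match(virtual_dim, physical_dim, tol=0.01):
    """Scale factor that makes a virtual-object dimension approximately
    equal to the measured physical dimension, plus a tolerance check."""
    factor = physical_dim / virtual_dim
    matched = abs(virtual_dim * factor - physical_dim) <= tol
    return factor, matched
```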

IPC Classes

  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06T 15/20 - Perspective computation
  • H04N 13/04 - Picture reproducers

61.

Three-dimensional tracking of a user control device in a volume

      
Application Number 14635654
Grant Number 09201568
Status In Force
Filing Date 2015-03-02
First Publication Date 2015-06-18
Grant Date 2015-12-01
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Paranjpe, Milind
  • Tu, Jerome C.

Abstract

Tracking objects presented within a stereo three-dimensional (3D) scene. The user control device may include one or more visually indicated points for at least one tracking sensor to track. The user control device may also include other position determining devices, for example, an accelerometer and/or gyroscope. Precise 3D coordinates of the stylus may be determined based on location information from the tracking sensor(s) and additional information from the other position determining devices. A stereo 3D scene may be updated to reflect the determined coordinates.
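Combining optical tracking with accelerometer/gyroscope data is classically done with a complementary-style blend; the fixed weight below is an illustrative stand-in for whatever filter the patent actually uses:

```python
def fuse(optical_pos, imu_pos, alpha=0.5):
    """Blend an optical position fix (accurate, lower rate) with an IMU
    dead-reckoning estimate (fast, but drifting). alpha weights the optical
    term; its value here is an assumption for illustration."""
    return tuple(alpha * o + (1 - alpha) * i
                 for o, i in zip(optical_pos, imu_pos))
```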

IPC Classes

  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06T 17/00 - 3D modelling for computer graphics
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • H04N 13/04 - Picture reproducers
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means

62.

System and methods for using modified driving waveforms to inhibit acoustic noise during driving of a liquid crystal polarization rotator

      
Application Number 14103002
Grant Number 09958695
Status In Force
Filing Date 2013-12-11
First Publication Date 2015-06-11
Grant Date 2018-05-01
Owner zSpace, Inc. (USA)
Inventor
  • Reznikov, Dmytro Yu
  • Flynn, Mark F.
  • Yeung, Chun Wun

Abstract

In some embodiments, a system and/or method may operate a liquid crystal device. The method may include increasing a voltage provided to a liquid crystal addressable element of the liquid crystal device to a driven level. Said increasing may be performed over a time period greater than 1 ms. The liquid crystal addressable element may be in a driven state at the driven level. The method may include reducing the provided voltage to a relaxed level. The liquid crystal addressable element may be in a relaxed state at the relaxed level. Said increasing the voltage over the time period to the driven level may result in a reduced acoustical noise associated with the provided voltage. In some embodiments, the liquid crystal device may include a three-dimensional (3D) display.
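The noise-reduction idea here, ramping the voltage up over more than 1 ms rather than stepping it, can be sketched symmetrically to the ramp-down case (the waveform shape and step count are assumptions):

```python
def ramp_up(v_relaxed, v_driven, rise_time_s, steps=8):
    """Raise the drive voltage to the driven level over a rise time that the
    abstract requires to exceed 1 ms, which it associates with reduced
    acoustic noise compared to an abrupt step."""
    if rise_time_s <= 1e-3:
        raise ValueError("rise time must exceed 1 ms")
    # Evenly spaced voltage samples from the relaxed level up to the driven level.
    return [v_relaxed + (v_driven - v_relaxed) * i / steps for i in range(steps + 1)]
```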

IPC Classes

  • G02B 27/26 - Other optical systems; Other optical apparatus for producing stereoscopic or other three-dimensional effects involving polarising means
  • H04N 13/04 - Picture reproducers

63.

System and methods for cloud based 3D design and collaboration

      
Application Number 14538300
Grant Number 09153069
Status In Force
Filing Date 2014-11-11
First Publication Date 2015-05-21
Grant Date 2015-10-06
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Tu, Jerome C.
  • Thompson, Carola F.
  • Flynn, Mark F.
  • Twilleager, Douglas C.
  • Morishige, Kevin D.
  • Ullmann, Peter F.
  • Berman, Arthur L.

Abstract

In some embodiments, a system and/or method may include accessing three-dimensional (3D) imaging software on a remote server. The method may include accessing over a network a 3D imaging software package on a remote server using a first system. The method may include assessing, using the remote server, a capability of the first system to execute the 3D imaging software package. The method may include displaying an output of the 3D imaging software using the first system based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a first portion of the 3D imaging software using the remote server based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a second portion of the 3D imaging software using the first system based upon the assessed capabilities of the first system.

IPC Classes

  • G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
  • G06T 15/20 - Perspective computation
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • H04L 29/08 - Transmission control procedure, e.g. data link level control procedure
  • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures

64.

Methods for automatically assessing user handedness in computer systems and the utilization of such information

      
Application Number 14072933
Grant Number 09841821
Status In Force
Filing Date 2013-11-06
First Publication Date 2015-05-07
Grant Date 2017-12-12
Owner zSpace, Inc. (USA)
Inventor
  • Tu, Jerome C.
  • Thompson, Carola F.
  • Flynn, Mark F.
  • Twilleager, Douglas C.
  • Chavez, David A.
  • Morishige, Kevin D.
  • Ullmann, Peter F.
  • Berman, Arthur L.

Abstract

In some embodiments, a system and/or method may assess handedness of a user of a system in an automated manner. The method may include displaying a 3D image on a display. The 3D image may include at least one object. The method may include tracking a position and an orientation of an input device in open space in relation to the 3D image. The method may include assessing a handedness of a user based on the position and the orientation of the input device with respect to at least one of the objects. In some embodiments, the method may include configuring at least a portion of the 3D image based upon the assessed handedness. The at least a portion of the 3D image may include interactive menus. In some embodiments, the method may include configuring at least a portion of an interactive hardware associated with the system based upon the assessed handedness.
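One hedged reading of the handedness heuristic: average which side of a displayed object the tracked input device tends to occupy. The coordinate convention (x increasing to the user's right) and the decision rule are assumptions for illustration, not the patented method:

```python
def assess_handedness(stylus_positions, object_x):
    """Guess handedness from the mean lateral position of the input device
    relative to a displayed object's x coordinate."""
    mean_x = sum(p[0] for p in stylus_positions) / len(stylus_positions)
    return "right" if mean_x > object_x else "left"
```

The result could then drive, e.g., which side interactive menus appear on.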

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

65.

Three button stylus

      
Application Number 29437600
Grant Number D0719162
Status In Force
Filing Date 2012-11-19
First Publication Date 2014-12-09
Grant Date 2014-12-09
Owner zSpace, Inc. (USA)
Inventor
  • Albers, Michael C.
  • Chavez, David A.
  • Yamada, Kevin S.
  • Vesely, Michael A.

66.

System and methods for cloud based 3D design and collaboration

      
Application Number 14085272
Grant Number 08903958
Status In Force
Filing Date 2013-11-20
First Publication Date 2014-12-02
Grant Date 2014-12-02
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Tu, Jerome C.
  • Thompson, Carola F.
  • Flynn, Mark F.
  • Twilleager, Douglas C.
  • Morishige, Kevin D.
  • Ullmann, Peter F.
  • Berman, Arthur L.

Abstract

In some embodiments, a system and/or method may include accessing three-dimensional (3D) imaging software on a remote server. The method may include accessing over a network a 3D imaging software package on a remote server using a first system. The method may include assessing, using the remote server, a capability of the first system to execute the 3D imaging software package. The method may include displaying an output of the 3D imaging software using the first system based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a first portion of the 3D imaging software using the remote server based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a second portion of the 3D imaging software using the first system based upon the assessed capabilities of the first system.

IPC Classes

  • G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • H04L 12/24 - Arrangements for maintenance or administration
  • G06F 12/00 - Accessing, addressing or allocating within memory systems or architectures

67.

Liquid crystal variable drive voltage

      
Application Number 14335708
Grant Number 09134556
Status In Force
Filing Date 2014-07-18
First Publication Date 2014-11-06
Grant Date 2015-09-15
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Cheponis, Michael A.
  • Flynn, Mark F.

Abstract

A voltage may be provided to a liquid crystal addressable element as part of a liquid crystal device. The provided voltage may be reduced from a driven state to a relaxed state in a time period greater than 1 μs. The reduction may further be performed in less than 20 ms. The liquid crystal device may be a polarization switch, which in some embodiments may be a multi-segment polarization switch. In one embodiment, pulses of limited duration of a light source may be provided to the polarization switch. The manner of voltage reduction may reduce optical bounce of the liquid crystal device and may allow one or more of the pulses of the light source to be shifted later in time.

IPC Classes

  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals
  • G09G 3/38 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using electrochromic devices
  • G02F 1/133 - Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements

68.

Non-linear navigation of a three dimensional stereoscopic display

      
Application Number 14257854
Grant Number 09380295
Status In Force
Filing Date 2014-04-21
First Publication Date 2014-10-23
Grant Date 2016-06-28
Owner zSpace, Inc. (USA)
Inventor
  • Dolim, Scott M.
  • Lopez-Fresquet, Cisco

Abstract

Systems and methods for navigating a 3D stereoscopic scene displayed via a 3D stereoscopic display system using user head tracking. A reference POV including a reference user head position and a reference user head orientation may be established. The user head POV may be tracked, including monitoring user head positional displacements and user head angular rotations relative to the reference POV. In response to the tracking, a camera POV used to render the 3D stereoscopic scene may be adjusted based on a non-linear mapping between changes in the camera POV and the user head positional displacements and user head angular rotations relative to the reference POV. The non-linear mapping may include a mapping of user head positional displacements relative to the reference POV to translational movements in the camera POV and a mapping of user head angular rotations relative to the reference POV to rotations in the camera POV.
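A non-linear head-to-camera mapping of this kind typically adds a dead zone and super-linear gain, so small head motions are ignored while larger ones are amplified; every constant below is illustrative, not from the patent:

```python
def nonlinear_gain(displacement, dead_zone=0.01, exponent=2.0, scale=4.0):
    """Map one axis of head displacement to a camera-POV change: no motion
    inside a small dead zone, then super-linear growth beyond it."""
    mag = abs(displacement)
    if mag <= dead_zone:
        return 0.0
    sign = 1.0 if displacement > 0 else -1.0
    return sign * scale * (mag - dead_zone) ** exponent
```

The same shape of mapping could apply per axis to both positional displacements and angular rotations relative to the reference POV.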

IPC Classes

  • H04N 13/04 - Picture reproducers
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

69.

Zero parallax drawing within a three dimensional display

      
Application Number 14257623
Grant Number 10019130
Status In Force
Filing Date 2014-04-21
First Publication Date 2014-10-23
Grant Date 2018-07-10
Owner zSpace, Inc. (USA)
Inventor
  • Ullmann, Peter F.
  • Champion, Clifford S.

Abstract

Systems and methods for digitally drawing on virtual 3D object surfaces using a 3D display system. A 3D drawing mode may be enabled and a display screen of the system may correspond to a zero parallax plane of a 3D scene that may present a plurality of surfaces at non-zero parallax planes. User input may be received at a location on the display screen, and in response, a surface may be specified, rendered, and displayed at the zero parallax plane. Further, additional user input on the display screen may be received specifying drawing motion across the rendered and displayed surface. The drawing motion may start at the location and continue across a boundary between the surface and another contiguous surface. Accordingly, in response to the drawing motion crossing the boundary, the contiguous surface may be rendered and displayed at the zero parallax plane along with results of the drawing motion.
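The surface-selection step can be sketched as a hit test in screen space followed by a depth shift that places the chosen surface at zero parallax. The surface records (flat screen-space bounds plus a single depth value) are a hypothetical simplification of real 3D surfaces:

```python
def bring_to_zero_parallax(surfaces, touch_point):
    """Find the surface whose screen-space bounds contain the touch location
    and return its id plus the depth offset that moves it to zero parallax."""
    x, y = touch_point
    for sid, (xmin, ymin, xmax, ymax, depth) in surfaces.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return sid, -depth   # shifting by -depth puts the surface at z = 0
    return None, 0.0
```

When the drawing motion crosses into a contiguous surface, the same lookup would fire again for the new surface, which is then re-rendered at the zero parallax plane.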

IPC Classes

  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/366 - Image reproducers using viewer tracking
  • H04N 13/398 - Synchronisation thereof; Control thereof
  • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • H04N 13/305 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
  • H04N 13/334 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
  • H04N 13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/04 - Picture reproducers

70.

Presenting a view within a three dimensional scene

      
Application Number 14268613
Grant Number 09202306
Status In Force
Filing Date 2014-05-02
First Publication Date 2014-08-28
Grant Date 2015-12-01
Owner zSpace, Inc. (USA)
Inventor
  • Vesely, Michael A.
  • Gray, Alan S.

Abstract

Presenting a view based on a virtual viewpoint in a three dimensional (3D) scene. The 3D scene may be presented by at least one display, which includes displaying at least one stereoscopic image of the 3D scene by the display(s). The 3D scene may be presented according to a first viewpoint. A virtual viewpoint may be determined within the 3D scene that is different than the first viewpoint. The view of the 3D scene may be presented on the display(s) according to the virtual viewpoint and/or the first viewpoint. The presentation of the view of the 3D scene is performed concurrently with presenting the 3D scene.

IPC Classes  ?

  • G06T 15/20 - Perspective computation
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

71.

Liquid crystal variable drive voltage

      
Application Number 13110562
Grant Number 08786529
Status In Force
Filing Date 2011-05-18
First Publication Date 2014-07-22
Grant Date 2014-07-22
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Cheponis, Michael A.
  • Flynn, Mark F.

Abstract

A voltage may be provided to a liquid crystal addressable element as part of a liquid crystal device. The provided voltage may be reduced from a driven state to a relaxed state in a time period greater than 1 μs. The reduction may further be performed in less than 20 ms. The liquid crystal device may be a polarization switch, which in some embodiments may be a multi-segment polarization switch. In one embodiment, pulses of limited duration of a light source may be provided to the polarization switch. The manner of voltage reduction may reduce optical bounce of the liquid crystal device and may allow one or more of the pulses of the light source to be shifted later in time.
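The controlled reduction described above can be sketched as a sampled voltage ramp. This is an illustrative helper, not the patented drive circuit: the claims only bound the reduction between 1 μs and 20 ms, and the linear profile and sample period chosen here are assumptions.

```python
import numpy as np

def relaxation_ramp(v_driven, v_relaxed, ramp_time_s, dt_s=1e-6):
    """Linearly step the drive voltage from the driven state down to the
    relaxed state over ramp_time_s seconds, sampled every dt_s seconds.
    (Hypothetical helper: the linear profile is illustrative; only the
    1 us-to-20 ms bound on the reduction comes from the abstract.)"""
    assert 1e-6 < ramp_time_s < 20e-3, "reduction must take >1 us and <20 ms"
    n = max(2, int(round(ramp_time_s / dt_s)))
    return np.linspace(v_driven, v_relaxed, n)

# Relax a 5 V driven element to 0 V over 5 ms.
ramp = relaxation_ramp(5.0, 0.0, 5e-3)
```

A gradual ramp of this kind, rather than an instantaneous drop, is what allows the optical bounce reduction and the later-shifted light-source pulses the abstract describes.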

IPC Classes  ?

  • G09G 3/36 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix by control of light from an independent source using liquid crystals

72.

Three-dimensional collaboration

      
Application Number 13867245
Grant Number 09595127
Status In Force
Filing Date 2013-04-22
First Publication Date 2013-09-12
Grant Date 2017-03-14
Owner zSpace, Inc. (USA)
Inventor
  • Champion, Clifford S.
  • Hosenpud, Jonathan J.
  • Vesely, Michael A.

Abstract

Remote collaboration of a subject and a graphics object in a same view of a 3D scene. In one embodiment, one or more cameras of a collaboration system may be configured to capture images of a subject and track the subject (e.g., head of a user, other physical object). The images may be processed and provided to another collaboration system along with a determined viewpoint of the user. The other collaboration system may be configured to render and project the captured images and a graphics object in the same view of a 3D scene.

IPC Classes  ?

  • H04N 15/00 - Stereoscopic colour television systems; Details thereof
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • H04N 7/14 - Systems for two-way working
  • H04N 7/15 - Conference systems
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/04 - Picture reproducers
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06T 7/20 - Analysis of motion
  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  • G06K 9/32 - Aligning or centering of the image pick-up or image-field
  • G06K 9/46 - Extraction of features or characteristics of the image

73.

Head tracking eyewear system

      
Application Number 13679630
Grant Number 09106903
Status In Force
Filing Date 2012-11-16
First Publication Date 2013-05-23
Grant Date 2015-08-11
Owner zSpace, Inc. (USA)
Inventor
  • Tu, Jerome C.
  • Chavez, David A.

Abstract

In some embodiments, a system for tracking with reference to a three-dimensional display system may include a display device, an image processor, a surface including at least three emitters, at least two sensors, and a processor. The display device may image, during use, a first stereo three-dimensional image. The surface may be positionable, during use, with reference to the display device. At least two of the sensors may detect, during use, light received from at least three of the emitters as light blobs. The processor may correlate, during use, the assessed referenced position of the detected light blobs such that a first position/orientation of the surface is assessed. The image processor may generate, during use, the first stereo three-dimensional image using the assessed first position/orientation of the surface with reference to the display. The image processor may generate, during use, a second stereo three-dimensional image using an assessed second position/orientation of the surface with reference to the display.
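Recovering an emitter's position from blobs seen by two or more sensors reduces to ray intersection. The sketch below is standard least-squares multi-sensor triangulation under the assumption that each sensor yields a ray (origin plus direction toward the blob); the patent does not spell out this exact computation.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares 3D point closest to a bundle of rays, each defined by
    a sensor origin and a direction toward one detected light blob."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        # Projector onto the plane normal to the ray; summing these sets up
        # the normal equations for the point minimizing distance to all rays.
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ np.asarray(o, float)
    return np.linalg.solve(A, b)

# Two sensors at different origins both sight an emitter at (0.5, 1.0, 2.0).
emitter = np.array([0.5, 1.0, 2.0])
sensors = [np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])]
estimate = triangulate(sensors, [emitter - s for s in sensors])
```

Repeating this for at least three emitters on the tracked surface yields three 3D points, which is enough to assess the surface's position and orientation.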

IPC Classes  ?

74.

Extended overdrive tables and use

      
Application Number 13597499
Grant Number 09161025
Status In Force
Filing Date 2012-08-29
First Publication Date 2013-02-28
Grant Date 2015-10-13
Owner zSpace, Inc. (USA)
Inventor Flynn, Mark F.

Abstract

System and method for video processing. At least one overdrive (OD) look-up table (LUT) is provided, where the at least one OD LUT is dependent on input video levels and at least one parameter indicative of at least one attribute of the system or a user of the system. Video levels for a plurality of pixels for an image are received, as well as the at least one parameter. Overdriven video levels are generated via the at least one OD LUT based on the video levels and the at least one parameter. The overdriven video levels are provided to a display device for display of the image. The reception of video levels and at least one parameter, the generation of overdriven video levels, and the provision of overdriven video levels, may be repeated one or more times in an iterative manner to display a sequence of images.
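The key idea above is a lookup table indexed not only by previous and target video levels but also by an extra parameter. The table contents below are synthetic placeholders (real OD tables are measured per panel), and the quarter-delta boost rule is purely illustrative; only the indexing scheme reflects the abstract.

```python
import numpy as np

# Hypothetical extended OD table indexed as
# od_lut[parameter, previous_level, target_level] -> overdriven level.
# The synthetic rule boosts each transition by a quarter of the level
# change, scaled by the parameter index (e.g., a temperature bin).
levels = np.arange(256)
delta = levels[None, None, :] - levels[None, :, None]     # target - previous
boost = (delta // 4) * (1 + np.arange(3)[:, None, None])  # parameter-scaled
od_lut = np.clip(levels[None, None, :] + boost, 0, 255)

def overdrive(frame, prev_frame, param_idx):
    """Per-pixel lookup of overdriven video levels for one image."""
    return od_lut[param_idx, prev_frame, frame]

# A 0 -> 128 rising transition at parameter bin 0.
out = overdrive(np.array([128]), np.array([0]), 0)
```

Iterating `overdrive` over successive frames, feeding each frame back in as `prev_frame`, matches the repeated receive/generate/provide loop the abstract describes.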

IPC Classes  ?

  • H04N 13/04 - Picture reproducers
  • H04N 5/202 - Gamma control
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

75.

Optimizing stereo video display

      
Application Number 13481243
Grant Number 09736466
Status In Force
Filing Date 2012-05-25
First Publication Date 2012-11-29
Grant Date 2017-08-15
Owner zSpace, Inc. (USA)
Inventor Flynn, Mark F.

Abstract

System and method for video processing. First video levels for pixels for a left image of a stereo image pair are received from a GPU. Gamma corrected video levels (g-levels) are generated via a gamma look-up table (LUT) based on the first video levels. Outputs of the gamma LUT are constrained by minimum and/or maximum values, thereby excluding values for which corresponding post-OD display luminance values differ from static display luminance values by more than a specified error. Overdriven video levels are generated via a left OD LUT based on the g-levels. The overdriven video levels correspond to display luminance values that differ from corresponding static display luminance values by less than the error threshold, and are provided to a display device for display of the left image. This process is repeated for second video levels for a right image of the stereo image pair, using a right OD LUT.
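The constrained gamma stage can be sketched as a LUT whose outputs are clipped to a safe range before the per-eye overdrive lookup. The bounds and the 2.2 exponent below are illustrative placeholders, not measured panel values; in the patented system the range would be derived from the luminance-error threshold.

```python
import numpy as np

# Gamma LUT with outputs clipped to [g_min, g_max] so the downstream
# overdrive stage can always reach the requested luminance within the
# allowed error (bounds here are assumed for illustration).
g_min, g_max = 8, 247
gamma_lut = np.clip(
    np.round(255.0 * (np.arange(256) / 255.0) ** (1 / 2.2)),
    g_min, g_max,
).astype(np.uint8)

def gamma_correct(levels):
    """Map input video levels to constrained gamma-corrected g-levels."""
    return gamma_lut[levels]
```

The resulting g-levels would then feed the left OD LUT for the left image and the right OD LUT for the right image of each stereo pair.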

IPC Classes  ?

  • H04N 13/04 - Picture reproducers
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof

76.

Tightly coupled interactive stereo display

      
Application Number 13300424
Grant Number 09354718
Status In Force
Filing Date 2011-11-18
First Publication Date 2012-06-28
Grant Date 2016-05-31
Owner zSpace, Inc. (USA)
Inventor
  • Vesely, Michael A.
  • Chavez, David A.
  • Twilleager, Douglas C.

Abstract

Modifying perspective of stereoscopic images provided by one or more displays based on changes in user view, user control, and/or display status. A display system may include a housing, a display comprised in the housing, and one or more tracking sensors comprised in the housing. The one or more tracking sensors may be configured to sense user view and/or user control position and orientation information. The one or more tracking sensors may be associated with a position and orientation of the display. The user view and/or user control position and orientation information may be used in generating the rendered left and right eye images for display.

IPC Classes  ?

  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • H04N 13/02 - Picture signal generators
  • H04N 13/04 - Picture reproducers
  • G06T 15/10 - Geometric effects
  • G06T 17/00 - 3D modelling for computer graphics
  • G06T 17/20 - Wire-frame description, e.g. polygonalisation or tessellation
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

77.

Three-dimensional tracking of a user control device in a volume

      
Application Number 13333299
Grant Number 08970625
Status In Force
Filing Date 2011-12-21
First Publication Date 2012-06-28
Grant Date 2015-03-03
Owner zSpace, Inc. (USA)
Inventor
  • Chavez, David A.
  • Paranjpe, Milind
  • Tu, Jerome C.

Abstract

Tracking objects presented within a stereo three-dimensional (3D) scene. The user control device may include one or more visually indicated points for at least one tracking sensor to track. The user control device may also include other position determining devices, for example, an accelerometer and/or gyroscope. Precise 3D coordinates of the user control device (e.g., a stylus) may be determined based on location information from the tracking sensor(s) and additional information from the other position determining devices. A stereo 3D scene may be updated to reflect the determined coordinates.
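Combining the optical fix with the inertial estimate is a sensor-fusion step the abstract leaves open. A minimal sketch, assuming a simple complementary blend (the weight `alpha` and the function itself are hypothetical, not from the patent):

```python
def fuse_position(optical, inertial, alpha=0.98):
    """Complementary blend of a drift-free but low-rate/noisy optical fix
    with a smooth but drifting inertially integrated estimate.
    alpha is an assumed tuning weight favoring the inertial estimate
    between optical updates."""
    return tuple(alpha * i + (1 - alpha) * o for i, o in zip(inertial, optical))

# Blend an inertial estimate toward the latest optical fix.
fused = fuse_position((1.0, 2.0, 3.0), (1.1, 2.1, 3.1))
```

A production tracker would more likely use a Kalman-style filter, but the complementary form shows how the two information sources named in the abstract can be merged per coordinate.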

IPC Classes  ?

  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • H04N 13/04 - Picture reproducers
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
  • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks

78.

Tools for use within a three dimensional scene

      
Application Number 13182305
Grant Number 08643569
Status In Force
Filing Date 2011-07-13
First Publication Date 2012-01-19
Grant Date 2014-02-04
Owner zSpace, Inc. (USA)
Inventor Vesely, Michael A.

Abstract

Tools for use within a 3D scene. The 3D scene may be presented by at least one display, which includes displaying at least one stereoscopic image of the 3D scene by the display(s). The 3D scene may be presented according to a first viewpoint. User input may be received to the 3D scene using one or more tools. The 3D scene may be updated in response to the use of the one or more tools.

IPC Classes  ?

  • G06T 15/00 - 3D [Three Dimensional] image rendering

79.

Presenting a view within a three dimensional scene

      
Application Number 12797958
Grant Number 08717360
Status In Force
Filing Date 2010-06-10
First Publication Date 2011-08-04
Grant Date 2014-05-06
Owner zSpace, Inc. (USA)
Inventor
  • Vesely, Michael A.
  • Gray, Alan S.

Abstract

Presenting a view based on a virtual viewpoint in a three dimensional (3D) scene. The 3D scene may be presented by at least one display, which includes displaying at least one stereoscopic image of the 3D scene by the display(s). The 3D scene may be presented according to a first viewpoint. A virtual viewpoint may be determined within the 3D scene that is different than the first viewpoint. The view of the 3D scene may be presented on the display(s) according to the virtual viewpoint and/or the first viewpoint. The presentation of the view of the 3D scene is performed concurrently with presenting the 3D scene.

IPC Classes  ?

80.

Modifying perspective of stereoscopic images based on changes in user viewpoint

      
Application Number 13019384
Grant Number 08717423
Status In Force
Filing Date 2011-02-02
First Publication Date 2011-05-26
Grant Date 2014-05-06
Owner zSpace, Inc. (USA)
Inventor
  • Vesely, Michael A.
  • Clemens, Nancy L.
  • Gray, Alan S.

Abstract

Modifying perspective of stereoscopic images provided by one or more displays based on changes in user viewpoint. The one or more displays may include a first display that is provided substantially horizontal for displaying 3D horizontal perspective images and/or a second display that is provided substantially vertical for displaying text or conventional images such as 2D images, or 3D vertical perspective images. The horizontal display surface may be typically positioned directly in front of the user, and at a height of about a desktop surface so that the user can have about a 45° looking angle. The vertical display surface may be positioned in front of the user and preferably behind and above the horizontal display surface.

IPC Classes  ?

81.

Three dimensional horizontal perspective workstation

      
Application Number 11429829
Grant Number 07907167
Status In Force
Filing Date 2006-05-08
First Publication Date 2006-11-09
Grant Date 2011-03-15
Owner ZSPACE, INC. (USA)
Inventor
  • Vesely, Michael A.
  • Clemens, Nancy L.

Abstract

The present invention discloses a horizontal perspective workstation comprising at least two display surfaces, one being substantially horizontal for displaying 3D horizontal perspective images, and one being substantially vertical for text or conventional images such as 2D images, or central perspective images. The horizontal display surface is typically positioned directly in front of the user, and at a height of about a desktop surface so that the user can have about a 45° looking angle. The vertical display surface is also positioned in front of the user and preferably behind and above the horizontal display surface.

IPC Classes  ?

  • H04N 13/04 - Picture reproducers
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

82.

Brain balancing by binaural beat

      
Application Number 11292376
Grant Number 07769439
Status In Force
Filing Date 2005-11-28
First Publication Date 2006-06-01
Grant Date 2010-08-03
Owner ZSPACE, INC. (USA)
Inventor
  • Vesely, Michael A.
  • Clemens, Nancy

Abstract

A method and apparatus for balancing the left and right sides of the brain using a binaural beat is disclosed. The disclosed apparatus comprises an electroencephalographic (EEG) system to measure the brain's left and right electrical signals, and an audio generator to generate a binaural beat to compensate for the unbalanced EEG frequencies. The disclosed method includes measuring the brain wave frequency spectrum of the individual, selecting the frequency exhibiting imbalanced behavior, and generating a binaural beat of that frequency. The binaural beat can be continuous or intermittent.
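Generating a binaural beat is straightforward: play two tones whose frequencies differ by the desired beat frequency, one per ear. The carrier frequency and sample rate below are illustrative assumptions; in the patented system the beat frequency would be driven by the measured EEG imbalance.

```python
import numpy as np

def binaural_beat(carrier_hz, beat_hz, duration_s, rate=44100):
    """Left/right sine tones whose frequencies differ by beat_hz; the
    listener perceives the difference as a beat at beat_hz."""
    t = np.arange(int(duration_s * rate)) / rate
    left = np.sin(2 * np.pi * carrier_hz * t)
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    return left, right

# A 10 Hz beat on a 200 Hz carrier, one second long.
left, right = binaural_beat(200.0, 10.0, 1.0)
```

The two channels would be routed to the left and right ears separately (e.g., via headphones), since the beat arises from binaural interaction rather than acoustic mixing.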

IPC Classes  ?

  • A61B 5/04 - Measuring bioelectric signals of the body or parts thereof

83.

Multi-plane horizontal perspective display

      
Application Number 11141649
Grant Number 07796134
Status In Force
Filing Date 2005-05-31
First Publication Date 2005-12-01
Grant Date 2010-09-14
Owner ZSPACE, INC. (USA)
Inventor
  • Vesely, Michael A.
  • Clemens, Nancy L.

Abstract

The present invention discloses a multi-plane, three-dimensional display system comprising at least two display surfaces, one of which displays three-dimensional horizontal perspective images. Further, the display surfaces can have a curvilinear blending display section to merge the various images. The multi-plane display system can comprise various camera eyepoints, one for the horizontal perspective images, and optionally one for the curvilinear blending display surface.

IPC Classes  ?