
EE3001 – Project

Interactive 3-D Modeling and Simulation Device

1. Tentative Plan and Blueprint:


• Image/video acquisition via a sensor device: either a peripheral or a depth sensor (example: camera, infrared detector)
• 3-D modeling of the acquired image/video at a set frame-per-second rate
• FULL body motion capture capability
• “No controller – you are the controller!”
• Interactive modeling – real-time rendering (a minimal loop sketch follows this list)
• Simulation modeling – controlled rendering
• Complex GPU system
• In-built character and environment editor
• *Features such as face recognition and voice control
• Graphics engine design (for simulation and a gaming option)
• Environment and object rendering
• Display output on an LCD or virtual-reality glasses
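
A minimal sketch of how the acquire → model → render loop above could be structured. SensorDevice, update_skeleton and Renderer are placeholder names invented for illustration and are not part of the actual design:

    # Hypothetical skeleton of the acquire -> model -> render loop described
    # in this section. All class/function names are invented for illustration.
    import time

    TARGET_FPS = 30  # assumed acquisition/rendering rate


    class SensorDevice:
        """Stand-in for the peripheral/depth sensor (camera or infrared)."""

        def read_frame(self):
            # A real device would return an RGB image plus a depth map here.
            return {"rgb": None, "depth": None}


    def update_skeleton(frame):
        """Placeholder for full-body motion capture on one frame."""
        return {"joints": []}


    class Renderer:
        """Placeholder for the graphics engine (interactive or simulation mode)."""

        def draw(self, skeleton):
            pass


    def main_loop():
        sensor, renderer, period = SensorDevice(), Renderer(), 1.0 / TARGET_FPS
        while True:
            start = time.time()
            frame = sensor.read_frame()        # image/video acquisition
            skeleton = update_skeleton(frame)  # 3-D modeling / motion capture
            renderer.draw(skeleton)            # real-time rendering
            # Sleep the remainder of the frame period to hold the target rate.
            time.sleep(max(0.0, period - (time.time() - start)))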

2. Software and Hardware:


• Database system to record modeling data sent via Wi-Fi from the peripheral device or sensors (a minimal schema sketch follows this list)
• Concept design of the software and the integrated hardware
• 3-D modeling software to generate the virtual reality
• Environment-rendering software
• Peripheral device or depth sensor (camera or infrared)
• Hardware details
• Graphics engine
• In-built multi-array microphone
• Software to contain an environment-editor mode
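
A minimal sketch of one possible database layout, assuming SQLite and an invented two-table design (frames plus per-joint positions); the table and column names are illustrative only:

    # Hypothetical database layout for recorded motion-capture data.
    # Table/column names are assumptions for illustration, not a fixed design.
    import sqlite3

    def create_schema(path="mocap.db"):
        con = sqlite3.connect(path)
        con.executescript("""
            CREATE TABLE IF NOT EXISTS frames (
                frame_id    INTEGER PRIMARY KEY,
                session_id  TEXT NOT NULL,        -- one capture session
                captured_at REAL NOT NULL         -- UNIX timestamp from the sensor
            );
            CREATE TABLE IF NOT EXISTS joints (
                frame_id INTEGER REFERENCES frames(frame_id),
                name     TEXT NOT NULL,           -- e.g. 'head', 'left_hand'
                x REAL, y REAL, z REAL            -- position in sensor space (m)
            );
        """)
        con.commit()
        return con

    if __name__ == "__main__":
        create_schema()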

3. Software Technical DETAILS and Efficiency Checks:
• Implementation or development of a graphics engine able to render the environment or modify it for simulation
• Which modeling software to use to process the received data
• Which database to use to store the received data
• How to transmit input signals, e.g. Wi-Fi, wired, or other technology (a minimal transmission sketch follows this list)
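
A minimal sketch of one transmission option (UDP over Wi-Fi), assuming JSON-encoded joint data and an invented receiver address; this is an illustration, not the chosen protocol:

    # Hypothetical sender: pushes one motion-capture frame per packet over UDP.
    # The receiver address and the JSON payload layout are assumptions.
    import json
    import socket

    RECEIVER = ("192.168.1.50", 9000)  # assumed address of the base station

    def send_frame(frame_id, joints, sock=None):
        """Send one frame's joint positions as a JSON datagram."""
        sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        payload = json.dumps({"frame": frame_id, "joints": joints}).encode("utf-8")
        sock.sendto(payload, RECEIVER)

    if __name__ == "__main__":
        send_frame(1, {"head": [0.0, 1.6, 2.1], "left_hand": [0.3, 1.1, 2.0]})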

4. Hardware Technical DETAILS:
• Concept design of the hardware
• How the software is integrated with it
• Which technology the peripheral sensor device uses:
  • Infrared sensor
  • Camera with stereoscopic vision (a depth-from-disparity sketch follows this list)
  • Depth sensor
  • *Monochrome CMOS sensor
• Design of an integrated circuit to support all the hardware and link it to the database
• Transmission of data via which medium
• To provide FULL body motion capture, face recognition and voice recognition
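
For the stereoscopic-camera option, depth can be recovered from disparity with the standard pinhole relation Z = f·B/d. A minimal sketch; the focal length and baseline below are example values chosen only for illustration:

    # Depth from stereo disparity: Z = f * B / d (pinhole camera model).
    # The focal length and baseline are made-up example values.
    def depth_from_disparity(disparity_px, focal_px=600.0, baseline_m=0.075):
        """Return depth in metres for a disparity given in pixels."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    if __name__ == "__main__":
        # A 15-pixel disparity with these parameters puts the point at 3 m.
        print(depth_from_disparity(15.0))  # 600 * 0.075 / 15 = 3.0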

5. Display OUTPUT DETAILS:


• Processed data to be projected on different media
• LCD or *interactive projection
• Virtual-reality glasses
• Different operating modes of the display feature

6. Target Market (DETAILS):


• Gaming industry
• Animation industry (movie/entertainment industry)
• Engineering firms (motion-capture capability)

7. Financial Details (Production/Manufacture Cost):
• Cost of research
• Cost of actual production
• Per-unit cost

8. Projected Design OVERVIEW (Diagrammatic):
• Consists of a blueprint of the actual model design with all the components of the project linked together
• Overall flow of the device's functionality shown with the help of a flowchart
• Basic structure of Input → Output

9. Quality Assurance and Quality Control (QA/QC):
• Basic quality testing of the device
• Stress testing, including in-depth testing of each and every software and hardware function (a minimal test sketch follows this list)
• Testing to be conducted under different conditions
• Automated control of bug and error fixing
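
A minimal sketch of what one automated check could look like, reusing the hypothetical depth_from_disparity helper sketched under the hardware details above; the expected values are illustrative only:

    # Hypothetical automated test: exercises the depth conversion helper
    # under normal and invalid conditions. Values are illustrative only.
    import unittest

    def depth_from_disparity(disparity_px, focal_px=600.0, baseline_m=0.075):
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    class DepthConversionTests(unittest.TestCase):
        def test_known_value(self):
            # 600 * 0.075 / 15 = 3.0 m
            self.assertAlmostEqual(depth_from_disparity(15.0), 3.0)

        def test_rejects_invalid_disparity(self):
            with self.assertRaises(ValueError):
                depth_from_disparity(0.0)

    if __name__ == "__main__":
        unittest.main()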

10. Application Development (Usage):


• Development of supporting applications such as training simulations, gaming, etc.
• Use of the technology in different sectors

11. That’s all I could think of, lol…

Sensor device
An approximately nine-inch (23 cm) wide horizontal bar connected to a small circular base with a ball
joint pivot, the Project Natal sensor is designed to be positioned lengthwise above or below the video
display. The device features an "RGB camera, depth sensor, multi-array microphone, and custom
processor running proprietary software", which provides full-body 3D motion capture, facial recognition,
and voice recognition capabilities. The Project Natal sensor's microphone array enables the Xbox 360 to
conduct acoustic source localization and ambient noise suppression, allowing for things such as headset-
free party chat over Xbox Live.
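
The acoustic source localization mentioned above is commonly done by estimating the time difference of arrival (TDOA) between microphone pairs. A minimal two-microphone sketch using cross-correlation; the sample rate, microphone spacing and synthetic signal are invented for illustration, not details of the actual device:

    # Hypothetical two-microphone TDOA estimate via cross-correlation.
    import numpy as np

    SAMPLE_RATE = 16000      # Hz (assumed)
    MIC_SPACING = 0.1        # metres between the two microphones (assumed)
    SPEED_OF_SOUND = 343.0   # m/s

    def estimate_bearing(sig_left, sig_right):
        """Bearing of the source in radians; positive means toward the right mic."""
        corr = np.correlate(sig_left, sig_right, mode="full")
        lag = np.argmax(corr) - (len(sig_right) - 1)   # samples: t_left - t_right
        delay = lag / SAMPLE_RATE                      # seconds
        # Far-field model: delay = spacing * sin(bearing) / speed_of_sound
        return np.arcsin(np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        burst = rng.standard_normal(800)               # synthetic noise burst
        # Source slightly to the right: the right mic hears it 3 samples sooner.
        left, right = burst[:-3], burst[3:]
        print(np.degrees(estimate_bearing(left, right)))  # roughly +3.7 degrees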

The depth sensor consists of an infrared projector combined with a monochrome CMOS sensor, and
allows the Project Natal sensor to see in 3D under any lighting conditions. Project Natal is reportedly
based on software technology developed internally by Microsoft (gesture recognition, skeletal mapping,
facial recognition, voice recognition) and hardware technology acquired from time-of-flight camera
developer 3DV Systems. Before agreeing to sell all its assets in March 2009, 3DV had been preparing a
similar device, known as the ZCam.

Interesting stuff.

This means that there is a custom processor doing all the hard work, which leaves the core free to get on
with the games-based stuff. With this in mind it may actually work as well as Microsoft and Lionhead say.
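
As a rough illustration of the time-of-flight camera technology mentioned in the description above, depth follows from the round-trip time of the emitted light, d = c·Δt/2. A minimal sketch with an invented timing value; real ToF cameras typically measure per-pixel phase shifts rather than a single pulse:

    # Time-of-flight depth principle: distance = speed_of_light * round_trip / 2.
    # The example round-trip time is an illustrative value, not a measured one.
    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def tof_distance(round_trip_seconds):
        """Distance to the reflecting surface for a measured round-trip time."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    if __name__ == "__main__":
        # A round trip of ~13.3 nanoseconds corresponds to roughly 2 metres.
        print(tof_distance(13.3e-9))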

An active-pixel sensor (APS), also commonly written active pixel sensor, is an image sensor consisting of
an integrated circuit containing an array of pixel sensors, each pixel containing a photodetector and an
active amplifier. There are many types of active pixel sensors including the CMOS APS used most
commonly in cell phone cameras, web cameras and in some DSLRs. Such an image sensor is produced by
a CMOS process (and is hence also known as a CMOS sensor), and has emerged as an alternative to
charge-coupled device (CCD) image sensors.
