Norbert Dentressangle UK - Northampton, United Kingdom Aug 2012 - Jan 2013
Inventory Control
EXL Service - Cochin Area, India 2010 - May 2012
Assistant Manager, EXL
The Fitness Edge - Fairfield, CT, US Aug 2008 - Feb 2010
Asst Controller/Accountant
GE Capital Jan 2001 - Feb 2003
Cash Accountant
Education:
M.G. University 1994 - 1998
B.Com, Accounting/Business
Languages:
English
Awards:
Team Extra Miler Award - GE Capital
GE Excellence Award - GE Capital
Director of Technical Services at National Air Carrier Association (NACA)
Location:
United States
Work:
National Air Carrier Association (NACA) since 2009
Director of Technical Services
MAXjet Airways - Dulles, Virginia 2007 - 2008
COO
Ventura County Medical Center Surgery 3291 Loma Vis Rd FL 2, Ventura, CA 93003 (805) 652-6237 (phone), (805) 652-6184 (fax)
Education:
Medical School Semmelweis Orvostudomanyi Egyetem, Budapest, Hungary Graduated: 2000
Languages:
English Spanish Tagalog
Description:
Dr. Paul graduated from the Semmelweis Orvostudomanyi Egyetem, Budapest, Hungary in 2000. He works in Ventura, CA and specializes in Anesthesiology. Dr. Paul is affiliated with Ventura County Medical Center.
Charles J. Cohen - Ann Arbor MI Glenn Beach - Ypsilanti MI Brook Cavell - Ypsilanti MI Gene Foulk - Ann Arbor MI Charles J. Jacobus - Ann Arbor MI Jay Obermark - Ann Arbor MI George Paul - Ypsilanti MI
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G06K 9/00
US Classification:
382/103, 382/209, 701/45, 345/473, 345/474
Abstract:
A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.
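The least-squares-plus-predictor-bin idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the patented method: the one-parameter oscillator model, the bin values, and the sampling rate are all assumptions chosen to keep the example self-contained.

```python
import math

def fit_gesture_parameter(xs):
    # Least-squares fit of theta in the linear-in-parameters model
    # x[t+1] - 2*x[t] + x[t-1] = theta * x[t]  (a discretized oscillator).
    num = den = 0.0
    for t in range(1, len(xs) - 1):
        accel = xs[t + 1] - 2 * xs[t] + xs[t - 1]
        num += accel * xs[t]
        den += xs[t] * xs[t]
    return num / den

def classify(xs, bins):
    # Pick the predictor bin whose stored parameter best predicts the
    # observed motion (smallest summed one-step prediction residual).
    best, best_err = None, float("inf")
    for name, theta in bins.items():
        err = 0.0
        for t in range(1, len(xs) - 1):
            predicted = 2 * xs[t] - xs[t - 1] + theta * xs[t]
            err += (xs[t + 1] - predicted) ** 2
        if err < best_err:
            best, best_err = name, err
    return best

# Two reference "gestures": slow and fast oscillations, theta = -(w*dt)^2.
bins = {"slow_wave": -(0.5 * 0.1) ** 2, "fast_wave": -(2.0 * 0.1) ** 2}

# Observed motion: a fast oscillation sampled at dt = 0.1.
observed = [math.sin(2.0 * 0.1 * t) for t in range(60)]
print(classify(observed, bins))  # → fast_wave
```

Because the model is linear in its parameter, the fit reduces to a closed-form ratio, which is what makes the real-time, low-memory recognition described in the abstract plausible.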
Gesture-Controlled Interfaces For Self-Service Machines And Other Applications
Charles J. Cohen - Ann Arbor MI, US Glenn Beach - Brooklyn MI, US Brook Cavell - Ann Arbor MI, US Gene Foulk - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US Jay Obermark - Ann Arbor MI, US George Paul - Bedford NH, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G06K 9/00
US Classification:
382/103, 382/276
Abstract:
A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.
Charles J. Cohen - Ann Arbor MI, US Glenn Beach - Ypsilanti MI, US Brook Cavell - Ypsilanti MI, US Gene Foulk - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US Jay Obermark - Ann Arbor MI, US George Paul - Belleville MI, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G09G 5/00
US Classification:
715/863
Abstract:
A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Where in the previous patent only one gesture was recognized at a time, in this system, multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Tracking And Gesture Recognition System Particularly Suited To Vehicular Control Applications
George V. Paul - Belleville MI, US Glenn J. Beach - Ypsilanti MI, US Charles J. Cohen - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G06K 9/00
US Classification:
382/104, 348/154, 701/45
Abstract:
A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle. The on-board sensor system would be used to track the driver or passenger, but when the algorithms produce a command for a desired response, that response (or just position and gesture information) could be transmitted via various methods (wireless, light, or otherwise) to other systems outside the vehicle to control devices located outside the vehicle.
Real-Time Head Tracking System For Computer Games And Other Applications
George V. Paul - Belleville MI, US Glenn J. Beach - Ypsilanti MI, US Charles J. Cohen - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
A63F 13/10
US Classification:
463/36
Abstract:
A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
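The "three measures combined by a weighting technique" step in this abstract can be illustrated with a small sketch. The toy likelihood maps, grid size, and weight values below are assumptions for demonstration; the patent does not specify them.

```python
def weighted_target_location(color_map, motion_map, shape_map,
                             weights=(0.5, 0.3, 0.2)):
    # Combine three per-pixel likelihood maps (same dimensions) into one
    # score map and return the most probable target location as (row, col).
    rows, cols = len(color_map), len(color_map[0])
    best, best_score = (0, 0), float("-inf")
    for r in range(rows):
        for c in range(cols):
            score = (weights[0] * color_map[r][c]
                     + weights[1] * motion_map[r][c]
                     + weights[2] * shape_map[r][c])
            if score > best_score:
                best, best_score = (r, c), score
    return best

# Toy 3x3 likelihood maps: color and motion cues both peak at cell (1, 2).
color  = [[0.1, 0.2, 0.1], [0.1, 0.3, 0.9], [0.0, 0.1, 0.2]]
motion = [[0.0, 0.1, 0.0], [0.2, 0.4, 0.8], [0.1, 0.0, 0.1]]
shape  = [[0.2, 0.2, 0.2], [0.2, 0.5, 0.3], [0.2, 0.2, 0.2]]
print(weighted_target_location(color, motion, shape))  # → (1, 2)
```

Scoring every cell of a fused likelihood map, rather than searching for the target anew each frame, is what lets this style of tracker run in real time on modest hardware.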
Gesture-Controlled Interfaces For Self-Service Machines And Other Applications
Charles J. Cohen - Ann Arbor MI, US Glenn Beach - Ypsilanti MI, US Brook Cavell - Ypsilanti MI, US Gene Foulk - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US Jay Obermark - Ann Arbor MI, US George Paul - Bedford NH, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G06K 9/00 G06K 9/36 G06F 3/033
US Classification:
382/103, 382/276, 715/863
Abstract:
A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.
Gesture-Controlled Interfaces For Self-Service Machines And Other Applications
Charles J. Cohen - Ann Arbor MI, US Glenn J. Beach - Grass Lake MI, US Brook Cavell - Ann Arbor MI, US Eugene Foulk - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US Jay Obermark - Ann Arbor MI, US George V. Paul - Belleville MI, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G06K 9/00 G06F 3/033
US Classification:
382/103, 715/863
Abstract:
A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.
George V. Paul - Belleville MI, US Glenn J. Beach - Brooklyn MI, US Charles J. Cohen - Ann Arbor MI, US Charles J. Jacobus - Ann Arbor MI, US
Assignee:
Cybernet Systems Corporation - Ann Arbor MI
International Classification:
G06K 9/00
US Classification:
382/103, 382/107, 382/164, 382/165, 382/209
Abstract:
A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
ISBN (Books and Publications)
Parallel Systems and Computation: Proceedings of the 1986 IBM Europe Institute--Seminar on Parallel Computing, Oberlech, Austria, August 11-15, 1986
Centerview Partners LLC's Alan Hartman as its financial adviser. The company worked with Morton Pierce, Chang-Do Gong, Robert Chung, William Dantzler, Henrik Patel, Daren Orzechowski, George Paul, Rebecca Farrington, counsel Ken Barr and associate Allison Dodd at White & Case LLP for legal advice.
Date: May 21, 2015
Category: Business
Source: Google
Four held in London raids over alleged terrorist plot
The Westbourne Grove arrest was carried out as a man left an Iranian restaurant, the Alounak. George Paul, 30, who saw the incident, said: "The man was shouting something like, 'Please don't break my arms, please don't hurt me.' He was cuffed up and pushed up against the front of the restaurant." The
Date: Oct 14, 2013
Category: World
Source: Google
Jim Lehrer to step down from daily broadcast at PBS 'NewsHour'
Ed Fouhy, left, executive producer for the Commission on the Presidential Debates, discusses final plans for a presidential debate with George Paul, center, of CBS News, and Jim Lehrer on the debate stage at Washington University in St. Louis. Lehrer was moderator of the debate.