Samsung Jul 2012 - May 2013
Creative Director
Microsoft Jul 2012 - May 2013
Principal Design Manager
Microsoft Apr 2008 - Jul 2012
Design Lead
Invivia Inc. Jan 2006 - Apr 2008
Designer
American Museum of Natural History Oct 2001 - Jul 2004
Technical Assistant
Education:
Massachusetts Institute of Technology 2004 - 2008
Dartmouth College 1997 - 2001
Bachelor of Arts, Computer Science and Studio Art
Skills:
User Interface Design, Design Strategy, User Experience, Interaction Design, User-Centered Design, User Experience Design, User Interface, Visual Design, Product Design, Concept Design, Information Architecture, Experience Design, Rapid Prototyping, Adobe Creative Suite, Graphic Design, Human-Computer Interaction, Design Thinking, Architecture, InDesign, User Research, Web Design, Concept Art, Mobile Design, ActionScript, CSS, Usability Testing
V. Kevin Russ - Bellevue WA, US John A. Snavely - Seattle WA, US Edwin R. Burtner - Everett WA, US Ian M. Sands - Seattle WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
H04M 1/00
US Classification:
4555561, 455566, 4555501, 345173
Abstract:
A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
John Snavely - Seattle WA, US Kevin Russ - Bellevue WA, US Ian Sands - Seattle WA, US Russ Burtner - Everett WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G01C 21/00
US Classification:
701400
Abstract:
Rule-based location sharing may be provided. A location determining device, such as a Global Positioning System (GPS) enabled device, may receive a request to share the location. A rule may be used to determine whether to share the location with the requestor. If the rule allows the location to be shared, the location may be sent to the requestor. The location may be relayed through a third party server, which may be operative to evaluate the rule before sharing the location with the requestor.
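The abstract above describes evaluating a rule before releasing a location to a requestor. A minimal sketch of one such rule check, in Python; the rule shape, names, and the allow-list-plus-time-window policy are all hypothetical illustrations, not the patent's actual rule format:

```python
from dataclasses import dataclass

@dataclass
class ShareRule:
    """Hypothetical sharing rule: allow listed requestors during a time window."""
    allowed_requestors: set
    start_hour: int  # inclusive, 24-hour clock
    end_hour: int    # exclusive

def may_share(rule, requestor, hour):
    """Return True if the rule permits sharing the location with this requestor now."""
    return requestor in rule.allowed_requestors and rule.start_hour <= hour < rule.end_hour

rule = ShareRule({"alice", "bob"}, 9, 17)
print(may_share(rule, "alice", 10))  # True: allowed requestor inside the window
print(may_share(rule, "carol", 10))  # False: requestor not on the allow list
```

In the arrangement the abstract mentions, this check could run on a third-party relay server, so the device never answers a requestor the rule would reject.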
V. Kevin Russ - Bellevue WA, US Ian M. Sands - Seattle WA, US John A. Snavely - Seattle WA, US Edwin Russ Burtner - Everett WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G06F 3/01 G06F 3/033 G06F 3/14
US Classification:
715863, 715702, 715864
Abstract:
A virtual inking device is created in response to a touch input device detecting a user's inking gesture. For example, when a user places one of their hands in a pen gesture (i.e., by connecting the index finger with the thumb while holding the other fingers near the palm), the user may perform inking operations. When the user changes the pen gesture to an erase gesture (i.e., making a fist), the virtual pen may become a virtual eraser. Other inking gestures may also be utilized.
V. Kevin Russ - Bellevue WA, US John A. Snavely - Seattle WA, US Edwin R. Burtner - Everett WA, US Ian M. Sands - Seattle WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G01C 21/36
US Classification:
701201
Abstract:
Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined and the at least one way-point may then be projected from the portable device.
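Calculating a way-point from a current and a destination location, as the abstract describes, typically starts from the bearing between the two points. A sketch using the standard initial great-circle bearing formula (the function name is an illustration; the patent does not specify this computation):

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0 = north, 90 = east)
    from point 1 to point 2, given coordinates in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

# Due east along the equator is a 90-degree bearing.
print(round(initial_bearing(0.0, 0.0, 0.0, 1.0)))  # 90
```

Combined with the device's measured orientation, such a bearing would tell the device which direction to project the way-point indicator.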
V. Kevin Russ - Bellevue WA, US John A. Snavely - Seattle WA, US Edwin R. Burtner - Everett WA, US Ian M. Sands - Seattle WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G06F 3/048
US Classification:
715784, 715810
Abstract:
A tear-drop way-finding user interface (UI) may be provided. A first UI portion corresponding to a device location may be provided. In addition, an object may be displayed at a first relative position within the first UI portion. Then, upon a detected change in device location, a second UI portion corresponding to the changed device location may be provided. In response to the changed device location, a second relative position of the object may be calculated. Next, a determination may be made as to whether the second relative position of the object is within a displayable range of the second UI portion. If the second relative position of the object is not within the displayable range of the second UI portion, then a tear-drop icon indicative of the second relative position of the object may be displayed at an edge of the second UI portion.
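The key step in the abstract above is deciding whether an object's new relative position is still displayable, and if not, pinning an indicator to the edge of the UI portion. A minimal sketch of that clamping logic, assuming a simple pixel viewport (the function and coordinate representation are hypothetical):

```python
def edge_indicator(px, py, width, height):
    """Clamp a point to a width x height viewport. Returns the drawable
    position plus a flag saying whether an edge icon is needed, i.e.
    whether the original point fell outside the displayable range."""
    inside = 0 <= px < width and 0 <= py < height
    cx = min(max(px, 0), width - 1)
    cy = min(max(py, 0), height - 1)
    return cx, cy, not inside

# Object far to the right of a 100x100 viewport: icon pinned to the right edge.
print(edge_indicator(250, 40, 100, 100))  # (99, 40, True)
```

The tear-drop icon would then be drawn at the clamped position, pointing toward the off-screen object.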
V. Kevin Russ - Bellevue WA, US John A. Snavely - Seattle WA, US Edwin R. Burtner - Everett WA, US Ian M. Sands - Seattle WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G06F 3/033
US Classification:
715863, 715848
Abstract:
User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.
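The three-gesture sequence in the abstract above amounts to a small state machine over UI representations. A sketch with entirely hypothetical gesture names, since the patent abstract does not name the gestures:

```python
# Hypothetical transitions mirroring the abstract's sequence:
# first gesture converts 2D -> 3D, second manipulates the 3D view,
# third converts back to 2D.
TRANSITIONS = {
    ("2D", "expand"): "3D",
    ("3D", "rotate"): "3D",
    ("3D", "collapse"): "2D",
}

def apply_gesture(state, gesture):
    """Return the next UI state; gestures with no defined transition are ignored."""
    return TRANSITIONS.get((state, gesture), state)

state = "2D"
for g in ["expand", "rotate", "collapse"]:
    state = apply_gesture(state, g)
print(state)  # back to "2D" after the full gesture sequence
```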
Kevin Russ - Bellevue WA, US John Snavely - Seattle WA, US Ian Sands - Seattle WA, US Russ Burtner - Everett WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G06F 3/048
US Classification:
715803
Abstract:
Embodiments of the present invention are directed toward facilitating multi-user input on large format displays. In situations where multiple users may want to work individually on separate content, or individually on the same content, embodiments of the present invention provide an interface allowing a user or users to segment a display in a way that creates isolated areas in which multiple users may manipulate content independently and concurrently.
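Segmenting a display into isolated work areas, as described above, can be sketched as splitting a rectangular region at a user-drawn line. The rectangle representation and split rule here are hypothetical, not the patent's mechanism:

```python
def split_region(region, x):
    """Split a (left, top, right, bottom) region vertically at x into
    two independent work areas."""
    left, top, right, bottom = region
    if not (left < x < right):
        raise ValueError("split line must fall inside the region")
    return (left, top, x, bottom), (x, top, right, bottom)

# A 200x100 display split at x=120 yields two side-by-side areas.
print(split_region((0, 0, 200, 100), 120))  # ((0, 0, 120, 100), (120, 0, 200, 100))
```

Each resulting region could then route its own touch input, letting users manipulate content concurrently without interfering with one another.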
Kevin Russ - Bellevue WA, US Ian Sands - Seattle WA, US Russ Burtner - Everett WA, US John Snavely - Seattle WA, US
Assignee:
Microsoft Corporation - Redmond WA
International Classification:
G06F 3/041
US Classification:
345173
Abstract:
Embodiments of the present invention provide a dual-sided multi-touch computing device that offers the advantages of a keyboard in addition to the conveniences of a slate device. The dual-sided multi-touch computing device may be utilized in two orientations: one side is a multi-touch slate device, and the alternate side is a multi-touch display keyboard. The device includes an orientation-recognition component, so that it may configure itself based on its orientation. The present invention may be utilized as a stand-alone personal computer or as a peripheral device in conjunction with other devices.