Eric J Undersander

age ~43

from San Francisco, CA

Also known as:
  • Eric John Undersander

Eric Undersander Phones & Addresses

  • San Francisco, CA
  • Miami, FL
  • Baltimore, MD
  • 912 22nd St, Austin, TX 78705 • (512) 457-1215
  • Saint Cloud, MN
  • Cockeysville, MD
  • Orlando, FL
  • Maitland, FL
  • Grapevine, TX
  • 912 W 22nd St, Austin, TX 78705 • (512) 633-4219

Work

  • Company:
    Facebook
    Apr 2020
  • Position:
    Research Engineer

Education

  • Degree:
    Bachelors, Bachelor of Science
  • School / High School:
    The University of Texas at Austin
    1999 to 2003
  • Specialties:
    Computer Science

Skills

Video Games • Game Development • C++ • Perforce • Gameplay • Game Design • Game Programming • C • Software Development • Object Oriented Programming • C#

Industries

Computer Software

Resumes

Research Engineer

Location:
1137 Webster St, San Francisco, CA 94115
Industry:
Computer Software
Work:
Facebook
Research Engineer

Cruise Automation
Staff Software Engineer

Baidu USA Jan 2017 - Jun 2018
Research Software Engineer

Stardock Jul 2014 - Dec 2016
Lead Developer

Big Huge Games Apr 2007 - Jun 2011
Senior Software Engineer
Education:
The University of Texas at Austin 1999 - 2003
Bachelors, Bachelor of Science, Computer Science
Skills:
Video Games
Game Development
C++
Perforce
Gameplay
Game Design
Game Programming
C
Software Development
Object Oriented Programming
C#

US Patents

  • Systems And Methods For Block-Sparse Recurrent Neural Networks

  • US Patent:
    20190130271, May 2, 2019
  • Filed:
    Oct 4, 2018
  • Appl. No.:
    16/151886
  • Inventors:
    - Sunnyvale CA, US
    Eric Undersander - San Francisco CA, US
    Gregory Diamos - San Jose CA, US
  • Assignee:
    Baidu USA LLC - Sunnyvale CA
  • International Classification:
    G06N 3/08
    G06N 3/04
  • Abstract:
    Described herein are systems and methods for pruning deep neural network models to reduce their overall memory and compute requirements. It is demonstrated that, by combining block pruning and group lasso regularization with pruning during training, block-sparse recurrent neural networks (RNNs) can be built that are as accurate as dense baseline models. Two approaches are disclosed for inducing block sparsity in neural network models: pruning blocks of weights in a layer, and using group lasso regularization to create blocks of weights that are zero. Using these techniques, it is demonstrated that block-sparse RNNs can be created with high sparsity and only a small loss in accuracy. Block-sparse RNNs eliminate the data-storage overheads and irregular memory accesses associated with unstructured sparsity while increasing hardware efficiency.
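
The abstract above names two ways to induce block sparsity: pruning whole blocks of weights and applying group lasso regularization. The Python/NumPy sketch below illustrates only the block-pruning idea under assumed parameters (a 4x4 block size and a fixed magnitude threshold, both hypothetical); the patent describes pruning gradually during training rather than in a single pass, so this is a minimal sketch, not the patented method.

    # Minimal illustrative sketch of block pruning (not the patent's implementation):
    # group weights into fixed-size blocks and zero any block whose representative
    # magnitude falls below a threshold.
    import numpy as np

    def block_prune(weights, block_shape=(4, 4), threshold=0.1):
        """Zero out blocks whose maximum absolute weight is below `threshold`."""
        rows, cols = weights.shape
        br, bc = block_shape
        pruned = weights.copy()
        for r in range(0, rows, br):
            for c in range(0, cols, bc):
                block = pruned[r:r + br, c:c + bc]
                # Use the block's maximum magnitude as its representative weight.
                if np.max(np.abs(block)) < threshold:
                    block[...] = 0.0  # prune the whole block at once
        return pruned

    # Example: prune a random recurrent weight matrix and report its sparsity.
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.05, size=(64, 64))
    W_sparse = block_prune(W)
    print("fraction of zero weights:", np.mean(W_sparse == 0.0))

Zeroing contiguous blocks rather than scattered individual weights keeps the surviving nonzeros in dense tiles, which is what lets block-sparse RNNs avoid the irregular memory accesses the abstract attributes to unstructured sparsity.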

YouTube

Wide Swathing and Low-Lignin Alfalfa - Dr. Da...

As the dairy industry continues to advance in terms of milk production...

  • Duration:
    1m 39s

Live From Emmet's Place Vol. 64 - Eric Alexan...

Streaming from: Facebook.com/hey... Subscribe to Newsletter: Join E...

  • Duration:
    1h 5m 27s

Eric Alexander: Touching (full album)

Eric Alexander: Touching (2013) Label: High Note Orig Year: 2013 Relea...

  • Duration:
    51m 1s

Rob Undersander and Minnesota Senator Jeff Ho...

Minnesota millionaire Rob Undersander and Minnesota State Senator Jeff...

  • Duration:
    4m 47s

9. Faster RL Training: Profiling and Optimiza...

Presenter: Eric Undersander Why focus on performance optimization for ...

  • Duration:
    17m 29s

Maryland guards Eric Ayala, Xavier Green revi...

ahmedghafir.Subs...

  • Duration:
    3m 26s
