Research Scientist, Machine Perception (PhD)
Posted 309 days ago
Job Description
This job posting has expired and is no longer accepting applications.
At Meta Reality Labs, we’re developing the input technologies that will define the future of virtual and augmented reality computing platforms. We are looking for a skilled and motivated research scientist to join our machine perception team in advancing the state of the art in human understanding, action recognition, and statistical text decoding.
In this role, you will work with research scientists, software engineers and hardware engineers to design technical solutions in a fast-paced multidisciplinary environment.
- Work with researchers to develop input architectures that combine vision and language.
- Design and prototype interactive experiences and evaluate them through user research.
- Design and prototype data collection systems to accelerate machine learning.
- Collaborate and work across teams to develop concepts that advance the entire product pipeline (hardware, software, data collection, machine learning, etc.).
Minimum Qualifications
- Currently has, or is in the process of obtaining, a PhD degree in Computer Science, Computer Engineering, or a related field.
- Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience. Degree must be completed prior to joining Meta.
- Experience in developing solutions for human-computer interaction, computer vision, natural language processing, or computer graphics.
- Experience working with PyTorch or TensorFlow.
- Experience designing data collection protocols and quality control.
- Ability to innovate on or extend research ideas for new problem domains or to fit new constraints.
- Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment.
- Interpersonal skills: experience with cross-group collaboration.
Preferred Qualifications
- Research experience in end-to-end speech recognition or real-time statistical text decoding.
- Experience transferring technology from research in computer vision, machine perception or machine learning into a shipping product.
- Experience with joint hardware-software development and associated rapid prototyping.
- Proven track record of achieving significant results, as demonstrated by grants, fellowships, patents, or first-authored publications at top-tier conferences such as CVPR, ECCV/ICCV, SIGGRAPH, NeurIPS, ICLR, ACL, HCI, UIST, or similar.
- Demonstrated software engineering experience via an internship, work experience, coding competitions, or widely used contributions to open-source repositories (e.g., GitHub).
About Meta
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics.
Equal Employment Opportunity
Meta is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. You may view our Equal Employment Opportunity notice here.
Meta is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, fill out the Accommodations request form.