Career Profile
My name is James Yang, and I am a Data Scientist and Electrical & Computer Engineer. I earned my Master's degree in Data Science at the University of Washington - Seattle, where I am still based today. Before that, I completed my undergraduate degree in Electrical & Computer Engineering with a minor in Computer Science at Oregon State University. Resume here!
Here, you can follow my journey from hardware engineering to software development and now data science! I enjoy learning new things and understanding why they work. But don't worry, I also like stepping away from the computer: I play basketball (and can dunk on a very, very optimistic day), I'm a huge Seahawks fan, and I love making music (piano, guitar, etc.).
If you want to reach out, please email me or connect on LinkedIn!
Experiences
- Creating and driving 3+ production analysis tools that improve Intel die testing and customer processor performance, increasing yield by an estimated 5-10% through supervised machine learning on wafer data.
- Automating both ML and web pipelines for internal analysis applications using Bash and Python alongside SQL/Redis. Securing sensitive Intel data with Microsoft Azure SSO.
- Saving $50,000 through data modeling and analysis that improved overall die testing and reliability.
- Managing 5+ applications hosted in Kubernetes on Rancher.
- Leading a team of 3 in developing predictive models to optimize Intel chip processes, resulting in a 10% reduction in materialized defects.
- Identifying material defects with unsupervised modeling, Pandas, and SQL, training on 30+ features to produce an interpretable prediction of processor strength and saving 10% of the time spent on processor improvement.
- Modeling processor self-repair with K-means clustering, z-score models, and regression models to classify, detect, and verify outlier repair rates, potentially saving millions on mass-produced wafers (a toy sketch of this approach follows this list).
- Presenting a Tableau-based project to the org's director and VP to build interest in and introduce a new workflow, increasing data-pulling efficiency across the org by 20%.
- Collaborating with cross-functional teams on root-cause analysis for the latest Intel 20A/18A processors. Building a framework that efficiently pipelines sensitive Intel data to a front-end user interface.
- Conducting code reviews and developing in scrum/agile environments to drive solutions.
- Building internal tooling features under a project lead to meet sprint goals for improved data analysis, delivering 3+ new standardized processor-performance metrics for engineers to target.
- Developing Windows full-stack applications using C#, SQL, HTML, and REST APIs. Testing APIs with Postman.
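
For a flavor of the outlier-detection work mentioned above, here is a minimal sketch, not actual Intel code: it clusters hypothetical per-wafer repair rates with K-means and then flags within-cluster outliers by z-score. All data, column names, and thresholds are made up for illustration.

```python
# Minimal sketch (hypothetical data): K-means clusters of wafer repair behavior,
# then z-score outlier flagging within each cluster.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Fake wafer-level features: repair rate plus a test-coverage proxy.
df = pd.DataFrame({
    "wafer_id": np.arange(300),
    "repair_rate": np.concatenate([rng.normal(0.02, 0.005, 150),
                                   rng.normal(0.08, 0.010, 150)]),
    "test_coverage": rng.uniform(0.8, 1.0, 300),
})

# Group wafers into repair-behavior clusters.
features = df[["repair_rate", "test_coverage"]].to_numpy()
df["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Within each cluster, a large |z-score| on repair_rate marks a wafer whose
# repair behavior deviates from its peers.
df["z"] = df.groupby("cluster")["repair_rate"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=0)
)
outliers = df[df["z"].abs() > 3]
print(outliers[["wafer_id", "cluster", "repair_rate", "z"]])
```

In practice the real features, cluster counts, and cutoffs would come from the production data and review with process engineers; this is only meant to show the shape of the approach.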