Micron Intern, Data Engineer


Job Description:

Broad knowledge and experience in:

  • Big data engineering/processing, business intelligence, and advanced analytics
  • ETL/ELT design and processes
  • Databases and data warehouse modeling
  • Cloud-based data engineering and machine learning models
  • Building APIs for application integration
  • Software development life cycle processes, e.g. Agile and Waterfall
  • Establishing procedures and determining the transformations needed to move both structured and unstructured data from the source to a new physical data model
  • Working with data scientists to implement strategies for cleaning and preparing data for analysis, develop data imputation algorithms, and optimize the performance of big data and machine learning systems

Above-average skills in:

  • Big data engineering and processing using the Google Cloud tech stack (BigQuery, Bigtable, Cloud Functions, Spark, Cloud Storage, etc.)
  • Cloud data engineering on GCP/AWS/Azure is a plus
  • Conceptual knowledge of ETL/ELT design and processes (Apache NiFi and Snowflake are a plus)
  • Algorithms and data structures (understanding appropriate usage of data structures and how they are applied in common algorithms)
  • Cloud tools: object storage (GCS), Kubernetes (GKE), databases (BigQuery)
  • Solid understanding of databases (Oracle, MSSQL, etc.), both SQL and NoSQL
  • Programming in Python, Java, etc.
  • Effective communication, with analytical, logical, and problem-solving skills

Demonstrated ability to:

  • Work in a dynamic, fast-paced environment
  • Be self-motivated and work under minimal supervision
  • Adapt to new technologies and learn quickly
  • Bring a strong passion for data and information, along with strong analytical, problem-solving, and organizational skills
  • Work in multi-functional groups with diverse interests and requirements to achieve a common goal
  • Communicate well with distributed teams (written, verbal, and presentation)



