Data Engineer, Operations and Infrastructure Data Science
About this role
Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related field, or equivalent practical experience.
- 1 year of experience with data processing software (e.g., Hadoop, Spark, Pig, Hive) and algorithms (e.g., MapReduce, Flume).
- Experience with database administration techniques or data engineering, as well as writing software in Java, C++, Python, Go, or JavaScript.
- Experience managing client-facing projects, troubleshooting technical issues, and working with engineering, sales, and services teams.
Preferred qualifications:
- Experience in technical consulting.
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Experience working with Big Data, information retrieval, data mining, or machine learning.
- Experience in building multi-tier high availability applications with modern web technologies (e.g., NoSQL, MongoDB, SparkML, TensorFlow).
- Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments.
About the job
The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google’s global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.
As a Data Engineer, you will guide customers on how to ingest, store, process, analyze, explore, and visualize data on Google Cloud Platform. You will lead data migrations and transformations, partner with clients to architect scalable data processing systems, build efficient data pipelines, and resolve platform challenges. In this role, you will collaborate with Google's strategic cloud customers and our team to implement Google Cloud products.
Responsibilities
- Design and manage the data pipelines that power Vertex AI operations, ensuring they are scalable and reliable.
- Build and refine automated agents to monitor system health and streamline operational workflows.
- Create intuitive dashboards that provide clear visibility into Vertex AI efficiency metrics for both technical and finance stakeholders.
- Partner closely with data science and finance teams to translate complex efficiency goals into actionable data models and tracking systems.
- Work with finance clients to ensure technical metrics align with budgetary and resource-planning requirements.