Data Engineer<p>The Data Engineer role requires robust data engineering skills to manage data flows efficiently from operational systems into a lakehouse architecture. The ideal candidate understands the theory and mechanics of data transformation, ensuring data gains value as it moves through the bronze, silver, and gold stages. Ultimately, the data is structured into a star schema for ad-hoc analysis and reporting in Power BI. The role leverages Microsoft Fabric, including Azure Data Factory, Spark, SQL, and Power BI, and requires familiarity with high-level object-oriented programming (OOP).</p><p>ABOUT CARDONE VENTURES</p><p>Our mission is to help business owners achieve their personal, professional, and financial goals through the growth of their businesses. We work in dozens of verticals and provide strategic business guidance through courses, live events, partnerships, and investments. Our core values are the backbone of our business and guide our hiring process: we are inspirational, disciplined, accountable, transparent, aligned, and results-oriented. The company operates nationally and is growing by the day.</p><p>OBJECTIVES</p><p>● Collaborate with other team members on scoping solutions and project decision points.</p><p>● Design and implement data products (pipelines, reports, visualizations) that add value.</p><p>● Interact with developers, business teams, and other stakeholders to determine requirements.</p><p>● Further the buildout of our internal data lakehouse, ultimately providing better data-analysis platforms to internal teams.</p><p>● Forecast and report on transformed data to provide actionable insights.</p><p>● Design ETL flows independently, ensuring data quality and efficiency.</p><p>● Implement processes to transition data from bronze (raw) to silver (cleaned) to gold (optimized) stages, enhancing data quality and value.
</p><p>● Create and manage star schema data models to support efficient ad-hoc analysis and reporting by end users.</p>Data Engineer<br><br>We are offering a contract-to-permanent employment opportunity for a Data Engineer in Scottsdale, Arizona. The primary function of this role is to develop and oversee robust systems for data transformation, and to organize and increase the value of data as it flows through successive stages using Microsoft Fabric tools.<br><br>Responsibilities:<br><br>• Collaborate with team members to establish project objectives and solutions.<br>• Develop data solutions, including pipelines, reports, and visualizations, that deliver substantial value.<br>• Communicate with developers, business units, and stakeholders to gather and analyze requirements.<br>• Improve the internal data lakehouse to provide better data-analysis platforms for internal teams.<br>• Use data transformation techniques to provide meaningful, actionable insights for forecasting and reporting.<br>• Independently design efficient ETL processes with an emphasis on data quality.<br>• Implement workflows that refine data from the raw (bronze) stage to the cleaned (silver) and ultimately the optimized (gold) stage.<br>• Design and maintain star schema models to facilitate fast ad-hoc reporting and analysis for end users.<br>• Apply object-oriented programming principles in languages such as Python, C#, or Java to optimize data processes.<br>• Use Azure Data Factory for workflow orchestration, Spark for large-scale data processing, and SQL for database operations.<br>• Manage structured and unstructured data, improving standard practices and exploring innovative solutions.<br>• Monitor and improve data performance, ensuring scalability and efficiency when handling large datasets.
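For candidates unfamiliar with the medallion (bronze → silver → gold) pattern both listings describe, here is a minimal sketch of the idea. In Microsoft Fabric this would normally run as a Spark notebook writing Delta tables; plain Python structures are used here so the example is self-contained, and the field names and cleaning rules are illustrative assumptions, not taken from the posting.

```python
# Sketch of a medallion flow: raw bronze records are validated and
# conformed into silver, then aggregated into a reporting-ready gold shape.
from collections import defaultdict

# Bronze: raw records exactly as they arrive from an operational system.
bronze = [
    {"order_id": "1", "amount": "19.99", "region": " west "},
    {"order_id": "2", "amount": "bad", "region": "East"},  # malformed amount
    {"order_id": "3", "amount": "5.00", "region": "west"},
]

def to_silver(rows):
    """Clean and conform: drop malformed rows, normalize types and casing."""
    silver = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip (or quarantine) rows that fail validation
        silver.append({
            "order_id": int(row["order_id"]),
            "amount": amount,
            "region": row["region"].strip().lower(),
        })
    return silver

def to_gold(rows):
    """Aggregate to a reporting-ready shape: total revenue per region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(round(gold["west"], 2))  # 24.99
```

Each hop deliberately narrows and enriches the data, which is what the posting means by data "gaining value" between stages.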
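The star schema the gold stage feeds pairs one central fact table with surrounding dimension tables, so ad-hoc questions become simple joins and aggregations. The sketch below uses SQLite (Python's standard library) in place of a Fabric warehouse; the table and column names are illustrative assumptions.

```python
# Minimal star schema: a fact table joined to two dimension tables.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    revenue REAL
);

INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Course', 'Education');
INSERT INTO fact_sales VALUES
    (20240101, 1, 3, 30.0),
    (20240101, 2, 1, 99.0),
    (20240201, 1, 2, 20.0);
""")

# Typical ad-hoc question: revenue by category and month, via dimension joins.
rows = cur.execute("""
    SELECT p.category, d.month, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category, d.month
    ORDER BY p.category, d.month
""").fetchall()

for row in rows:
    print(row)
```

This shape is what makes the model efficient for the Power BI reporting the role targets: facts stay narrow and numeric, while descriptive attributes live in the dimensions.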
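On the OOP requirement, one common way such principles show up in data work is modeling pipeline stages as small composable classes, so new transformations can be added without touching existing callers. This is only an illustrative sketch; the class and method names are assumptions, not an established framework.

```python
# Illustrative OOP sketch: ETL stages as composable classes.
from abc import ABC, abstractmethod

class Step(ABC):
    """A single transformation stage in an ETL flow."""
    @abstractmethod
    def run(self, rows: list[dict]) -> list[dict]: ...

class DropNulls(Step):
    """Remove rows where the given field is missing or None."""
    def __init__(self, field: str):
        self.field = field
    def run(self, rows):
        return [r for r in rows if r.get(self.field) is not None]

class Rename(Step):
    """Rename one field on every row."""
    def __init__(self, old: str, new: str):
        self.old, self.new = old, new
    def run(self, rows):
        return [{**{k: v for k, v in r.items() if k != self.old},
                 self.new: r[self.old]} for r in rows]

class Pipeline:
    """Chains steps; extending the flow means adding a class, not editing one."""
    def __init__(self, steps: list[Step]):
        self.steps = steps
    def run(self, rows):
        for step in self.steps:
            rows = step.run(rows)
        return rows

pipeline = Pipeline([DropNulls("amt"), Rename("amt", "amount")])
result = pipeline.run([{"amt": 10}, {"amt": None}, {"amt": 7}])
print(result)  # [{'amount': 10}, {'amount': 7}]
```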