This job posting is no longer active.
Category: Global Technology Services
Position Type: Regular Full-Time
External ID: 9697
Location: Winston-Salem, NC, United States
Date Posted: Oct 23, 2023
Hiring Range: 129,269.28 to 145,330.08 USD Annually
POSITION SUMMARY STATEMENT:
The Sr. Informatica Developer / Data Engineer (with Snowflake) is responsible for data modeling, design, optimization, security, and administration of ETL/ELT tools and data engineering interfaces. The role will build data interfaces with a focus on data accuracy, completeness, and availability. The role will analyze source and target systems, then design, build, and test solutions that make data available to business intelligence systems, websites, mobile apps, and other applications.
HOW YOU WOULD CONTRIBUTE:
• Design, build, test, and migrate ETL/ELT systems and data interfaces.
• Analyze and design source and target systems data architectures
• Design and implement data tables, functions, views, procedures, and routines to deliver accurate and complete data to BI and data applications.
• Find opportunities for technical innovation that contribute to the platform.
• Assist in establishing and embedding data management and governance processes.
• Enhance performance in the Applications environment.
• Meet service level agreements for production support response and resolution.
• Research, design, and develop technical solutions to pre-defined requirements, and develop components including extensions, views, customizations, modifications, reports, and workflows, independently or as part of a team.
• Follow documentation, software development methodology, version control and testing, and migration standards.
• Develop and improve the current data architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility.
• Provide technical guidance and mentoring to others in areas of expertise.
• Deploy and use various technologies and run pilots to design low-latency data architectures at scale.
• Collaborate with BI teams, business analysts, product managers, and application teams to provide data for BI, web, and mobile applications.
• Collaborate with business analysts, data scientists, product managers, and BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, and clustering.
• Develop a cooperative environment that fosters knowledge sharing.
SKILLS AND BACKGROUND REQUIRED:
• Expert proficiency in building data pipelines and ETL/ELT using tools such as Informatica, ODI, and Azure Data Factory.
• Experience in administration and migration activities of data movement and transformation tools.
• Expert knowledge of advanced SQL, stored procedures, views, functions, and indexes.
• Expert knowledge of Entity Relationship Diagrams.
• Expert knowledge of dimensional modeling techniques (slowly changing dimension types 1-4), dimensions, facts, and aggregations.
• Proficiency in relational and non-relational databases, including OLTP systems, MPP appliances, NoSQL stores, DaaS, and cloud databases.
• Proficiency in data movement techniques such as replication, switching, and pipelines.
• Proficiency in data integrity, profiling, storage, and mining techniques.
• Proficiency in change data capture (CDC) techniques, including timestamp-, log-, and trigger-based mechanisms.
• Proficiency in working with large and small data sets of all types: structured, semi-structured, and unstructured.
• Proficiency in working with scheduling tools.
• Proficiency in working with organizational change tools and processes, including source control, versioning, defect tracking, and release management.
• Proficiency in analyzing the impact of both small- and large-scale initiatives.
• Good working knowledge of Unix/Linux/PowerShell scripting.
• Ability to manage multiple priorities.
• Excellent written and verbal communication skills.
• 5+ years' experience working with data in data mart, warehousing, and OLTP environments.
• 5+ years' experience working with ETL/ELT and data movement tools.
• Hands-on development experience with Snowflake features such as SnowSQL, Snowpipe, Time Travel, Zero-Copy Cloning, the query optimizer, metadata management, and stored procedures is a plus.
• Exposure to Python, Snowpark, Snowflake Tasks, Streams, and data sharing is a plus.
• Bachelor's degree in information technology, computer science, or a related field.
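As an illustration of the timestamp-based change data capture mechanism referenced above, the following minimal sketch shows incremental extraction using a watermark. All table and column names (`orders`, `updated_at`) are hypothetical, and SQLite stands in for a source system:

```python
import sqlite3

# Hypothetical source table with a modification timestamp maintained by the
# source system; timestamp-based CDC filters on this column.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        amount     REAL,
        updated_at TEXT  -- ISO-8601 timestamp set on insert/update
    )
""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        (1, 10.0, "2023-10-01T09:00:00"),
        (2, 25.5, "2023-10-02T14:30:00"),
        (3, 40.0, "2023-10-03T08:15:00"),
    ],
)

def extract_changes(conn, watermark):
    """Return rows modified after the last successful load (the watermark),
    plus the new watermark to persist for the next incremental run."""
    rows = conn.execute(
        "SELECT order_id, amount, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # Advance the watermark to the latest timestamp seen in this batch.
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

# Only rows changed after the previous load's watermark are extracted.
changes, wm = extract_changes(conn, "2023-10-01T12:00:00")
```

Log- and trigger-based CDC follow the same incremental pattern but read from a transaction log or a trigger-populated audit table instead of a timestamp column.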
At Herbalife, we value doing what’s right. We are proud to be an equal opportunity employer, making decisions without regard to race, color, religion, sex, sexual orientation, gender identity, marital status, national origin, age, veteran status, disability, or any other protected characteristic. We value diversity, strive for inclusivity, and believe the differences among our teammates are a key contributor to Herbalife’s ongoing success.
Herbalife offers a variety of benefits to eligible employees in the U.S. (limited to the 50 States and the District of Columbia), which includes Group Health Programs, other Voluntary Benefit Programs, and Paid Time Off. Group Health Programs include Medical, Dental, Vision, Health Savings Account (HSA), Flexible Spending Accounts (FSA), Basic Life/AD&D; Short-Term and Long-Term Disability and an Employee Assistance Program (EAP).
Other Voluntary Benefit Programs include a 401(k) plan, Wellness Incentive Program, Employee Stock Purchase Plan (ESPP), Supplemental Life/Critical Illness/Hospitalization/Accident Insurance, and Pet Insurance. Paid time off includes Company-observed U.S. Holidays, Floating Holidays, Vacation, Sick Time, a Volunteer Program, Paid Maternity and Paternity Leave, Bereavement Leave, Personal Leave and time off for voting.
If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please email your request to [email protected].