OnlyDataJobs.com

NESC Staffing Corp.
  • Houston, TX

HOURS OF WORK - 8 hours minimum with flexible start before 9:00 AM

Description


NESC is seeking an experienced Data Modeler to assist in building and supporting RigSystems & Aftermarket's Data Warehouse. This resource will be responsible for separating different types of data into structures that can be easily processed by various systems. This resource will also focus on a variety of issues, such as enhancing data migration from one system to another and eliminating data redundancy. Duties and responsibilities include:

Understand and translate business needs into data models supporting long-term solutions.

Work with the Application Development team to implement data strategies, build data flows and develop conceptual data models.

Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.

Optimize and update logical and physical data models to support new and existing projects.

Maintain conceptual, logical and physical data models along with corresponding metadata.

Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.

Recommend opportunities for reuse of data models in new environments.

Perform reverse engineering of physical data models from databases and SQL scripts.

Evaluate data models and physical databases for variances and discrepancies.

Validate business data objects for accuracy and completeness.

Analyze data-related system integration challenges and propose appropriate solutions.

Develop data models according to company standards.

Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces.

Review modifications to existing software to improve efficiency and performance.

Examine new application design and recommend corrections if required.

Data modeling using ERWin tool (Work Group version)

Enterprise Data Warehouse modeling skill

Business Analysis Skill

Oracle Database Skill
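One of the duties above, reverse engineering physical data models from databases and SQL scripts, can be sketched in a few lines. This is a minimal illustration only: SQLite's catalog stands in for the Oracle/ERwin stack the posting actually names, and the `rig` table is a made-up example.

```python
import sqlite3

def reverse_engineer(conn):
    """Recover a simple physical data model (table -> columns) from a live database."""
    model = {}
    for (table,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ):
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        model[table] = [
            {"column": c[1], "type": c[2], "pk": bool(c[5])}
            for c in conn.execute(f"PRAGMA table_info({table})")
        ]
    return model

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rig (rig_id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
print(reverse_engineer(conn))
```

A real modeling tool would also walk foreign keys and indexes; the catalog-query approach is the same idea.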


EXPERIENCE/SKILLS: 


3+ years' experience as a Data Modeler/Data Architect

Proficient in the use of data modeling tools; Erwin proficiency is a must.

Experience in metadata management and data integration engines such as BizTalk or Informatica

Experience in supporting as well as implementing Oracle and SQL data infrastructures

Knowledge of the entire process behind software development, including design and deployment (SOA knowledge and experience is a bonus)

Expert analytical and problem-solving traits

Knowledge of the design, development and maintenance of various data models and their components

Understand BI tools and technologies as well as the optimization of underlying databases

Education:


BS in Computer Science or IT



Apps IT America
  • Houston, TX

Data Modeler

EXPERIENCE & SKILL SET:
Data modeling using ERWin tool (Work Group version)
Enterprise Data Warehouse modeling skill
Business Analysis Skill
Oracle Database Skill


Description 


My Client is seeking an experienced Data Modeler to assist in building and supporting RigSystems & Aftermarket's Data Warehouse. This resource will be responsible for separating different types of data into structures that can be easily processed by various systems. This resource will also focus on a variety of issues, such as enhancing data migration from one system to another and eliminating data redundancy. Duties and responsibilities include:

Understand and translate business needs into data models supporting long-term solutions.
Work with the Application Development team to implement data strategies, build data flows and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
Optimize and update logical and physical data models to support new and existing projects.
Maintain conceptual, logical and physical data models along with corresponding metadata.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Recommend opportunities for reuse of data models in new environments.
Perform reverse engineering of physical data models from databases and SQL scripts.
Evaluate data models and physical databases for variances and discrepancies.
Validate business data objects for accuracy and completeness.
Analyze data-related system integration challenges and propose appropriate solutions.
Develop data models according to company standards.
Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces.
Review modifications to existing software to improve efficiency and performance.
Examine new application design and recommend corrections if required.

Required Skills

3+ years' experience as a Data Modeler/Data Architect
Proficient in the use of data modeling tools; Erwin proficiency is a must.
Experience in metadata management and data integration engines such as BizTalk or Informatica
Experience in supporting as well as implementing Oracle and SQL data infrastructures
Knowledge of the entire process behind software development, including design and deployment (SOA knowledge and experience is a bonus)
Expert analytical and problem-solving traits
Knowledge of the design, development and maintenance of various data models and their components
Understand BI tools and technologies as well as the optimization of underlying databases

Cloudreach
  • Dallas, TX

Big dreams often start small. From an idea in a London pub, we have grown into a global cloud enabler that operates across 7 countries and speaks over 30 languages.


Our purpose at Cloudreach is to enable innovation. We do this by helping enterprise customers adopt and harness the power of cloud computing. We believe that the growth of a great business can only be fuelled by great people, so join us in our partnership with AWS, Microsoft and Google and help us build one of the most disruptive companies in the cloud industry. It's not your average job, because Cloudreach is not your average company.


What does the Cloud Enablement team do?

Our Cloud Enablement team helps provide consultative, architectural, program and engineering support for our customers' journeys to the cloud. The word 'Enablement' was chosen carefully, to encompass the idea that we support and encourage a collaborative approach to Cloud adoption, sharing best practices, helping change the culture of teams and strategic support to ensure success.


How will you spend your days?

    • Build technical solutions required for optimal ingestion, transformation, and loading of data from a wide variety of data sources using open source, AWS, Azure or GCP big data frameworks and services.
    • Work with the product and software team to provide feedback surrounding data-related technical issues and support for data infrastructure needs uncovered during customer engagements / testing.
    • Understand and formulate processing pipelines of large, complex data sets that meet functional / non-functional business requirements.
    • Create and maintain optimal data pipeline architecture
    • Working alongside the Cloud Architect and Cloud Enablement Manager to implement Data Engineering solutions
    • Collaborating with the customer's data scientists and data stewards/governors during workshop sessions to uncover more detailed business requirements related to data engineering
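The ingestion, transformation, and loading work described above can be sketched as a tiny pipeline. Everything here is hypothetical — the records, field names, and cleaning rules are invented for illustration, and a real engagement would use AWS, Azure, or GCP managed services rather than plain Python.

```python
def extract(source):
    """Pull raw records from an upstream source (here, a plain list)."""
    return list(source)

def transform(records):
    """Normalize field names and types; drop rows missing a primary key."""
    cleaned = []
    for r in records:
        if r.get("id") is None:
            continue  # reject unkeyed rows rather than loading bad data
        cleaned.append({"id": int(r["id"]), "name": r.get("name", "").strip().title()})
    return cleaned

def load(records, target):
    """Append transformed records to the target store (here, a list)."""
    target.extend(records)
    return len(records)

raw = [{"id": "1", "name": " ada lovelace "}, {"id": None, "name": "orphan"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 [{'id': 1, 'name': 'Ada Lovelace'}]
```

The same extract/transform/load shape maps directly onto the cloud frameworks the posting lists; only the storage and orchestration layers change.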


What kind of career progression can you expect?

    • You can grow into a Cloud Data Engineer Lead or a Cloud Data Architect
    • There are opportunities for relocation to our other cloudy hubs

How to stand out?

    • Experience in building scalable end-to-end data ingestion and processing solutions
    • Good understanding of data infrastructure and distributed computing principles
    • Proficient at implementing data processing workflows using Hadoop and frameworks such as Spark and Flink
    • Good understanding of data governance and how regulations can impact data storage and processing solutions such as GDPR and PCI
    • Ability to identify and select the right tools for a given problem, such as knowing when to use a relational or non-relational database
    • Working knowledge of non-relational and row/columnar based relational databases
    • Experience with Machine Learning toolkits
    • Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
    • This position requires travel up to 70% (M-F) any given week, with an average of 50% per year


What are our cloudy perks?

      • A MacBook Pro and iPhone or Google Pixel (your pick!)
      • Unique cloudy culture -- we work hard and play hard
      • Uncapped holidays and your birthday off
      • World-class training and career development opportunities through our own Cloudy University
      • Centrally-located offices
      • Fully stocked kitchen and team cloudy lunches, taking over a restaurant or two every Friday
      • Office amenities like pool tables and Xbox on the big screen TV
      • Working with a collaborative, social team, and growing your skills faster than you will anywhere else
      • Quarterly kick-off events with the opportunity to travel abroad
      • Full benefits and 401k match


    If you want to learn more, check us out on Glassdoor. Not if. When will you join Cloudreach?

Beyond Finance, Inc.
  • Houston, TX

About the Role

We are seeking an experienced Data Architect who has demonstrated experience designing & implementing the full life-cycle of both EDW and ODS systems. The Data Architect will be passionate about data-driven information which will guide and influence key business decisions. This role will partner with internal clients to understand business inquiries, drive requirements, propose solutions and deliver multiple cross-functional projects and integrations between different systems.

What You'll Do

  • Drive the discovery, design and implementation of our EDW and ODS systems

  • Work closely with marketing, sales, technical, and business operations stakeholders

  • Serve as the technical expert responsible for the architecture, design and implementation of BI/DW solutions, with complete and accurate information/data delivery using maintainable, systematic, and automated procedures.

  • Make recommendations about data collection methods, data management, data definitions, and evaluation methods in collaboration with internal stakeholders

  • Help establish and define validation, data cleansing, integration, and transformation practices

  • Facilitate reporting and analytics design review sessions

  • Stay current with contemporary technology trends/concepts and serve as an SME for the business teams

  • Manage issues and bugs within the system using tracking/ support systems; liaise with internal and external resources to facilitate resolution and closure

  • Manage external vendor technology - (AWS products including Redshift & Athena, Dell Boomi, etc.)
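The validation and cleansing practices mentioned above are often codified as a small rule set applied to every inbound record. The field names and rules below are illustrative assumptions, not any company's actual standards.

```python
# Hypothetical validation rules: each field maps to a predicate it must satisfy.
RULES = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record):
    """Return the list of fields that fail their rule (empty list = clean)."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

good = {"email": "a@example.com", "amount": 10.0}
bad = {"email": "not-an-email", "amount": -5}
print(validate(good), validate(bad))  # [] ['email', 'amount']
```

In practice such rules live in a shared library or a data-quality tool so every pipeline applies the same definitions.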


What We Look For

  • Degree required, with preference for business, marketing, or data science; Master's degree a plus

  • 8+ years of experience

  • Excellent verbal and written communication skills

  • Confidence communicating & translating data driven insights and technical concepts into simple terminology for business clients of various levels

  • Passionate about identifying and solving problems for customers and the ability to uncover business needs through direct interaction as well as quantitative & qualitative research to define compelling solutions

  • Professional development experience with and knowledge of relational databases and enterprise data warehouses, including work with MySQL, SQL Server, Oracle, or other common RDBMS

  • Experience with big data technology (Hadoop, Apache Spark, MapReduce, etc.)

  • Knowledge of best practices and principles for data modeling, and data mining

  • Strong project management skills; ability to prioritize and manage client expectations across multiple projects

Carrot Health
  • Minneapolis, MN

Carrot Health is looking for a Data Architect to join our dynamic, growing team in Minneapolis, MN. The Data Engineer is responsible for supporting the development, implementation, and maintenance of Carrot Health's MarketView solutions by creating a best-in-class data architecture to support application functionality and data science.


This is a full-time, salaried, overtime-exempt position based in Minneapolis, MN.


Responsibilities:


The Data Engineer will develop databases and data solutions to support efficiencies in the MarketView platform, including the following:


    • Design and code complex, automated data pipelines for extraction, transformation, and loading of data from a wide variety of data sources into Snowflake.
    • Build and maintain database schemas and supporting documentation.
    • Ensure data pipeline solutions can support integrations into Tableau, R and Python for consumption by the Product Engineering team.
    • Identify, design and implement internal process improvements, such as automating manual processes, optimizing data storage and delivery, creating R functions and packages for the Product Engineering team, etc.
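An automated, idempotent load step of the kind described above can be sketched briefly. SQLite stands in for Snowflake here, and the `members` table and its columns are hypothetical; the point is that replaying a batch must not duplicate rows.

```python
import sqlite3

def ensure_schema(conn):
    """Create (or keep) the target schema — one piece of schema maintenance."""
    conn.execute("""CREATE TABLE IF NOT EXISTS members (
        member_id INTEGER PRIMARY KEY, zip TEXT)""")

def upsert(conn, rows):
    """Idempotent load: re-running the same batch updates rather than duplicates."""
    conn.executemany(
        "INSERT INTO members (member_id, zip) VALUES (?, ?) "
        "ON CONFLICT(member_id) DO UPDATE SET zip = excluded.zip",
        rows,
    )

conn = sqlite3.connect(":memory:")
ensure_schema(conn)
upsert(conn, [(1, "55401"), (2, "55402")])
upsert(conn, [(2, "55403")])  # replay updates member 2 in place
print(conn.execute("SELECT member_id, zip FROM members ORDER BY member_id").fetchall())
```

Snowflake's own `MERGE` statement plays the role of the `ON CONFLICT` clause in this sketch.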


Required Skills:


    • Bachelor's degree in Computer Science.
    • 2+ years of experience designing and implementing software as a Data Engineer and/or Software Engineer.
    • SQL knowledge and experience designing and working with relational databases.
    • Experience using object-oriented patterns to create complex, reusable, configurable software to account for differences and similarities across customers.
    • Experience with Python and/or R.
    • Experience gathering and analyzing data requirements to understand optimal data management.


Desired Skills/Pluses:

    • Experience scripting using Python/Bash.
    • Experience with Snowflake and/or Matillion.
    • Experience with Health Care claims and EMR records.
    • Experience with integrating databases with Tableau.
    • Experience with machine learning algorithms and large-scale execution of models.

Personal Attributes:


    • Creative problem solver who is able to work on ambiguous software projects to create solutions.
    • Employs a strong work ethic and high standards for his/her own work and the work of others.
    • Desire & ability to continually learn and innovate in new technologies.
    • Ability to be self-directed and work independently, as well as collaboratively.
    • Excellent organizational skills and the ability to handle multiple projects at once, with superior attention to detail and follow-through.
    • Written and oral communication skills in the English language.
    • Flexibility and ability to adapt quickly to changes.
Blue Horizon Tek Solutions, Inc.
  • Philadelphia, PA
Company - Global Service Provider
Location - Philadelphia, PA
Position - Data Architect (Senior Level), To $200K Base + Bonus

 
What You'll Need to Succeed:

  • Deep expertise in Data Governance, Data Architecture, Data Protection, Unified Reference Architecture
  • View of Data from a Holistic data architecture perspective
  • Java, Scala, .Net, Python, etc. + Unix/Linux, Windows, MacOS
  • Experience with emerging technologies specifically in security and infrastructure.
  • Parallel and Distributed Processing Systems, Information Retrieval, Data Mining, Natural Language Processing, Machine Learning, Statistical Computing, Very Large Scale Databases, Knowledge Representation, Knowledge Organization, and High-Performance Computing.
 Summary:
 
While rolling up to the Lead Data Architect, you will provide leadership, guidance and mentorship in all data architectural and structural design. In this role there will be a heavy emphasis placed on Data Protection. As the architect, you're always thinking about best practices, governance, core data principles, and more. You will work with diverse business and organizational groups driving data-centric initiatives.
 
Granular Details:
 
(Available Upon Request)
 
 
r2 Technologies, Inc.
  • Dallas, TX

Quality Assurance Analyst (Data Warehouse)

This Dallas-based client has a large, 85+ terabyte data warehouse serving multiple business units. Due to last year's success, they are adding a Data Warehouse Quality Assurance (QA) Analyst to the team. This company is a leader in providing innovative services to their customers. They have a proven track record, a history of innovation, and excellence in Data Warehouse solutions.

Here are the top three skills they are looking for:

  1. Experience in Quality Assurance testing, scripting and remediation in a manual testing environment.
  2. Ability to create a test strategy, test cases, and testing scripts for Data Warehouse ETL processes.
  3. Excellent SQL skills.
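A common test case behind skills 2 and 3 is reconciling a staging table against its loaded target: same row count, same column checksum. The sketch below uses SQLite as a stand-in for the warehouse, and the `stg_sales`/`dw_sales` tables are hypothetical.

```python
import sqlite3

def reconcile(conn, source, target, measure):
    """Compare COUNT(*) and SUM(measure) between two tables; True if they match."""
    q = f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {{}}"
    return (conn.execute(q.format(source)).fetchone()
            == conn.execute(q.format(target)).fetchone())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (amount REAL)")
conn.execute("CREATE TABLE dw_sales (amount REAL)")
conn.executemany("INSERT INTO stg_sales VALUES (?)", [(10.0,), (20.0,)])
conn.executemany("INSERT INTO dw_sales VALUES (?)", [(10.0,), (20.0,)])
print(reconcile(conn, "stg_sales", "dw_sales", "amount"))
```

A full test strategy adds many such checks per subject area — null-rate comparisons, referential checks, duplicate-key scans — but the count/sum reconciliation is the usual starting point.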

If performing a Quality Assurance role in a Data Warehouse is your passion, and you're looking to join highly talented peers in a fun corporate culture, then our client is the place for you! Their dedication to employees, clients, partners, and customers has resulted in tremendous growth and public recognition!

Your day-to-day Development role will:

  • Execute the technical testing strategy and tactical plan for performing Quality Assurance functions for the Enterprise Data Warehouse Platform.
  • Serve as a DW QA Analyst, with project management responsibilities in the day-to-day interaction and requirements-gathering duties with clients, stakeholders, and customers.
  • Work closely with business users to understand key performance metrics and analytics in delivering effective solutions.
  • Gather, document, and communicate accurate, detailed, and extensively diagramed testing requirements at an appropriate level of detail suitable for use by data warehouse architects / analysts and application developers;
  • Plan, execute, and document user interviews and outcomes while building solid relationships with business users, with minimal supervision
  • Collaborate with key members of the design team to design, build, test, and deploy an environment that supports a single version of the truth, actionable results, flexibility for the future, performance, data quality, and data governance.
  • Collaborate in the conceptualization, analysis, development, and deployment strategies for data storage, ETL, and reporting at the enterprise level with minimal supervision. Responsible for the accurate documentation of business terms, reporting needs, calculations, and any appropriate metadata, using available tools
  • Ensure that established standards, quality requirements & best practices are followed in all development efforts.
  • Identify, develop and document detailed business requirements and detailed specifications through the Data Warehouse life cycle integrating with existing data warehouse environment.
  • Review data models, ETL design documents, and BI design documents with appropriate team members to verify they meet business requirements.

We can get you an interview if you have:

·         7+ years' experience with a proven track record as an Information Technology Professional

·         Highly skilled in developing and executing Quality Assurance (QA) testing and remediation aligning technology to business needs

·         Minimum 5 years' experience gathering, understanding, and documenting business reporting requirements for complex data warehouse solutions

·         Highly skilled in SQL programming, stored procedures, and scripting. Experience with Oracle, DB2, Teradata, SQL Server databases.

·         Exposure to or knowledge of InfoSphere products, including: Information Data Architect, DataStage, QualityStage, Business Glossary, Information Analyzer, Metadata Workbench, Information Services Director, Blueprint Director, Fast Track, and Optim in a Data Warehouse environment

·         Ability to partner and build relationships with Senior Management, and communicate to multiple audiences

·         Proactive, great attention to detail, results-oriented problem solver

·         Ability to deal with long term goals along with immediate performance concerns

·         Strong communication skills and the ability to translate highly complex technical issues to non-technical audience

·         Experience with leading BI tools such as MicroStrategy, Cognos, Tableau, Business Objects, Crystal Reports, or others desired.

Why you'll love working at our client:

·         You'll work with an Outstanding group of talented professionals, one of the best in the industry, in a highly collaborative, team and results oriented atmosphere!

·         You'll have the opportunity to work in a dynamic and extremely positive environment where there is always the opportunity to challenge your skills and really move the needle.

·         You'll work with large, sophisticated, and progressive clients throughout North America.

If you are interested in this opportunity, please forward your resume and indicate job title. U.S. citizens and US Permanent residents are encouraged to apply. You must pass a background check, drug test, education verification, and employment verification before starting this assignment. If you feel you do not have all the qualifications to fit this position, please forward this to a friend or co-worker.

We would like to thank everyone who submits his or her resume for this position. Due to the volume of resumes that we receive, only those candidates selected for interviews will be contacted. r2 Technologies offers a competitive salary with excellent benefits, is an equal opportunity employer and promotes a smoke free and drug free workplace.

If you are interested in this position please forward your resume to: trussell@r2now.com.

Learn more about r2 Technologies at www.r2now.com.

Babich & Associates
  • Houston, TX
    • Our Data Architect is responsible for the design, structure, and maintenance of data. The candidate ensures the accuracy and accessibility of data relevant to our organization or a project. The management and organization of data is highly technical and requires advanced skills with computers and proficiency with data-oriented computer languages such as SQL and XML.

      You are required to possess superior analytical skills, be detail-oriented, and have the ability to communicate effectively as part of the larger team within the information technology department. Additionally, you will likely need to explain complex technical concepts to non-technical staff. Since development of data models and logical workflows is common, you must also exhibit advanced visualization skills, as well as creative problem-solving.

      Responsibilities:

      • Plans, architects, designs, analyses, develops, codes, tests, debugs and documents data & analytics platforms to satisfy business requirements for large, complex Data Reservoir/Data Warehouse, Reporting & Analytics development; leads and performs database-level tuning and optimization in support of application development teams on an ad-hoc basis.
      • Analyses business and data requirements to support the implementation of an application's full functionality
      • Contributes to high level functional design used across all Reporting & Analytics applications based on system build and knowledge of business needs
      • Collaborates with fellow team members and keeps the team and other key stakeholders well informed of progress of application business features being developed
      • Create data architecture strategies for each subject area of the enterprise data model.
      • Communicate plans, status and issues to higher management levels.
      • Create and maintain a corporate repository of all data architecture artifacts.
      • Collaborate with the business and other IT organizations to plan a data strategy.
      • Produce all project data architecture deliverables.

    Qualifications

      • 6+ years of experience with demonstrated knowledge in the design & development of data warehouses and/or data reservoir/data lake/data mart platform.
      • 4+ years of expert level experience in data ingestion tools and techniques including ETL & ELT methodologies.
      • 3+ years' experience in one or more of the reporting/visualization tools (Cognos, Tableau or Power BI) is desirable
      • Working proficiency in a selection of software engineering disciplines and a demonstrated understanding of overall software skills, including business analysis, development, testing, deployment, maintenance and improvement of software.
      • Strong communication skills with demonstrated experience coordinating development cycles and project management.
      • Self-starter who can work alone and as part of a larger internal and external team
      • Works well with others and understands the importance of the team
      • Exceptional data analysis skills and problem solving ability
      • Knowledge of advanced analytics tools & methodologies (Python, R etc.)
      • Experience with statistical analysis and predictive modelling skills a plus
      • Bachelor's or Master's degree in computer science or a similar field.
Techaxis, Inc
  • Detroit, MI

Title: Big Data Architect - Autonomous Driving

Location: Open (prefer Detroit, MI)

Duration: Full time / Permanent


Future Manufacturing Enterprise Team: The Future Manufacturing Enterprise Team is focused on the strategic area of the CASE Automotive Business (Connected, Autonomous, Shared & Electric Automotive Business). Autonomous Vehicle Development technologies are the target for this job requirement within the CASE Automotive Business group.
Job Description: The Big Data Architect will support the development of autonomous driving software solutions by playing the role of subject matter expert for the big data systems, underlying data and data products. The candidate must possess first-hand big data establishment experience and be willing to operate in a flat organization structure and across functional teams to leverage the best capabilities available. This role will join a team developing the next generation of autonomous driving solutions, drawing on state-of-the-art technologies including big data analytics, cloud, agile and automation, for solving business problems in the automotive and next-gen mobility industry.
Responsibilities:
    • Architect and design big data storage and analytic solutions on a Hadoop-based platform; create custom analytic and data mining algorithms to improve performance.
    • Responsible for the architecture vision and design of the next-generation data infrastructure required for autonomous driving technologies and future programs.
    • Establish standards and guidelines for the design & development, tuning, deployment and maintenance of information, advanced data analytics, ML/DL, data access frameworks and physical data persistence technologies.
    • Define a new data infrastructure platform to capture vehicle data.
    • Research and develop new data management solutions, approaches and techniques for data privacy, data security, and advanced data analytics systems.
    • Undertake solution response preparation, solution estimation and client presentations for client-based opportunities.
    • Prepare white papers and participate in technology forums through blogs and speaker sessions.
    • Measurement metrics for the role include successful deliverables of Assets & Solutions and successful client engagements.
Experience: Background in Information Technology in the Automotive industry. Experience solving problems with big data, understanding data quality and data management. Research experience in the areas of advanced data techniques, including data ingestion, data processing, data integration, data access, data visualization, text mining, data discovery, statistical methods, database design and implementation.

Qualifications:
    • 5+ years of automotive industry experience working at an OEM / Tier 1
    • 10+ years as a big data architect in a large IT organization
    • Strong experience with Hadoop (HDFS), Spark, Kafka, NoSQL, Cassandra, HBase, the S3 protocol, ObjectStore, and with processing large data stores
    • Strong capability in researching, evaluating and implementing new and improved data solutions for multi-national enterprises
    • Experience with the design and development of multiple object-oriented systems
    • Strong track record driving rapid prototyping and design for Big Data
    • TOGAF Enterprise Architecture certification or equivalent
    • Expertise with cloud vendors such as Azure or AWS
    • Data storage architectures at petabyte scale
    • Knowledge of data privacy and security w.r.t. country-specific laws
    • BS or MS in computer science, engineering, or a related field
Cloudreach
  • Atlanta, GA

Big dreams often start small. From an idea in a London pub, we have grown into a global cloud enabler that operates across 7 countries and speaks over 30 languages.


Our purpose at Cloudreach is to enable innovation. We do this by helping enterprise customers adopt and harness the power of cloud computing. We believe that the growth of a great business can only be fuelled by great people, so join us in our partnership with AWS, Microsoft and Google and help us build one of the most disruptive companies in the cloud industry. It's not your average job, because Cloudreach is not your average company.


What does the Cloud Enablement team do?

Our Cloud Enablement team helps provide consultative, architectural, program and engineering support for our customers' journeys to the cloud. The word 'Enablement' was chosen carefully, to encompass the idea that we support and encourage a collaborative approach to Cloud adoption, sharing best practices, helping change the culture of teams and strategic support to ensure success.


How will you spend your days?

    • Build technical solutions required for optimal ingestion, transformation, and loading of data from a wide variety of data sources using open source, AWS, Azure or GCP big data frameworks and services.
    • Work with the product and software team to provide feedback surrounding data-related technical issues and support for data infrastructure needs uncovered during customer engagements / testing.
    • Understand and formulate processing pipelines of large, complex data sets that meet functional / non-functional business requirements.
    • Create and maintain optimal data pipeline architecture
    • Working alongside the Cloud Architect and Cloud Enablement Manager to implement Data Engineering solutions
    • Collaborating with the customer's data scientists and data stewards/governors during workshop sessions to uncover more detailed business requirements related to data engineering


What kind of career progression can you expect?

    • You can grow into a Cloud Data Engineer Lead or a Cloud Data Architect
    • There are opportunities for relocation to our other cloudy hubs

How to stand out?

    • Experience in building scalable end-to-end data ingestion and processing solutions
    • Good understanding of data infrastructure and distributed computing principles
    • Proficient at implementing data processing workflows using Hadoop and frameworks such as Spark and Flink
    • Good understanding of data governance and how regulations can impact data storage and processing solutions such as GDPR and PCI
    • Ability to identify and select the right tools for a given problem, such as knowing when to use a relational or non-relational database
    • Working knowledge of non-relational and row/columnar based relational databases
    • Experience with Machine Learning toolkits
    • Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
    • This position requires travel up to 70% (M-F) any given week, with an average of 50% per year


What are our cloudy perks?

      • A MacBook Pro and iPhone or Google Pixel (your pick!)
      • Unique cloudy culture -- we work hard and play hard
      • Uncapped holidays and your birthday off
      • World-class training and career development opportunities through our own Cloudy University
      • Centrally-located offices
      • Fully stocked kitchen and team cloudy lunches, taking over a restaurant or two every Friday
      • Office amenities like pool tables and Xbox on the big screen TV
      • Working with a collaborative, social team, and growing your skills faster than you will anywhere else
      • Quarterly kick-off events with the opportunity to travel abroad
      • Full benefits and 401k match


    If you want to learn more, check us out on Glassdoor. Not if. When will you join Cloudreach?


The JPI Group
  • Fort Worth, TX

**OUR CLIENT CANNOT SPONSOR CITIZENSHIP AT THIS TIME. CANDIDATES MUST BE US CITIZENS OR GREEN CARD HOLDERS**

Responsibilities

·         Own data quality and be responsible for the efficiency of data delivery to the data warehouse:

o   Create new database structures to support the needs of the business.

o   Improve performance and usability of existing data warehouse structures, and their related processes.

o   Create tools and reports to diagnose issues and fine-tune data warehouse load performance, storage, and query performance.

·         Act as Enterprise Data Architect

·         Research new platforms that may enhance our data consumers' experience.

o   Propose architectures that:

§ Institute best practices for the current environment.

§ Remain flexible, taking into consideration the possibility of incorporating additional technologies and/or migrating to other data platforms in the future.

·         Build and launch new data models that provide intuitive analytics to our customers (Vertica/Star Schema, Looker analytics)

·         Design and develop new systems and tools to enable clients to optimize and track advertising campaigns (Vertica, Looker, Spark)

·         Work across multiple teams in high visibility roles and own the solution end-to-end

·         Provide support for our existing production systems. We use Datadog and PagerDuty for monitoring and alerting.
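The Vertica/Star Schema work mentioned above (a fact table surrounded by dimension tables) can be sketched with SQLite standing in for Vertica; the table and column names are illustrative only:

```python
import sqlite3

# Toy star schema: one fact table keyed to two dimensions. Names are
# hypothetical; a warehouse like Vertica would use the same shape.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_campaign (campaign_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_clicks  (date_id INTEGER REFERENCES dim_date,
                           campaign_id INTEGER REFERENCES dim_campaign,
                           clicks INTEGER);
INSERT INTO dim_date     VALUES (1, 'Jan'), (2, 'Feb');
INSERT INTO dim_campaign VALUES (10, 'spring_sale');
INSERT INTO fact_clicks  VALUES (1, 10, 100), (2, 10, 250);
""")
# Analytics queries join the fact table to its dimensions and aggregate:
result = conn.execute("""
    SELECT c.name, SUM(f.clicks)
    FROM fact_clicks f
    JOIN dim_campaign c USING (campaign_id)
    GROUP BY c.name
""").fetchall()
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes a tool like Looker able to generate these joins automatically.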

Requirements

·         Extensive experience in data warehouse design/data modeling.

·         Familiarity with horizontally-partitioned columnar databases.

o   Experience using Vertica preferred.

·         Must be comfortable working with PostgreSQL dialect.

·         Proficient in query optimization, which includes:

o   Understanding of how to interpret query execution plans.

o   Ability to diagnose inefficiencies in existing query plans and refactor code and/or create additional data objects to execute more efficiently.

·         Extensive experience in creating efficient ETL processes.

o   Experience using Airflow preferred.

·         Extensive experience in developing BI solutions on platforms that use a metadata/semantic layer

o   Experience using Looker preferred.

·         Proficiency building and supporting applications on Linux topology.

·         Familiarity with OO and FP methodologies and philosophies

o   Experience in Python and/or Ruby development preferred.

·         Moderate experience in Big Data ecosystem (Hadoop, Spark, Kafka, etc.) preferred.

·         Excellent communication skills, including the ability to identify and communicate data-driven insights.

·         BS or MS degree in Computer Science, Software Engineering, or a related technical field. Will consider equivalent experience in the industry.
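The query-optimization requirement above (interpreting execution plans, then creating data objects so queries run efficiently) can be illustrated with SQLite's `EXPLAIN QUERY PLAN`; production work would read Vertica or PostgreSQL plans instead, but the workflow is the same:

```python
import sqlite3

# Before an index exists, the planner reports a full table scan; after
# creating one, the same query becomes an index search.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail)
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # reports a scan of the orders table
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # now reports a search using the new index
```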

Carrot Health
  • Minneapolis, MN

Carrot Health is looking for a Data Architect to join our dynamic, growing team in Minneapolis, MN. The Data Architect is responsible for supporting the development, implementation, and maintenance of Carrot Health's healthcare business intelligence solutions by creating a best-in-class data architecture to support application functionality and data science.


This is a full-time, salaried, overtime-exempt position based in Minneapolis, MN.


Responsibilities


The Data Architect will be responsible for the following:


    • Design and code complex automated data pipelines for optimal extraction, transformation, and loading of data from a wide variety of data sources across AWS and Snowflake.
    • Architect, build and maintain database schemas and supporting documentation.
    • Lead data governance efforts to ensure high quality data.
    • Ensure data pipeline solutions can support integrations into Tableau, R and Python for use by the data science team.
    • Lead meetings to understand the business uses for different types of data.
    • Research, analyze, and recommend database and data pipeline solutions that deliver optimal cost/benefit.
    • Help maintain the integrity and security of Carrot Health's database solutions.
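The data-governance responsibility above comes down to enforcing quality rules before data reaches the data science team. A minimal sketch, in which the rule set and field names are hypothetical:

```python
# Minimal data-quality gate: each rule names a field and a predicate;
# rows failing any rule are routed to a rejects list for review.
RULES = {
    "member_id": lambda v: isinstance(v, int) and v > 0,
    "age":       lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def quality_gate(rows):
    clean, rejects = [], []
    for row in rows:
        failed = [f for f, ok in RULES.items() if not ok(row.get(f))]
        (rejects if failed else clean).append((row, failed))
    return [r for r, _ in clean], rejects

clean, rejects = quality_gate([
    {"member_id": 1, "age": 42},
    {"member_id": -5, "age": 200},   # fails both rules
])
```

Recording *which* rule failed, not just that a row was rejected, is what makes the rejects actionable for data stewards.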

Qualifications


    • Bachelor's degree or higher.
    • Proven experience in a Data Architect, Software Developer or Database Administrator role.
    • Advanced knowledge of data storage theory for both structured and unstructured formats and application to architect data solutions.
    • Experience in SQL, PostgreSQL, R, and Python.
    • Experience with integrating databases with Tableau.
    • Experience with Snowflake.
    • Experience with Health Care claims and EMR records.
    • Experience gathering and analyzing system requirements.
    • Demonstrated ability to develop creative solutions to problems.
    • Excellent data storytelling and technical design skills.
    • Ability to be self-directed and work independently.
    • Ability to creatively problem solve and manage ambiguous projects and tasks.


Personal Attributes


    • Employs a strong work ethic and high standards for his/her own work and the work of others.
    • Excellent organizational skills and the ability to handle multiple projects at once, with superior attention to detail and follow-through.
    • Excellent written and oral communication skills in the English language.
    • Extremely comfortable and effective at presenting to C-suite and senior management levels.
    • Flexibility and ability to adapt quickly to changes.
    • Ability to be self-directed and work independently, as well as collaboratively.
    • Creative problem solver.

Cloudreach
  • Atlanta, GA

Big dreams often start small. From an idea in a London pub, we have grown into a global cloud enabler which operates across 7 countries and speaks over 30 languages.


Our purpose at Cloudreach is to enable innovation. We do this by helping enterprise customers adopt and harness the power of cloud computing. We believe that the growth of a great business can only be fueled by great people, so join us in our partnership with AWS, Microsoft and Google and help us build one of the most disruptive companies in the cloud industry. It's not your average job, because Cloudreach is not your average company.


Mission:

The purpose of a Cloud Data Architect is to design solutions that enable data scientists and analysts to gain insights into data using data-driven, cloud-based services and infrastructures. At Cloudreach, they will be subject matter experts and will be responsible for stakeholder management and technical leadership on data ingestion and processing engagements. A good understanding of cloud platforms and prior experience working with big data tooling and frameworks is required.


What will you do at Cloudreach?

  • Build technical solutions required for optimal ingestion, transformation, and loading of data from a wide variety of data sources using open source, AWS, Azure or GCP big data frameworks and services.
  • Work with the product and software team to provide feedback surrounding data-related technical issues and support for data infrastructure needs uncovered during customer engagements / testing.
  • Understand and formulate processing pipelines of large, complex data sets that meet functional / non-functional business requirements.
  • Create and maintain optimal data pipeline architecture
  • Work alongside Cloud Data Engineers, Cloud System Developers and the Cloud Enablement Manager to implement Data Engineering solutions
  • Collaborate with the customer's data scientists and data stewards during workshop sessions to uncover more detailed business requirements related to data engineering
  • This position requires travel up to 70% (M-F) any given week, with an average of 50% per year


What do we look for?

The Cloud Data Architect has extensive experience working with big data tools and supporting cloud services, a pragmatic mindset focused on translating functional and non-functional requirements into viable architectures, and ideally a consultancy background, leading a highly skilled team on engagements that implement complex and innovative data solutions for clients.


In addition, the Cloud Data Architect thrives in a collaborative and agile environment with an ability to learn new concepts easily.


  • Technical skills:
    • Experience in building scalable end-to-end data ingestion and processing solutions
    • Good understanding of data infrastructure and distributed computing principles
    • Proficient at implementing data processing workflows using Hadoop and frameworks such as Spark and Flink
    • Good understanding of data governance and how regulations such as GDPR and PCI can impact data storage and processing solutions
    • Ability to identify and select the right tools for a given problem, such as knowing when to use a relational or non-relational database
    • Working knowledge of non-relational and row/columnar based relational databases
    • Experience with Machine Learning toolkits
    • Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
    • Demonstrable working experience
      • A successful history of manipulating, processing and extracting value from large disconnected datasets
      • Delivering production scale data engineering solutions leveraging one or more cloud services
      • Confidently taking responsibility for the technical output of a project
      • Ability to quickly pick up new skills and learn on the job
      • Comfortably working with various stakeholders such as data scientists, architects and other developers
    • Solid communication skills: You can clearly articulate the vision and confidently communicate with all stakeholder levels: Cloudreach, Customer, 3rd Parties and Partners - both verbal and written. You are able to identify core messages and act quickly and appropriately.
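"Extracting value from large disconnected datasets," listed above, often starts with joining sources that share no common system, only a key. A toy sketch with two in-memory datasets (the names and fields are hypothetical):

```python
# Two "disconnected" datasets: CRM accounts and raw usage events, joined
# on account_id to compute total usage per customer segment.
accounts = [{"account_id": 1, "segment": "enterprise"},
            {"account_id": 2, "segment": "startup"}]
events = [{"account_id": 1, "minutes": 30},
          {"account_id": 1, "minutes": 15},
          {"account_id": 2, "minutes": 5}]

# Build a lookup once, then stream the larger dataset through it --
# the same hash-join shape a distributed engine would broadcast.
by_account = {a["account_id"]: a["segment"] for a in accounts}
usage = {}
for e in events:
    seg = by_account.get(e["account_id"], "unknown")
    usage[seg] = usage.get(seg, 0) + e["minutes"]
```

At scale the lookup side becomes a broadcast table and the event side a partitioned stream, but the join logic is unchanged.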


    What are our cloudy perks?

    • A MacBook Pro and smartphone.
    • Unique cloudy culture -- we work hard and play hard.
    • Uncapped holidays and your birthday off.
    • World-class training and career development opportunities through our own Cloudy University.
    • Centrally-located offices.
    • Fully stocked kitchen and team cloudy lunches, taking over a restaurant or two every Friday.
    • Office amenities like pool tables and Xbox on the big screen TV.
    • Working with a collaborative, social team, and growing your skills faster than you will anywhere else.
    • Full benefits and 401k match.
    • Kick-off events at cool locations throughout the country.
MRE Consulting, Ltd.
  • Houston, TX

Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.


Our client is seeking to hire an Enterprise Data Architect. The position reports to the VP IT. The Data Architect is responsible for providing a standard common business vocabulary across all applications and data elements, expressing and defining strategic data requirements, outlining high level integrated designs to meet the various business unit requirements, and aligning with the overall enterprise strategy and related business architecture.


Essential Duties & Responsibilities:
Provide insight and strategies for changing database storage and utilization requirements for the company and provide direction on potential solutions
Assist in the definition and implementation of a federated data model consisting of a mixture of multi-cloud and on premises environments to support operations and business strategies
Assist in managing vendor cloud environments and multi-cloud database connectivity.
Analyze structural data requirements for new/existing applications and platforms
Submit reports to management that outline the changing data needs of the company and develop related solutions
Align database implementation methods to make sure they support company policies and any external regulations
Interpret data, analyze results and provide ongoing reporting and support
Implement data collection systems and other strategies that optimize efficiency and data quality
Acquire available data sources and maintain data systems
Identify, analyze, and interpret trends or patterns in data sets
Scrub data as needed, review reports, printouts, and performance indicators to identify inconsistencies
Develop database design and architecture documentation for the management and executive teams
Monitor various database systems to confirm optimal performance standards are met
Contribute to content updates within resource portals and other operational needs
Assist in presentations and interpretations of analytical findings and actively participate in discussions of results, internally and externally
Help maintain the integrity and security of the company database
Ensure transactional activities are processed in accordance with standard operating procedures. The employee will be on call 24 hours a day, 7 days per week.


Qualifications
Minimum of 10+ years of experience.
Proven work experience as a Data Architect, Data Scientist, or similar role
In-depth understanding of database structure principles
Strong knowledge of data mining and segmentation techniques
Expertise in MS SQL and other database platforms
Familiarity with data visualization tools
Experience with formal Enterprise Architecture tools (like BiZZdesign)
Experience in managing cloud-based environments
Aptitude with data models, data mining, and cloud-based applications.
Advanced analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
Adept at report writing and presenting findings
Proficiency in systems support and monitoring
Experience with complex data structures in the Oil and Gas Industry a plus


Education 
A bachelor's degree in Computer Science, Math, Statistics, or a related quantitative field is required.


Travel Requirements
The percentage of travel anticipated for this position is 10-20%, including overnight extended stays.


All qualified candidates should apply by providing a current Word resume and denoting skill set experience as it relates to this requirement.

Experian
  • Austin, TX

Experian Consumer Services – Careers That Define “What’s the Next Big (Data) Thing” for Consumers?

What could be more exciting – personally and professionally – than being part of a “disruptive” business? Consider taking your career to the next level by joining the Leader that continues to disrupt the competition. As the “disruptor” and market leader we pride ourselves on building new markets, leading the pack through continuous evolution and innovation. It’s a position Experian Consumer Services has enjoyed for more than a decade and we’re always looking for the talent that can help expand that lead.

When you’re the leader, it’s always urgent, important and market-changing. We think that defines the true “disruptive” business. Join us and create some chaos for the competition.

Key Responsibilities:



  • Drive enterprise-wide technology integration strategy-setting and implementation based on leading-edge industry standards, best practices and comprehensive understanding of business operations.

  • Responsible for design and implementation of integration strategy, architecture and platforms

  • Accountable for adhering to enterprise architecture standards, ensuring integration technology standards and best practices are maintained across the organization and leading enterprise architecture strategy-setting

  • Leads governance processes of new technologies and solutions to ensure consistent technology life cycle management

  • Lead integration efforts across all business areas and client groups including key data and infrastructure components across the enterprise.


Knowledge, Skills, & Experience



  • 7+ years of experience in MySQL

  • Deep knowledge in MySQL databases – physical / logical database design, security model, HA/DR, performance optimization, DBA and application design/development

  • Ability to create clear, detailed, concise documentation — architecture diagrams, presentations, and design documents

  • Ability to identify issues with existing schemas, suggest improvements

  • 2+ years of hands-on experience with Apache Hadoop, Spark and Python.

  • Excellent understanding of Entity Relationship Diagrams (ERDs)

  • Experience working in cloud computing and distributed data environments is a PLUS

  • Strong data modeling and design skills.

  • Experience with data migration and ETL tools.

  • Strong experience in translating business requirements to data solutions.

  • Experience in large data environments with highly performant, highly scalable solutions.

  • Experience with Unix/Linux operating environments as well as shell scripts.

  • Strong SQL and PL/SQL development skills.

  • Strong at diagnosing queries and improving/suggesting improvements to SQL queries for efficiency, latency, etc.

  • Ability to work in a team with highly motivated people.

  • Data analysis, modeling, and integration.

  • Database management, design, and development.

  • Strong debugging and technical troubleshooting skills.

  • Ability to prioritize and work on multiple projects concurrently.

  • A proactive approach to problem solving and decision-making.

  • Strong attention to detail.

  • Excellent written and oral communication skills.

  • Collaborate effectively with the database and other technology teams.

  • Knowledge of release management and version control practices and procedures.

National Oilwell Varco
  • Houston, TX

NOV is seeking an experienced Data Modeler to assist in building and supporting RigSystems & Aftermarket's Data Warehouse.  This resource will be responsible for separating different types of data into structures that can be easily processed by various systems. This resource will also focus on a variety of issues, such as enhancing data migration from one system to another and eliminating data redundancy.  

Duties and responsibilities include:

  • Understand and translate business needs into data models supporting long-term solutions.

  • Work with the Application Development team to implement data strategies, build data flows and develop conceptual data models.

  • Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.

  • Optimize and update logical and physical data models to support new and existing projects.

  • Maintain conceptual, logical and physical data models along with corresponding metadata.

  • Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.

  • Recommend opportunities for reuse of data models in new environments.

  • Perform reverse engineering of physical data models from databases and SQL scripts.

  • Evaluate data models and physical databases for variances and discrepancies.

  • Validate business data objects for accuracy and completeness.

  • Analyze data-related system integration challenges and propose appropriate solutions.

  • Develop data models according to company standards.

  • Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces.

  • Review modifications to existing software to improve efficiency and performance.

  • Examine new application design and recommend corrections if required.
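Reverse-engineering a physical data model, one of the duties above, means reading structure back out of a live database. SQLite's `PRAGMA table_info` shows the idea in miniature; an ERwin workflow would target Oracle or SQL Server catalogs instead, and the table here is hypothetical:

```python
import sqlite3

# Introspect an existing database to recover its physical model:
# tables, column names, declared types, and primary-key membership.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rig (rig_id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

model = {}
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
for (table,) in tables:
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    # PRAGMA table_info rows: (cid, name, type, notnull, default, pk)
    model[table] = [(c[1], c[2], bool(c[5])) for c in cols]
```

The recovered `model` dictionary is the raw material a modeling tool turns into a diagram.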

Qualifications / Requirements:

  • Bachelor's degree in Information Technology or Computer Science

  • 3+ years' experience as a Data Modeler/Data Architect

  • Proficient in the use of data modeling tools; ERwin proficiency is a must.

  • Experience in metadata management and data integration engines such as BizTalk or Informatica

  • Experience in supporting as well as implementing Oracle and SQL data infrastructures

  • Knowledge of the entire process behind software development including design and deployment (SOA knowledge and experience is a bonus)

  • Expert analytical and problem-solving traits

  • Knowledge of the design, development and maintenance of various data models and their components

  • Understand BI tools and technologies as well as the optimization of underlying databases

Mercedes-Benz USA
  • Atlanta, GA

Job Overview

Mercedes-Benz USA is recruiting a Big Data Architect, a newly created position within the Information Technology Infrastructure Department. This position is responsible for refining and creating the next step in technology for our organization. In this role you will act as contact person and agile enabler for all questions regarding new IT infrastructure and services in context of Big Data solutions.

Responsibilities

  • Leverage sophisticated Big Data technologies into current and future business applications

  • Lead infrastructure projects for the implementation of new Big Data solutions

  • Design and implement modern, scalable data center architectures (on premise, hybrid or cloud) that meet the requirements of our business partners

  • Ensure the architecture is optimized for large dataset acquisition, analysis, storage, cleansing, transformation and reclamation

  • Create the requirements analysis, the platform selection and the design of the technical architecture

  • Develop IT infrastructure roadmaps and implement strategies around data science initiatives

  • Lead the research and evaluation of emerging technologies, industry and market trends to assist in project development and operational support activities

  • Work closely with the application teams to exceed our business partners' expectations

Qualifications 

Education

Bachelor's Degree (accredited school) with emphasis in:

Computer/Information Science

Information Technology

Engineering

Management Information System (MIS)

Must have 5-7 years of experience in the following:

  • Architecture, design, implementation, operation and maintenance of Big Data solutions

  • Hands-on experience with major Big Data technologies and frameworks including Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra.

  • Experience with Big Data solutions deployed in large cloud computing infrastructures such as AWS, GCE and Azure

  • Strong knowledge of programming and scripting languages such as Java, PHP, Ruby, and Python, plus Linux shell scripting

  • Big Data query tools such as Pig, Hive and Impala

  • Project Management Skills:

  • Ability to develop plans/projects from conceptualization to implementation

  • Ability to organize workflow and direct tasks as well as document milestones and ROIs and resolve problems

Proven experience with the following:

  • Open source software such as Hadoop and Red Hat

  • Shell scripting

  • Servers, storage, networking, and data archival/backup solutions

  • Industry knowledge and experience in areas such as Software Defined Networking (SDN), IT infrastructure and systems security, and cloud or network systems management
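The MapReduce model named in the requirements above can be shown in a few lines of plain Python. This is the programming model only, not Hadoop itself; a real job would distribute the map, shuffle, and reduce phases across a cluster:

```python
from itertools import groupby
from operator import itemgetter

# Word count, the canonical MapReduce example, as plain functions.
def map_phase(documents):
    # Emit (key, 1) pairs for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    pairs = sorted(pairs)                      # the "shuffle/sort" step
    return {key: sum(v for _, v in group)
            for key, group in groupby(pairs, key=itemgetter(0))}

counts = reduce_phase(map_phase(["Big Data", "big data tools"]))
```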

Additional Skills
Focus on problem resolution and troubleshooting
Knowledge of hardware capabilities, software interfaces, and applications
Ability to produce quality digital assets/products
 
EEO Statement
Mercedes-Benz USA is committed to fostering an inclusive environment that appreciates and leverages the diversity of our team. We provide equal employment opportunity (EEO) to all qualified applicants and employees without regard to race, color, ethnicity, gender, age, national origin, religion, marital status, veteran status, physical or other disability, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local law.

Mindtree
  • Dallas, TX

Mindtree delivers digital transformation and technology services from ideation to execution, enabling Global 2000 clients to outperform the competition. Born digital, Mindtree takes an agile, collaborative approach to creating customized solutions across the digital value chain. We are co-headquartered in Bangalore, India and New Jersey. Founded in 1999, we have 16,000+ Mindtree Minds across the globe. Our annual revenue exceeded $780 million in 2016-17. Mindtree provides services in e-commerce, mobile applications, cloud computing, digital transformation, data analytics, EAI and ERP, with more than 290 clients and offices in 14 countries. Visit http://careers.mindtree.com for more details.


Location: Dallas Texas

Senior Architect Big Data  

Total # of years of experience: 12-15 years

CoE: Cloud Computing, Digital Business

Role Description: Architecture Experience

12-15 years of experience in designing, architecting and implementing large-scale data processing, data storage and data distribution systems

Experience in defining & realizing end-to-end Solution Architecture for large & complex systems

Experience in designing applications on Azure

Proficient in reviewing technical documents such as Architecture views, Technology Architecture blueprints and design specifications

Follows technology trends and is able to correlate them with business needs

Experience in Architecture consulting engagements

Ability to frame architectural decisions, provide technology leadership & direction

Excellent problem solving, hands-on engineering and communication skills

Developing and maintaining strong client relations with senior and C-level executives, developing new insights into the client's business model and pain points, and delivering actionable, high-impact results

Participating in and leading client engagements to develop plans and strategies for clients' data management processes and IT programs, providing hands-on assistance in data modeling and the technical implementation of vendor products and practices

Scope client requirements, specify solutions and articulate their value to customers, and provide best-practice advice to customers and team members.

Technology & Engineering Expertise

Ability to work with multi-technology, cross-functional teams and customer stakeholders to guide and manage the full life cycle of a Spark-based Data Engineering solution

Extensive experience in data modeling and database design involving any combination of Azure data warehousing and Business Intelligence systems and tools, relational and MPP database platforms, or the open-source Big Data stack

Hands-on experience with the administration, configuration management, monitoring and performance tuning of Spark/distributed platforms

Strong understanding of Big Data Analytics platforms and ETL in the context of Big Data

Hands-on experience architecting solutions involving Big Data, Spark, Azure SQL DWH, Azure SQL, Cosmos DB, Azure App Services & Power BI

Hands-on experience working with Spark on Databricks

Key Responsibilities  

Define and own the Solution Architecture for Big Data, Spark & Azure services from the definition phase to the go-live phase

Ensure clarity on non-functional requirements (NFRs), address them during the Architecture definition phase, and make sure they are realized during the engineering and deployment phases

Define Logical, Technical & Physical views of the Architecture proposed for Big Data

Define reusable components/frameworks, common schemas, and the standards & tools to be used

Review designs to ensure they are aligned with the Architecture