OnlyDataJobs.com

Perficient, Inc.
  • Phoenix, AZ
At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit, which set us apart and keep our colleagues impassioned, driven, and fulfilled.
Perficient Data Solutions is looking for an experienced Hadoop Administrator with experience administering Cloudera on AWS. This position is based in Boston; however, the candidate can be located in any well-connected city. Perficient is on a mission to help enterprises take advantage of modern data and analytics architectures, tools, and patterns to improve business operations and better engage customers. This is an excellent opportunity for the right individual to help Perficient and its customers grow the capabilities necessary to improve care through better use of data and information, and in the process take their career to the next level.
Job Overview
The Hadoop System Administrator (SA) is responsible for effective provisioning, installation/configuration, operation, and maintenance of systems hardware and software and related infrastructure to enable Hadoop and analytics on Big Data. This individual participates in technical research and development to enable continuing innovation within the infrastructure. This individual ensures that system hardware, operating systems, software systems, and related procedures adhere to organizational values, enabling staff, volunteers, and Partners.
This individual will assist project teams with technical issues in the Initiation and Planning phases of our standard Project Management Methodology. These activities include the definition of needs, benefits, and technical strategy; research & development within the project life-cycle; technical analysis and design; and support of operations staff in executing, testing and rolling-out the solutions. Participation on projects is focused on smoothing the transition of projects from development staff to production staff by performing operations activities within the project life-cycle.
This individual is accountable for the following systems: Linux and Windows systems that support GIS infrastructure; Linux, Windows and Application systems that support Asset Management; Responsibilities on these systems include SA engineering and provisioning, operations and support, maintenance and research and development to ensure continual innovation.
Responsibilities
  • Provide end-to-end vision and hands-on experience with the Cloudera and AWS platforms, especially best practices around Hive and HBase
  • Experience automating common administrative tasks in Cloudera and AWS
  • Troubleshoot and develop on Hadoop technologies including HDFS, Kafka, Hive, Pig, Flume, HBase, Spark, and Impala, plus Hadoop ETL development via tools such as ODI for Big Data and APIs to extract data from source systems; troubleshoot AWS technologies such as EMR, EC2, S3, CloudFormation, etc.
  • Translate, load, and present disparate datasets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data
  • Administer Cloudera clusters on AWS, covering services, security, scalability, configuration, availability, and access
  • Lead workshops with multiple teams to define data ingestion, validation, transformation, data engineering, and data modeling
  • Performance-tune Hive and HBase jobs (a brief tuning sketch follows this list)
  • Design and develop open-source platform components using Spark, Sqoop, Java, Oozie, Kafka, Python, and other components (a plus)
  • Lead capacity-planning and requirements-gathering phases, including estimating, developing, testing, architecting, managing, and delivering complex projects
  • Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates
  • Contribute to thought capital by creating architecture documents and executive presentations, and articulate them to executive audiences
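The bullet on performance-tuning Hive jobs above is the kind of work that usually starts with storage layout and partitioning. The sketch below (Python with PySpark and Hive support) is a minimal illustration only; the table, its columns, and the staging_sales source are hypothetical, and the settings shown are common dynamic-partitioning knobs rather than a prescribed Perficient approach.

    from pyspark.sql import SparkSession

    # Hypothetical example: store data as ORC and partition by date so queries
    # can prune partitions instead of scanning the full table.
    spark = (SparkSession.builder
             .appName("hive-tuning-sketch")
             .enableHiveSupport()
             .getOrCreate())

    spark.sql("""
        CREATE TABLE IF NOT EXISTS sales_orc (
            order_id BIGINT,
            amount   DOUBLE
        )
        PARTITIONED BY (order_date STRING)
        STORED AS ORC
    """)

    # Allow dynamic partition inserts, then load from a (hypothetical) staging table.
    spark.sql("SET hive.exec.dynamic.partition=true")
    spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
    spark.sql("""
        INSERT OVERWRITE TABLE sales_orc PARTITION (order_date)
        SELECT order_id, amount, order_date FROM staging_sales
    """)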
Qualifications
  • 3+ years of Hadoop administration experience
  • Cloudera and AWS certifications are strongly desired.
  • Bachelor's degree with a technical major, such as engineering or computer science.
  • Four to six years of Linux/Unix system administration experience.
  • Ability to travel up to 50 percent preferred.
Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
More About Perficient
Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
Select work authorization questions to ask when applicants apply
  • Are you legally authorized to work in the United States?
  • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
Perficient, Inc.
  • Detroit, MI
At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit, which set us apart and keep our colleagues impassioned, driven, and fulfilled.
Perficient Data Solutions is looking for an experienced Hadoop Administrator with experience administering Cloudera on AWS. This position is based in Boston; however, the candidate can be located in any well-connected city. Perficient is on a mission to help enterprises take advantage of modern data and analytics architectures, tools, and patterns to improve business operations and better engage customers. This is an excellent opportunity for the right individual to help Perficient and its customers grow the capabilities necessary to improve care through better use of data and information, and in the process take their career to the next level.
Job Overview
The Hadoop System Administrator (SA) is responsible for effective provisioning, installation/configuration, operation, and maintenance of systems hardware and software and related infrastructure to enable Hadoop and analytics on Big Data. This individual participates in technical research and development to enable continuing innovation within the infrastructure. This individual ensures that system hardware, operating systems, software systems, and related procedures adhere to organizational values, enabling staff, volunteers, and Partners.
This individual will assist project teams with technical issues in the Initiation and Planning phases of our standard Project Management Methodology. These activities include the definition of needs, benefits, and technical strategy; research & development within the project life-cycle; technical analysis and design; and support of operations staff in executing, testing and rolling-out the solutions. Participation on projects is focused on smoothing the transition of projects from development staff to production staff by performing operations activities within the project life-cycle.
This individual is accountable for the following systems: Linux and Windows systems that support GIS infrastructure; Linux, Windows and Application systems that support Asset Management; Responsibilities on these systems include SA engineering and provisioning, operations and support, maintenance and research and development to ensure continual innovation.
Responsibilities
  • Provide end-to-end vision and hands-on experience with the Cloudera and AWS platforms, especially best practices around Hive and HBase
  • Experience automating common administrative tasks in Cloudera and AWS (a brief automation sketch follows this list)
  • Troubleshoot and develop on Hadoop technologies including HDFS, Kafka, Hive, Pig, Flume, HBase, Spark, and Impala, plus Hadoop ETL development via tools such as ODI for Big Data and APIs to extract data from source systems; troubleshoot AWS technologies such as EMR, EC2, S3, CloudFormation, etc.
  • Translate, load, and present disparate datasets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data
  • Administer Cloudera clusters on AWS, covering services, security, scalability, configuration, availability, and access
  • Lead workshops with multiple teams to define data ingestion, validation, transformation, data engineering, and data modeling
  • Performance-tune Hive and HBase jobs
  • Design and develop open-source platform components using Spark, Sqoop, Java, Oozie, Kafka, Python, and other components (a plus)
  • Lead capacity-planning and requirements-gathering phases, including estimating, developing, testing, architecting, managing, and delivering complex projects
  • Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates
  • Contribute to thought capital by creating architecture documents and executive presentations, and articulate them to executive audiences
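As a hedged illustration of the automation bullet above, the Python sketch below lists the EC2 instances behind a Cloudera cluster that carry a hypothetical cluster=cdh-prod tag and prints their state. The tag name, region, and use case are assumptions, not details of this role's actual environment.

    import boto3

    # Hypothetical inventory check for cluster nodes; tag and region are placeholders.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "tag:cluster", "Values": ["cdh-prod"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                print(instance["InstanceId"], instance["State"]["Name"])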
Qualifications
  • 3+ years of Hadoop administration experience
  • Cloudera and AWS certifications are strongly desired.
  • Bachelor's degree with a technical major, such as engineering or computer science.
  • Four to six years of Linux/Unix system administration experience.
  • Ability to travel up to 50 percent preferred.
Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
More About Perficient
Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
Select work authorization questions to ask when applicants apply
  • Are you legally authorized to work in the United States?
  • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
Perficient, Inc.
  • Dallas, TX
At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit, which set us apart and keep our colleagues impassioned, driven, and fulfilled.
Perficient Data Solutions is looking for an experienced Hadoop Administrator with experience administering Cloudera on AWS. This position is based in Boston; however, the candidate can be located in any well-connected city. Perficient is on a mission to help enterprises take advantage of modern data and analytics architectures, tools, and patterns to improve business operations and better engage customers. This is an excellent opportunity for the right individual to help Perficient and its customers grow the capabilities necessary to improve care through better use of data and information, and in the process take their career to the next level.
Job Overview
The Hadoop System Administrator (SA) is responsible for effective provisioning, installation/configuration, operation, and maintenance of systems hardware and software and related infrastructure to enable Hadoop and analytics on Big Data. This individual participates in technical research and development to enable continuing innovation within the infrastructure. This individual ensures that system hardware, operating systems, software systems, and related procedures adhere to organizational values, enabling staff, volunteers, and Partners.
This individual will assist project teams with technical issues in the Initiation and Planning phases of our standard Project Management Methodology. These activities include the definition of needs, benefits, and technical strategy; research & development within the project life-cycle; technical analysis and design; and support of operations staff in executing, testing and rolling-out the solutions. Participation on projects is focused on smoothing the transition of projects from development staff to production staff by performing operations activities within the project life-cycle.
This individual is accountable for the following systems: Linux and Windows systems that support GIS infrastructure; Linux, Windows and Application systems that support Asset Management; Responsibilities on these systems include SA engineering and provisioning, operations and support, maintenance and research and development to ensure continual innovation.
Responsibilities
  • Provide end-to-end vision and hands-on experience with the Cloudera and AWS platforms, especially best practices around Hive and HBase
  • Experience automating common administrative tasks in Cloudera and AWS
  • Troubleshoot and develop on Hadoop technologies including HDFS, Kafka, Hive, Pig, Flume, HBase, Spark, and Impala, plus Hadoop ETL development via tools such as ODI for Big Data and APIs to extract data from source systems; troubleshoot AWS technologies such as EMR, EC2, S3, CloudFormation, etc.
  • Translate, load, and present disparate datasets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data (a brief loading sketch follows this list)
  • Administer Cloudera clusters on AWS, covering services, security, scalability, configuration, availability, and access
  • Lead workshops with multiple teams to define data ingestion, validation, transformation, data engineering, and data modeling
  • Performance-tune Hive and HBase jobs
  • Design and develop open-source platform components using Spark, Sqoop, Java, Oozie, Kafka, Python, and other components (a plus)
  • Lead capacity-planning and requirements-gathering phases, including estimating, developing, testing, architecting, managing, and delivering complex projects
  • Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates
  • Contribute to thought capital by creating architecture documents and executive presentations, and articulate them to executive audiences
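The bullet above on loading disparate datasets (JSON, Avro, text) could look roughly like the PySpark sketch below. Paths and view names are placeholders, and reading Avro assumes the spark-avro package is available at submit time; this is illustrative only.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("load-sketch").getOrCreate()

    # JSON and plain text are supported out of the box; paths are placeholders.
    clicks = spark.read.json("s3a://example-bucket/raw/clicks/")
    logs   = spark.read.text("s3a://example-bucket/raw/app-logs/")

    # Avro requires the spark-avro package on the classpath
    # (e.g. --packages org.apache.spark:spark-avro_2.12:<version>).
    events = spark.read.format("avro").load("s3a://example-bucket/raw/events/")

    # Expose the disparate sources through one SQL interface for presentation.
    clicks.createOrReplaceTempView("clicks")
    events.createOrReplaceTempView("events")
    spark.sql("SELECT COUNT(*) AS click_count FROM clicks").show()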
Qualifications
  • 3+ years of Hadoop administration experience
  • Cloudera and AWS certifications are strongly desired.
  • Bachelor's degree with a technical major, such as engineering or computer science.
  • Four to six years of Linux/Unix system administration experience.
  • Ability to travel up to 50 percent preferred.
Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
More About Perficient
Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
Select work authorization questions to ask when applicants apply
  • Are you legally authorized to work in the United States?
  • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
WB Solutions LLC
  • Houston, TX

Role: Hadoop Engineer

Location: Irving, TX

Duration: Long Term Contract

Requirement:

    • The majority of the work is related to Hadoop rather than Oracle PL/SQL, ODI, or OBIEE
    • Data science experience building statistical models and deriving intelligence from them
    • Proven understanding of and experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume, MapReduce, and Apache Spark, as well as Unix, core Java programming, Scala, and shell scripting
    • Hands-on experience with Oozie job scheduling, ZooKeeper, Solr, Elasticsearch, Storm, Logstash, or other similar technologies
    • Solid experience writing SQL and stored procedures and tuning query performance, preferably on Oracle 12c and ODI jobs

Responsibilities:

    • Participate in Agile development on a large Hadoop-based data platform as a member of a distributed team
    • Develop different statistical models and build data insights into possible revenue leakage
    • Code programs to load data from diverse data sources into ODI, OBIEE, and Hive structures using Sqoop and other tools (see the import sketch after this list).
    • Translate complex functional and technical requirements into detailed design.
    • Analyze vast data stores.
    • Code business logic using Scala on Apache Spark.
    • Create workflows using Oozie.
    • Code and test prototypes.
    • Code to existing frameworks where applicable.
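As referenced in the Sqoop bullet above, a batch import into Hive can be scripted in Python as a thin wrapper around the sqoop CLI. This is a minimal sketch; the JDBC URL, credentials file, and table names are placeholders, not details of the actual engagement.

    import subprocess

    # Hypothetical Sqoop batch import from Oracle into a Hive staging table.
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB",   # placeholder
        "--username", "etl_user",
        "--password-file", "hdfs:///user/etl/.db_password",
        "--table", "BILLING_EVENTS",
        "--hive-import",
        "--hive-table", "staging.billing_events",
        "--num-mappers", "4",
    ]
    subprocess.run(cmd, check=True)  # raises CalledProcessError if the import fails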

EDUCATION/CERTIFICATIONS:

    • Bachelor's or Master's degree in Computer Engineering or Information Technology
    • Oracle (ODI, OBIEE) or Hadoop certification is an added advantage
Ciber
  • Detroit, MI

Live IT Up at Ciber Global

At Ciber Global, we believe the most inspired, innovative and industrious companies should win, regardless of size or legacy. We're the small company that cares and the big company that can. We deliver breakthrough performances and powerful solutions that are anything but cookie cutter in order to give our clients the competitive advantage they deserve.

Work.

Position Description:

  • The Global Data Insight and Analytics (GDIA) Portfolio is developing a new Customer Analytics Platform utilizing multiple technologies like Hadoop, Java EE, MDM and Quality Stage.
  • Over the next several years this Application Platform will be developed globally for markets in Mexico, Europe, South America, Middle East, and Asia-Pacific.
  • This is a senior specialty developer position supporting the development of the Customer Analytics Platform and related projects
Job Description: 
  • Understand the strategic direction as it relates to project goals and design a global solution that aligns with them
  • Understand and participate in designing information workflows
  • Design and develop Hadoop/Java code and workflows that satisfy the requirements and the designed solution
  • Perform unit, functional and integration testing of developed solution
  • Provide support to other team members
  • Support integration efforts of systems and data through application consolidation / migration / conversion, application integration, and data integration

Skills Required:

  • 5+ years of professional programming experience
  • Strong foundation in Computer Science fundamentals such as data structures and algorithms
  • Experience with Java
  • Passionate about Big Data technologies
  • Working knowledge of the full SW development lifecycle
  • Proven ability to develop and ship high quality software products
  • Experience with the Hortonworks Hadoop environment
  • Knowledge of Big Data security and management operations
  • Knowledge of visualization tools such as Tableau or Qlik.
  • Hortonworks Hadoop services: HDFS, Hive, HBase, Spark, Kafka, Sqoop, Ambari, Knox, Ranger, Oozie, Kerberos, YARN, and job monitoring (a streaming-ingest sketch follows this list)
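To make the stack above concrete, the sketch below shows one hedged way a Kafka-to-HDFS ingestion job might look in PySpark Structured Streaming. The broker address, topic, and paths are placeholders, and the job assumes the spark-sql-kafka package is supplied at submit time; it is illustrative, not the platform's actual design.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Read a (hypothetical) customer-events topic and keep key/value as strings.
    stream = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker1:6667")
              .option("subscribe", "customer-events")
              .load()
              .select(col("key").cast("string"), col("value").cast("string")))

    # Land the raw events on HDFS as Parquet, with checkpointing for fault tolerance.
    query = (stream.writeStream.format("parquet")
             .option("path", "hdfs:///data/raw/customer_events")
             .option("checkpointLocation", "hdfs:///checkpoints/customer_events")
             .start())
    query.awaitTermination()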

Education Required:

B.S. Information Systems, Computer Science or equivalent work experience in the requested field

This position requires the successful completion of a background investigation and/or drug screen.

Ciber Global is an Equal Opportunity Employer Minorities/Females/Gender Identity/Sexual Orientation/Protected Veterans/Individuals with Disabilities.

Play.

Keep discovering! Check here: https://www.youtube.com/watch?v=8Wk6XjKBMf0

Grow.

Ciber Global is an IT consulting company that partners with organizations to develop technology strategies and solutions that deliver tangible business value. Founded in 1974, Ciber is an HTC Global Services company. For more information, visit www.Ciber.com.

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Analytics Platform Administrator is accountable for managing big data environments on bare metal, container infrastructure, or a cloud platform. This role is responsible for system design, capacity planning, performance tuning, and ongoing monitoring of the data lake environment. As a lead administrator, this role will also manage the day-to-day work of any onshore and offshore contractors on the platforms team. The position reports to the Director of Analytic Platforms and is located in Houston, TX.
Responsibilities May Include
  • Work with IT Operations and Information Security Operations for monitoring, troubleshooting, and support of incidents to maintain service levels
  • Provide 24/7 coverage for analytics platforms
  • Monitor the performance of the systems and ensure high uptime (a brief monitoring sketch follows this list)
  • Deploy new and maintain existing data lake environments on Hadoop or AWS/Azure stack
  • Work closely with the various teams to make sure that all the big data applications are highly available and performing as expected. The teams include data science, database, network, BI, application, etc.
  • Work with AICOE and business analysts on designing and running technology proof of concepts on Analytics platforms
  • Capacity planning of the data lake environment
  • Manage and review log files, backup and recovery, upgrades, etc.
  • Responsible for security management of the platforms
  • Support of our on-premise Hortonworks Hadoop environment
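As a minimal sketch of the monitoring bullet above, the Python snippet below parses the standard hdfs dfsadmin -report output and flags usage above a hypothetical 80% threshold. A real deployment would feed an alerting system rather than print to stdout, and the command generally requires HDFS admin privileges.

    import subprocess

    # Capture the cluster usage report (requires HDFS admin rights).
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True
    ).stdout

    # The report includes a cluster-wide "DFS Used%" line near the top.
    for line in report.splitlines():
        if line.startswith("DFS Used%"):
            used_pct = float(line.split(":")[1].strip().rstrip("%"))
            status = "ALERT" if used_pct > 80.0 else "ok"   # threshold is illustrative
            print(f"{status}: HDFS used {used_pct:.1f}%")
            break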
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of related IT experience
  • 3+ years of Structured Query Language (SQL) experience
  • 1+ years of experience with Hadoop technology stack (HDFS, HBase, Spark, Sqoop, Hive, Ranger, NiFi, etc.)
  • Intermediate proficiency analyzing and understanding business/technology system architectures, databases, and client applications
Preferred
  • Bachelor's Degree in Computer Science, MIS, Information Technology or other related technical discipline
  • 1+ years of experience with AWS or Azure analytics stack
  • 1+ years of experience architecting data warehouses and/or data lakes
  • 1+ years of Oil and Gas Industry Experience
  • Delivery experience with enterprise databases and/or data warehouse platforms such as Oracle, SQL Server or Teradata
  • Automation experience with Python, PowerShell or a similar technology
  • Experience with source control and automated deployment. Useful technologies include Git, Jenkins, and Ansible
  • Experience with complex networking infrastructure including firewalls, VLANs, and load balancers
  • Experience as a DBA or Linux Admin
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to learn new technologies quickly and leverage them in data analytics solutions
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Ability to work as a team player
  • Strong analytical, troubleshooting, and problem-solving skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of March 11, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Mar 4, 2019, 1:39:58 PM
phData
  • Minneapolis, MN

If you're inspired by innovation, hard work and a passion for data, this may be the ideal opportunity to leverage your background in big data and your software engineering, data engineering or data analytics experience to design, develop and innovate big data solutions for a diverse set of global and enterprise clients.


At phData, our proven success has skyrocketed the demand for our services, resulting in quality growth at our company headquarters conveniently located in Downtown Minneapolis and expanding throughout the US. Notably we've also been voted Best Company to Work For in Minneapolis for the last 2 years.   


As the world’s largest pure-play Big Data services firm, our team includes Apache committers, Spark experts and the most knowledgeable Scala development team in the industry. phData has earned the trust of customers by demonstrating our mastery of Hadoop services and our commitment to excellence.


In addition to a phenomenal growth and learning opportunity, we offer competitive compensation and excellent perks including base salary, annual bonus, extensive training, paid Cloudera certifications - in addition to generous PTO and employee equity. 


As a Solution Architect on our Big Data Consulting Team, your responsibilities will include:

  • Design, develop, and innovate Hadoop solutions; partner with our internal Infrastructure Architects and Data Engineers to build creative solutions to tough big data problems.
  • Determine the technical project road map, select the best tools, assign tasks and priorities, and assume general project management oversight for performance, data integration, ecosystem integration, and security of big data solutions. Mentor and coach Developers and Data Engineers. Provide guidance with project creation, application structure, automation, code style, testing, and code reviews.
  • Work across a broad range of technologies – from infrastructure to applications – to ensure the ideal Hadoop solution is implemented and optimized.
  • Integrate data from a variety of data sources (data warehouse, data marts) utilizing on-prem or cloud-based data structures (AWS); determine new and existing data sources.
  • Design and implement streaming, data lake, and analytics big data solutions.
  • Create and direct testing strategies including unit, integration, and full end-to-end tests of data pipelines.
  • Select the right storage solution for a project, comparing Kudu, HBase, HDFS, and relational databases based on their strengths.
  • Utilize ETL processes to build data repositories; integrate data into the Hadoop data lake using Sqoop (batch ingest), Kafka (streaming), Spark, Hive or Impala (transformation) (see the transformation sketch after this list).
  • Partner with our Managed Services team to design and install on-prem or cloud-based infrastructure including networking, virtual machines, containers, and software.
  • Determine and select the best tools to ensure optimized data performance; perform data analysis utilizing Spark, Hive, and Impala.
  • Local candidates work between the client site and our office (Minneapolis); remote US candidates must be willing to travel 20% for training and project kick-off.
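As a hedged illustration of the ingest-and-transform bullet above, the PySpark sketch below cleans a raw JSON feed and writes partitioned Parquet so the result stays queryable from Spark, Hive, or Impala. Paths, column names, and the partition scheme are assumptions, not a phData reference design.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import to_date, col

    spark = SparkSession.builder.appName("lake-transform-sketch").getOrCreate()

    # Raw zone, e.g. landed by Kafka or Sqoop (path is a placeholder).
    raw = spark.read.json("hdfs:///data/raw/orders/")

    cleaned = (raw
               .withColumn("order_date", to_date(col("order_ts")))
               .filter(col("amount") > 0)
               .dropDuplicates(["order_id"]))

    # Curated zone: partitioned Parquet readable by Spark, Hive, or Impala.
    (cleaned.write
            .mode("overwrite")
            .partitionBy("order_date")
            .parquet("hdfs:///data/curated/orders/"))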


Technical Leadership Qualifications

  • 5+ years of previous experience as a Software Engineer or Data Engineer, or in Data Analytics
  • Expertise in core Hadoop technologies including HDFS, Hive and YARN
  • Deep experience in one or more ecosystem products/languages such as HBase, Spark, Impala, Solr, Kudu, etc.
  • Expert programming experience in Java, Scala, or other statically typed programming language
  • Ability to learn new technologies in a quickly changing field
  • Strong working knowledge of SQL and the ability to write, debug, and optimize distributed SQL queries
  • Excellent communication skills including proven experience working with key stakeholders and customers

Leadership

  • Ability to translate “big picture” business requirements and use cases into a Hadoop solution, including ingestion of many data sources, ETL processing, data access and consumption, as well as custom analytics
  • Experience scoping activities on large scale, complex technology infrastructure projects
  • Customer relationship management including project escalations, and participating in executive steering meetings
  • Coaching and mentoring data or software engineers

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Analytics Platform Administrator is accountable for managing big data environments on bare metal, container infrastructure, or a cloud platform. This role is responsible for system design, capacity planning, performance tuning, and ongoing monitoring of the data lake environment. As a lead administrator, this role will also manage the day-to-day work of any onshore and offshore contractors on the platforms team. The position reports to the Director of Analytic Platforms and is located in Houston, TX.
Responsibilities May Include
  • Work with IT Operations and Information Security Operations for monitoring, troubleshooting, and support of incidents to maintain service levels
  • Provide 24/7 coverage for analytics platforms
  • Monitor the performance of the systems and ensure high uptime
  • Deploy new and maintain existing data lake environments on Hadoop or AWS/Azure stack
  • Work closely with the various teams to make sure that all the big data applications are highly available and performing as expected. The teams include data science, database, network, BI, application, etc.
  • Work with AICOE and business analysts on designing and running technology proof of concepts on Analytics platforms
  • Capacity planning of the data lake environment
  • Manage and review log files, backup and recovery, upgrades, etc. (a brief log-backup sketch follows this list)
  • Responsible for security management of the platforms
  • Support of our on-premise Hortonworks Hadoop environment
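As one hedged example of the log-management bullet above, the Python sketch below ships rotated HDFS service logs to S3 for retention. The bucket name, key prefix, and log directory are placeholders, and the approach is illustrative only.

    import boto3
    from pathlib import Path

    s3 = boto3.client("s3")
    log_dir = Path("/var/log/hadoop/hdfs")          # placeholder log directory

    # Upload only rotated files (e.g. hadoop-hdfs-namenode.log.1) for retention.
    for log_file in log_dir.glob("*.log.*"):
        key = f"platform-backups/hdfs-logs/{log_file.name}"
        s3.upload_file(str(log_file), "example-analytics-backups", key)
        print(f"uploaded {log_file} -> s3://example-analytics-backups/{key}")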
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of related IT experience
  • 3+ years of Structured Query Language (SQL) experience
  • 1+ years of experience with Hadoop technology stack (HDFS, HBase, Spark, Sqoop, Hive, Ranger, NiFi, etc.)
  • Intermediate proficiency analyzing and understanding business/technology system architectures, databases, and client applications
Preferred
  • Bachelor's Degree in Computer Science, MIS, Information Technology or other related technical discipline
  • 1+ years of experience with AWS or Azure analytics stack
  • 1+ years of experience architecting data warehouses and/or data lakes
  • 1+ years of Oil and Gas Industry Experience
  • Delivery experience with enterprise databases and/or data warehouse platforms such as Oracle, SQL Server or Teradata
  • Automation experience with Python, PowerShell or a similar technology
  • Experience with source control and automated deployment. Useful technologies include Git, Jenkins, and Ansible
  • Experience with complex networking infrastructure including firewalls, VLANs, and load balancers
  • Experience as a DBA or Linux Admin
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to learn new technologies quickly and leverage them in data analytics solutions
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Ability to work as a team player
  • Strong analytical, troubleshooting, and problem-solving skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 28, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 20, 2019, 9:10:39 AM
Hays
  • Toronto, ON, Canada


Major bank looking for a Big Data Engineer to work out of their downtown Toronto office for 6 months plus extension

Big Data Engineer

Client: HSBC
Role: Data Engineer
Duration: 6 months, plus likely extension
Rate: Open *depending on experience
Location: Toronto, ON

Our client, a globally recognized bank, is looking to hire a Data Engineer for a minimum of 6 months, based in Toronto, to join their team.

Your new company
A leading bank with multiple offices across Canada and throughout the world is looking for a Big Data Engineer for a 6-month contract in their Toronto office. They have an excellent reputation within their sector and are known as a market leader.

Your new role
You will be working as a Big Data Engineer as part of the core big data technology and design team. You will be entrusted to develop solutions and design ideas that enable the software to meet the acceptance and success criteria, and you will work with architects and business analysts to build data components in the Big Data environment.

What you'll need to succeed
* 8+ years professional software development experience and at least 4+ years within Big data environment
* 4+ years of programming experience in Java, Scala, and Spark.
* Proficient in SQL and relational database design.
* Agile and DevOps experience - at least 2+ years
* Project planning.
* Must have excellent communication skills + have strong team-working skills
* Experienced in Java, Scala and/or Python, Unix/Linux environment on-premises and in the cloud
* Experienced in construction of robust batch and real-time data processing solutions on Hadoop
* Java development and design using Java 1.7/1.8.
* Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark streaming, YARN, Kafka, Hive, HBase, Presto, Python, ETL frameworks, MapReduce, SQL, RESTful services).
* Sound working knowledge of the Unix/Linux platform
* Hands-on experience building data pipelines using Hadoop components Sqoop, Hive, Pig, Spark, Spark SQL.
* Must have experience developing HiveQL and UDFs for analysing semi-structured/structured datasets (see the UDF sketch after this list)
* Experience with time-series/analytics databases such as Elasticsearch or NoSQL databases.
* Experience with industry standard version control tools (Git, GitHub), automated deployment tools (Ansible & Jenkins) and requirement management in JIRA
* Exposure to Agile Project methodology but also with exposure to other methodologies (such as Kanban)
* Understanding of data modelling techniques using relational and non-relational techniques
* Coordination between global teams
* Experience debugging code issues and publishing the highlighted differences to the development team/architects
* Nice to have: ELK experience; knowledge of cloud computing technology such as Google Cloud Platform (GCP)
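The UDF requirement above could be exercised in several ways; the sketch below uses a PySpark UDF over semi-structured JSON rather than a Java Hive UDF, purely as an illustration. The JSON layout, paths, and business rule are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf, get_json_object
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("udf-sketch").getOrCreate()

    @udf(returnType=StringType())
    def normalise_channel(value):
        # Hypothetical business rule: lowercase and default missing values.
        return (value or "unknown").strip().lower()

    # One JSON document per line in the raw zone (path is a placeholder).
    raw = spark.read.text("hdfs:///data/raw/events/")
    out = raw.select(
        get_json_object("value", "$.user_id").alias("user_id"),
        normalise_channel(get_json_object("value", "$.channel")).alias("channel"),
    )
    out.groupBy("channel").count().show()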
What you'll get in return


The client is offering a 6 month engagement, with a high likelihood of extension and a very competitive rate for the contract.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Complete end to end delivery of data analytics solutions to the end user
  • Interacting closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gather business and technical specifications to support analytic, report and database development
  • Collect, analyze and translate user requirements into effective solutions
  • Build report and analytic prototypes based on initial business requirements
  • Provide status on the issues and progress of key business projects
  • Providing regular reporting on the performance of data analytics solutions
  • Delivering regular updates and maintenance on data analytics solutions
  • Championing the data analytics solutions and technologies at ConocoPhillips
  • Integrate data for data models used by the customers
  • Deliver data visualizations used for data-driven decision making (a brief visualization sketch follows this list)
  • Provide strategic technology direction while supporting the needs of the business
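As a small, hedged illustration of the visualization bullet above, the Python sketch below charts example production figures with pandas and matplotlib. The data is inlined and the well names are made up; in practice the inputs would come from SQL sources or the analytics tools listed in the qualifications.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Example data only; real inputs would come from SQL or an analytics tool.
    production = pd.DataFrame({
        "well": ["A-1", "A-2", "B-1", "B-2"],
        "boe_per_day": [1250, 980, 1430, 760],
    })

    ax = production.plot.bar(x="well", y="boe_per_day", legend=False)
    ax.set_ylabel("BOE per day")
    ax.set_title("Daily production by well (example data)")
    plt.tight_layout()
    plt.savefig("production_by_well.png")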
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of related IT experience
  • 5+ years of Structured Query Language experience (ANSI SQL, T-SQL, PL/SQL)
  • 3+ years of hands-on experience delivering solutions with analytics tools (e.g., Spotfire, SSRS, Power BI, Tableau, Business Objects)
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years of Oil and Gas Industry experience
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions-oriented individual who learns quickly, understands complex problems, and applies useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience in analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 20, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 13, 2019, 4:56:49 PM
phData, Inc.
  • Minneapolis, MN

Title: Big Data Solutions Architect (Minneapolis or US Remote)


Join the Game-Changers in Big Data  


Are you inspired by innovation, hard work and a passion for data?    


If so, this may be the ideal opportunity to leverage your background in big data and your software engineering, data engineering or data analytics experience to design, develop and innovate big data solutions for a diverse set of clients.


As a Solution Architect on our Big Data Consulting team, your responsibilities include:


    • Design, develop, and innovate Big Data solutions; partner with our internal Managed Services Architects and Data Engineers to build creative solutions to solve tough big data problems.
    • Determine the project road map, select the best tools, assign tasks and priorities, and assume general project management oversight for performance, data integration, ecosystem integration, and security of big data solutions
    • Work across a broad range of technologies from infrastructure to applications to ensure the ideal Big Data solution is implemented and optimized
    • Integrate data from a variety of data sources (data warehouse, data marts) utilizing on-prem or cloud-based data structures (AWS); determine new and existing data sources
    • Design and implement streaming, data lake, and analytics big data solutions

    • Create and direct testing strategies including unit, integration, and full end-to-end tests of data pipelines (a brief test sketch follows this list)

    • Select the right storage solution for a project - comparing Kudu, HBase, HDFS, and relational databases based on their strengths

    • Utilize ETL processes to build data repositories; integrate data into Hadoop data lake using Sqoop (batch ingest), Kafka (streaming), Spark, Hive or Impala (transformation)

    • Partner with our Managed Services team to design and install on prem or cloud based infrastructure including networking, virtual machines, containers, and software

    • Determine and select best tools to ensure optimized data performance; perform Data Analysis utilizing Spark, Hive, and Impala

    • Mentor and coach Developers and Data Engineers. Provide guidance with project creation, application structure, automation, code style, testing, and code reviews
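As a minimal sketch of the testing bullet above, the snippet below unit-tests a single (hypothetical) PySpark transformation with pytest against a local Spark session. It illustrates the shape of such tests rather than any specific phData pipeline.

    import pytest
    from pyspark.sql import SparkSession

    def dedupe_orders(df):
        # Hypothetical transformation under test.
        return df.dropDuplicates(["order_id"])

    @pytest.fixture(scope="module")
    def spark():
        session = (SparkSession.builder
                   .master("local[1]")
                   .appName("pipeline-tests")
                   .getOrCreate())
        yield session
        session.stop()

    def test_dedupe_orders_keeps_one_row_per_id(spark):
        df = spark.createDataFrame(
            [(1, "new"), (1, "dup"), (2, "new")], ["order_id", "status"]
        )
        assert dedupe_orders(df).count() == 2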

Qualifications

  • 5+ years of previous experience as a Software Engineer or Data Engineer, or in Data Analytics, combined with expertise in Hadoop technologies and Java programming
  • Technical Leadership experience leading/mentoring junior software/data engineers, as well as scoping activities on large scale, complex technology projects
  • Expertise in core Hadoop technologies including HDFS, Hive and YARN.  
  • Deep experience in one or more ecosystem products/languages such as HBase, Spark, Impala, Solr, Kudu, etc
  • Expert programming experience in Java, Scala, or other statically typed programming language
  • Strong working knowledge of SQL and the ability to write, debug, and optimize distributed SQL queries
  • Excellent communication skills including proven experience working with key stakeholders and customers
  • Ability to translate big picture business requirements and use cases into a Hadoop solution, including ingestion of many data sources, ETL processing, data access and consumption, as well as custom analytics
  • Customer relationship management including project escalations, and participating in executive steering meetings
  • Ability to learn new technologies in a quickly changing field
phData, Inc.
  • Minneapolis, MN

Title: Big Data Solutions Architect (Minneapolis or US Remote)


Join the Game-Changers in Big Data  


Are you inspired by innovation, hard work and a passion for data?    


If so, this may be the ideal opportunity to leverage your background in Big Data and Software Engineering, Data Engineering or Data Analytics experience to design, develop and innovate big data solutions for a diverse set of clients.  


As a Solution Architect on our Big Data Consulting team, your responsibilities include:


    • Design, develop, and innovative Big Data solutions; partner with our internal Managed Services Architects and Data Engineers to build creative solutions to solve tough big data problems.  
    • Determine the project road map, select the best tools, assign tasks and priorities, and assume general project management oversight for performance, data integration, ecosystem integration, and security of big data solutions
    • Work across a broad range of technologies from infrastructure to applications to ensure the ideal Big Data solution is implemented and optimized
    • Integrate data from a variety of data sources (data warehouse, data marts) utilizing on-prem or cloud-based data structures (AWS); determine new and existing data sources
    • Design and implement streaming, data lake, and analytics big data solutions

    • Create and direct testing strategies including unit, integration, and full end-to-end tests of data pipelines

    • Select the right storage solution for a project - comparing Kudu, HBase, HDFS, and relational databases based on their strengths

    • Utilize ETL processes to build data repositories; integrate data into Hadoop data lake using Sqoop (batch ingest), Kafka (streaming), Spark, Hive or Impala (transformation)

    • Partner with our Managed Services team to design and install on prem or cloud based infrastructure including networking, virtual machines, containers, and software

    • Determine and select best tools to ensure optimized data performance; perform Data Analysis utilizing Spark, Hive, and Impala

    • Mentor and coach Developers and Data Engineers. Provide guidance with project creation, application structure, automation, code style, testing, and code reviews

Qualifications

  • 5+ years previous experience as a Software Engineer, Data Engineer or Data Analytics - combined with an expertise in Hadoop Technologies and Java programming
  • Technical Leadership experience leading/mentoring junior software/data engineers, as well as scoping activities on large scale, complex technology projects
  • Expertise in core Hadoop technologies including HDFS, Hive and YARN.  
  • Deep experience in one or more ecosystem products/languages such as HBase, Spark, Impala, Solr, Kudu, etc
  • Expert programming experience in Java, Scala, or other statically typed programming language
  • Strong working knowledge of SQL and the ability to write, debug, and optimize distributed SQL queries
  • Excellent communication skills including proven experience working with key stakeholders and customers
  • Ability to translate big picture business requirements and use cases into a Hadoop solution, including ingestion of many data sources, ETL processing, data access and consumption, as well as custom analytics
  • Customer relationship management including project escalations, and participating in executive steering meetings
  • Ability to learn new technologies in a quickly changing field
phData, Inc.
  • Minneapolis, MN

Title: Big Data Solutions Architect (Minneapolis or US Remote)


Join the Game-Changers in Big Data  


Are you inspired by innovation, hard work and a passion for data?    


If so, this may be the ideal opportunity to leverage your background in Big Data and Software Engineering, Data Engineering or Data Analytics experience to design, develop and innovate big data solutions for a diverse set of clients.  


As a Solution Architect on our Big Data Consulting team, your responsibilities include:


    • Design, develop, and innovate Big Data solutions; partner with our internal Managed Services Architects and Data Engineers to build creative solutions to solve tough big data problems.
    • Determine the project road map, select the best tools, assign tasks and priorities, and assume general project management oversight for performance, data integration, ecosystem integration, and security of big data solutions
    • Work across a broad range of technologies from infrastructure to applications to ensure the ideal Big Data solution is implemented and optimized
    • Integrate data from a variety of data sources (data warehouse, data marts) utilizing on-prem or cloud-based data structures (AWS); determine new and existing data sources
    • Design and implement streaming, data lake, and analytics big data solutions

    • Create and direct testing strategies including unit, integration, and full end-to-end tests of data pipelines

    • Select the right storage solution for a project - comparing Kudu, HBase, HDFS, and relational databases based on their strengths (see the sketch following this list)

    • Utilize ETL processes to build data repositories; integrate data into Hadoop data lake using Sqoop (batch ingest), Kafka (streaming), Spark, Hive or Impala (transformation)

    • Partner with our Managed Services team to design and install on prem or cloud based infrastructure including networking, virtual machines, containers, and software

    • Determine and select best tools to ensure optimized data performance; perform Data Analysis utilizing Spark, Hive, and Impala

    • Mentor and coach Developers and Data Engineers. Provide guidance with project creation, application structure, automation, code style, testing, and code reviews
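As a companion to the storage-selection bullet above, the sketch below routes one curated dataset to two stores by access pattern: immutable, partitioned Parquet on HDFS for scan-heavy analytics, and a serving store for low-latency lookups. The paths and the JDBC endpoint are placeholder assumptions, and Kudu or HBase would be targeted the same way through their own Spark connectors rather than the JDBC sink shown here.

    // Sketch of choosing storage by access pattern; names and endpoints are placeholders.
    import org.apache.spark.sql.{SaveMode, SparkSession}

    object StorageSelectionSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("storage-selection-sketch").getOrCreate()
        val curated = spark.read.parquet("hdfs:///curated/orders/")

        // Scan-heavy analytics: partitioned Parquet on HDFS.
        curated.write
          .mode(SaveMode.Overwrite)
          .partitionBy("order_date")
          .parquet("hdfs:///analytics/orders/")

        // Low-latency point lookups / upserts: a relational (or Kudu/HBase) serving store.
        curated.write
          .mode(SaveMode.Append)
          .format("jdbc")
          .option("url", "jdbc:postgresql://db-host:5432/orders")   // placeholder endpoint
          .option("dbtable", "public.orders_serving")
          .option("user", "etl_user")
          .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
          .save()

        spark.stop()
      }
    }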

Qualifications

  • 5+ years of previous experience as a Software Engineer, Data Engineer, or Data Analyst, combined with expertise in Hadoop technologies and Java programming
  • Technical Leadership experience leading/mentoring junior software/data engineers, as well as scoping activities on large scale, complex technology projects
  • Expertise in core Hadoop technologies including HDFS, Hive and YARN.  
  • Deep experience in one or more ecosystem products/languages such as HBase, Spark, Impala, Solr, Kudu, etc
  • Expert programming experience in Java, Scala, or other statically typed programming language
  • Strong working knowledge of SQL and the ability to write, debug, and optimize distributed SQL queries
  • Excellent communication skills including proven experience working with key stakeholders and customers
  • Ability to translate big picture business requirements and use cases into a Hadoop solution, including ingestion of many data sources, ETL processing, data access and consumption, as well as custom analytics
  • Customer relationship management including project escalations, and participating in executive steering meetings
  • Ability to learn new technologies in a quickly changing field
Perficient, Inc.
  • Dallas, TX
At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
Perficient currently has a career opportunity for a Senior MapR Solutions Architect.
Job Overview
One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics.
A Senior Solutions Architect is expected to be knowledgeable in two or more technologies within a given Solutions/Practice area. The Solutions Architect may or may not have a programming background, but will have expert infrastructure architecture, client presales/presentation, team management, and thought leadership skills.
You will provide best-fit architectural solutions for one or more projects; you will assist in defining scope and sizing of work; and anchor Proof of Concept developments. You will provide solution architecture for the business problem, platform integration with third party services, designing and developing complex features for clients' business needs. You will collaborate with some of the best talent in the industry to create and implement innovative high quality solutions, participate in Sales and various pursuits focused on our clients' business needs.
You will also contribute in a variety of roles in thought leadership, mentorship, systems analysis, architecture, design, configuration, testing, debugging, and documentation. You will challenge your leading edge solutions, consultative and business skills through the diversity of work in multiple industry domains. This role is considered part of the Business Unit Senior Leadership team and may mentor junior architects and other delivery team members.
Responsibilities
  • Provide vision and leadership to define the core technologies necessary to meet client needs including: development tools and methodologies, package solutions, systems architecture, security techniques, and emerging technologies
  • Hands-on architect with very strong MapR, HBase, and Hive skills
  • Ability to architect and design end-to-end data architecture (ingestion to the semantic layer); identify the best ways to export data to the reporting/analytics layer
  • Recommend best practices and approaches for distributed architecture (does not have to be MapR-specific)
  • Most recent project/job should have been as the architect of an end-to-end Big Data implementation that is deployed
  • Able to articulate best practices for building frameworks for the data layer (ingestion, curation), aggregation layer, and reporting layer (see the sketch after this list)
  • Understand and articulate DW principles on the Hadoop landscape (not just the data lake)
  • Performed data model design based on HBase and Hive
  • Background in database design for DW on RDBMS is preferred
  • Ability to review the end-to-end design and suggest physical design remediation on Hadoop
  • Ability to design solutions for different use cases
  • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
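To illustrate the data / aggregation / reporting layering referenced in the list above, here is a minimal sketch of Hive DDL issued through Spark. The database, table, and path names (data_layer.orders, agg_layer.daily_order_totals) are hypothetical, not the client's actual model.

    // Sketch of a layered Hive design on Hadoop; all names and locations are placeholders.
    import org.apache.spark.sql.SparkSession

    object HiveLayeringSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-layering-sketch")
          .enableHiveSupport()
          .getOrCreate()

        spark.sql("CREATE DATABASE IF NOT EXISTS data_layer")
        spark.sql("CREATE DATABASE IF NOT EXISTS agg_layer")

        // Data layer: external table over ingested Parquet files, partitioned by load date.
        spark.sql(
          """CREATE EXTERNAL TABLE IF NOT EXISTS data_layer.orders (
            |  order_id STRING, customer_id STRING, amount DOUBLE)
            |PARTITIONED BY (load_date STRING)
            |STORED AS PARQUET
            |LOCATION 'hdfs:///lake/data/orders'""".stripMargin)

        // Aggregation layer: pre-aggregated table that the reporting layer queries.
        spark.sql(
          """CREATE TABLE agg_layer.daily_order_totals
            |STORED AS PARQUET AS
            |SELECT load_date, customer_id, SUM(amount) AS total_amount
            |FROM data_layer.orders
            |GROUP BY load_date, customer_id""".stripMargin)

        spark.stop()
      }
    }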
Qualifications
  • Apache framework (Kafka, Spark, Hive, HBase)
  • MapR or similar distribution (optional)
  • Java
  • Data formats (Parquet, Avro, JSON, XML, etc.)
  • Microservices
Responsibilities
  • 10+ years of experience designing, architecting, and implementing large-scale data processing, data storage, and data distribution systems
  • 3+ years of experience working on large projects, including the most recent project on the MapR platform
  • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
  • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
  • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
  • Experience with end-to-end solution architecture for data capabilities, including:
  • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
  • Ability to produce high-quality work products under pressure and within deadlines, with specific references
  • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
  • 5+ years working in large multi-vendor environments with multiple teams as part of a project
  • 5+ years working with complex Big Data environments
  • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
Preferred Skills And Education
Master's degree in Computer Science or related field
Certification in Azure platform
Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
More About Perficient
Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
Select work authorization questions to ask when applicants apply
  • Are you legally authorized to work in the United States?
  • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
Perficient, Inc.
  • Houston, TX
At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
Perficient currently has a career opportunity for a Senior MapR Solutions Architect.
Job Overview
One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics.
A Senior Solutions Architect is expected to be knowledgeable in two or more technologies within a given Solutions/Practice area. The Solutions Architect may or may not have a programming background, but will have expert infrastructure architecture, client presales/presentation, team management, and thought leadership skills.
You will provide best-fit architectural solutions for one or more projects; you will assist in defining scope and sizing of work; and anchor Proof of Concept developments. You will provide solution architecture for the business problem, platform integration with third party services, designing and developing complex features for clients' business needs. You will collaborate with some of the best talent in the industry to create and implement innovative high quality solutions, participate in Sales and various pursuits focused on our clients' business needs.
You will also contribute in a variety of roles in thought leadership, mentorship, systems analysis, architecture, design, configuration, testing, debugging, and documentation. You will challenge your leading edge solutions, consultative and business skills through the diversity of work in multiple industry domains. This role is considered part of the Business Unit Senior Leadership team and may mentor junior architects and other delivery team members.
Responsibilities
  • Provide vision and leadership to define the core technologies necessary to meet client needs including: development tools and methodologies, package solutions, systems architecture, security techniques, and emerging technologies
  • Hands-on architect with very strong MapR, HBase, and Hive skills
  • Ability to architect and design end-to-end data architecture (ingestion to the semantic layer); identify the best ways to export data to the reporting/analytics layer
  • Recommend best practices and approaches for distributed architecture (does not have to be MapR-specific)
  • Most recent project/job should have been as the architect of an end-to-end Big Data implementation that is deployed
  • Able to articulate best practices for building frameworks for the data layer (ingestion, curation), aggregation layer, and reporting layer
  • Understand and articulate DW principles on the Hadoop landscape (not just the data lake)
  • Performed data model design based on HBase and Hive (see the sketch after this list)
  • Background in database design for DW on RDBMS is preferred
  • Ability to review the end-to-end design and suggest physical design remediation on Hadoop
  • Ability to design solutions for different use cases
  • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
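The sketch below makes the HBase/Hive data-model bullet above concrete: a composite row key (customer id plus reversed timestamp) so one customer's most recent orders cluster together for fast prefix scans. The table and column-family names are hypothetical.

    // Sketch of an HBase row-key design choice; table and column family are placeholders.
    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
    import org.apache.hadoop.hbase.util.Bytes

    object HBaseModelSketch {
      def main(args: Array[String]): Unit = {
        val conf = HBaseConfiguration.create()        // picks up hbase-site.xml from the classpath
        val connection = ConnectionFactory.createConnection(conf)
        try {
          val table = connection.getTable(TableName.valueOf("orders_by_customer"))

          val customerId = "C12345"
          val orderTs    = 1546300800000L             // example epoch millis
          // Reversed timestamp sorts newest rows first within a customer's prefix.
          val rowKey = s"$customerId#${Long.MaxValue - orderTs}"

          val put = new Put(Bytes.toBytes(rowKey))
          put.addColumn(Bytes.toBytes("o"), Bytes.toBytes("order_id"), Bytes.toBytes("O-9876"))
          put.addColumn(Bytes.toBytes("o"), Bytes.toBytes("amount"), Bytes.toBytes("199.99"))
          table.put(put)
          table.close()
        } finally {
          connection.close()
        }
      }
    }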
Qualifications
  • Apache framework (Kafka, Spark, Hive, HBase)
  • MapR or similar distribution (optional)
  • Java
  • Data formats (Parquet, Avro, JSON, XML, etc.)
  • Microservices
Responsibilities
  • 10+ years of experience designing, architecting, and implementing large-scale data processing, data storage, and data distribution systems
  • 3+ years of experience working on large projects, including the most recent project on the MapR platform
  • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
  • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
  • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
  • Experience with end-to-end solution architecture for data capabilities, including:
  • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
  • Ability to produce high-quality work products under pressure and within deadlines, with specific references
  • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
  • 5+ years working in large multi-vendor environments with multiple teams as part of a project
  • 5+ years working with complex Big Data environments
  • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
Preferred Skills And Education
Master's degree in Computer Science or related field
Certification in Azure platform
Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
More About Perficient
Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
Select work authorization questions to ask when applicants apply
  • Are you legally authorized to work in the United States?
  • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
Perficient, Inc.
  • Dallas, TX
At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
Perficient currently has a career opportunity for a Big Data Engineer (Microservices Developer).
Job Overview
One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics. As the lead developer, you will be responsible for microservices development.
Responsibilities
  • Worked on containerized solutions (Kubernetes, Spring Boot, and Docker)
  • Provide end-to-end vision and hands-on experience with the MapR Platform, especially best practices around Hive and HBase
  • Should be a Rockstar in HBase and Hive best practices
  • Ability to focus on frameworks for DevOps, ingestion, and reading/writing into HDFS
  • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
  • Translate, load, and present disparate data sets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data (see the sketch following this list)
  • Lead workshops with many teams to define data ingestion, validation, transformation, data engineering, and data modeling
  • Performance-tune Hive and HBase jobs with a focus on ingestion
  • Design and develop open source platform components using Spark, Sqoop, Java, Oozie, Kafka, Python, and other components
  • Lead the technical planning and requirements gathering phases, including estimating, developing, testing, managing, architecting, and delivering complex projects
  • Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates
  • Contribute to thought capital through the creation of executive presentations and architecture documents, and articulate them to executives through presentations
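As referenced in the Kafka-queues bullet above, here is a minimal Scala 2.13 sketch of consuming JSON order events from a Kafka topic ahead of translation and loading. The broker address, topic, and group id are placeholder assumptions.

    // Sketch of pulling raw events off a Kafka queue; connection details are placeholders.
    import java.time.Duration
    import java.util.{Collections, Properties}
    import scala.jdk.CollectionConverters._
    import org.apache.kafka.clients.consumer.KafkaConsumer

    object KafkaIngestSketch {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "broker1:9092")
        props.put("group.id", "order-ingest-sketch")
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("auto.offset.reset", "earliest")

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(Collections.singletonList("orders"))

        try {
          // Poll a few batches and hand each JSON payload to downstream translation/loading.
          for (_ <- 1 to 10) {
            val records = consumer.poll(Duration.ofSeconds(1)).asScala
            records.foreach(r => println(s"offset=${r.offset()} payload=${r.value()}"))
          }
        } finally {
          consumer.close()
        }
      }
    }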
Qualifications
    • Spring, Spring Boot, Docker, Hibernate, JPA, Pivotal, Kafka, NoSQL, Hadoop, and containers
    • 3+ years of experience working on large projects, including the most recent project on the MapR platform
    • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
    • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
    • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
    • Experience with end-to-end solution architecture for data capabilities, including:
    • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
    • Ability to produce high-quality work products under pressure and within deadlines, with specific references
    • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
    • 5+ years working in large multi-vendor environments with multiple teams as part of a project
    • 5+ years working with complex Big Data environments
    • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
    Preferred Skills And Education
    Master's degree in Computer Science or related field
    Certification in Azure platform
    Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
    More About Perficient
    Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
    Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
    Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
    Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
    Select work authorization questions to ask when applicants apply
    • Are you legally authorized to work in the United States?
    • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
    Perficient, Inc.
    • San Diego, CA
    At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
    We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
    Perficient currently has a career opportunity for a Senior MapR Solutions Architect.
    Job Overview
    One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics.
    A Senior Solutions Architect is expected to be knowledgeable in two or more technologies within a given Solutions/Practice area. The Solutions Architect may or may not have a programming background, but will have expert infrastructure architecture, client presales/presentation, team management, and thought leadership skills.
    You will provide best-fit architectural solutions for one or more projects; you will assist in defining scope and sizing of work; and anchor Proof of Concept developments. You will provide solution architecture for the business problem, platform integration with third party services, designing and developing complex features for clients' business needs. You will collaborate with some of the best talent in the industry to create and implement innovative high quality solutions, participate in Sales and various pursuits focused on our clients' business needs.
    You will also contribute in a variety of roles in thought leadership, mentorship, systems analysis, architecture, design, configuration, testing, debugging, and documentation. You will challenge your leading edge solutions, consultative and business skills through the diversity of work in multiple industry domains. This role is considered part of the Business Unit Senior Leadership team and may mentor junior architects and other delivery team members.
    Responsibilities
    • Provide vision and leadership to define the core technologies necessary to meet client needs including: development tools and methodologies, package solutions, systems architecture, security techniques, and emerging technologies
    • Hands-on architect with very strong MapR, HBase, and Hive skills
    • Ability to architect and design end-to-end data architecture (ingestion to the semantic layer); identify the best ways to export data to the reporting/analytics layer
    • Recommend best practices and approaches for distributed architecture (does not have to be MapR-specific)
    • Most recent project/job should have been as the architect of an end-to-end Big Data implementation that is deployed
    • Able to articulate best practices for building frameworks for the data layer (ingestion, curation), aggregation layer, and reporting layer
    • Understand and articulate DW principles on the Hadoop landscape (not just the data lake)
    • Performed data model design based on HBase and Hive
    • Background in database design for DW on RDBMS is preferred
    • Ability to review the end-to-end design and suggest physical design remediation on Hadoop
    • Ability to design solutions for different use cases
    • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
    Qualifications
    • Apache framework (Kafka, Spark, Hive, HBase)
    • MapR or similar distribution (optional)
    • Java
    • Data formats (Parquet, Avro, JSON, XML, etc.)
    • Microservices
    Responsibilities
    • 10+ years of experience designing, architecting, and implementing large-scale data processing, data storage, and data distribution systems
    • 3+ years of experience working on large projects, including the most recent project on the MapR platform
    • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
    • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
    • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
    • Experience with end-to-end solution architecture for data capabilities, including:
    • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
    • Ability to produce high-quality work products under pressure and within deadlines, with specific references
    • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
    • 5+ years working in large multi-vendor environments with multiple teams as part of a project
    • 5+ years working with complex Big Data environments
    • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
    Preferred Skills And Education
    Master's degree in Computer Science or related field
    Certification in Azure platform
    Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
    More About Perficient
    Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
    Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
    Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
    Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
    Select work authorization questions to ask when applicants apply
    • Are you legally authorized to work in the United States?
    • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
    Perficient, Inc.
    • Houston, TX
    At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
    We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
    Perficient currently has a career opportunity for a MapR Spark Developer.
    Job Overview
    One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics.
    As a data developer, you will be in charge of developing data pipelines using Spark and other ingestion frameworks.
    Responsibilities
      • Provide end-to-end vision and hands-on experience with the MapR Platform, especially best practices around Hive and HBase
      • Should be a Rockstar in HBase and Hive best practices
      • Ability to focus on ingestion and transformation
      • Understand MDM and think about MDM in HDFS
      • Ability to design solutions for different use cases
      • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
      • Troubleshoot and develop on Hadoop technologies including HDFS, Kafka, Hive, Pig, Flume, HBase, Spark, and Impala, plus Hadoop ETL development via tools such as ODI for Big Data and APIs to extract data from source systems
      • Translate, load, and present disparate data sets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data (a streaming-ingestion sketch follows this list)
      • Lead workshops with many teams to define data ingestion, validation, transformation, data engineering, and data modeling
      • Performance-tune Hive and HBase jobs with a focus on ingestion
      • Design and develop open source platform components using Spark, Sqoop, Java, Oozie, Kafka, Python, and other components
      • Lead the technical planning and requirements gathering phases, including estimating, developing, testing, managing, architecting, and delivering complex projects
      • Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates
      • Contribute to thought capital through the creation of executive presentations and architecture documents, and articulate them to executives through presentations
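    The streaming-ingestion sketch referenced in the list above: a minimal Spark Structured Streaming job that lands a Kafka topic as Parquet on the data lake. Broker, topic, and path names are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath.

      // Sketch of Kafka -> data-lake streaming ingestion; all names are placeholders.
      import org.apache.spark.sql.SparkSession

      object StreamingIngestSketch {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder().appName("streaming-ingest-sketch").getOrCreate()

          val events = spark.readStream
            .format("kafka")
            .option("kafka.bootstrap.servers", "broker1:9092")
            .option("subscribe", "orders")
            .option("startingOffsets", "latest")
            .load()
            .selectExpr("CAST(key AS STRING) AS order_key", "CAST(value AS STRING) AS payload")

          val query = events.writeStream
            .format("parquet")
            .option("path", "hdfs:///lake/streaming/orders/")
            .option("checkpointLocation", "hdfs:///checkpoints/orders/")
            .start()

          query.awaitTermination()
        }
      }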
    Qualifications
    • 3+ years of experience working on large projects, including the most recent project on the MapR platform
    • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
    • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
    • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
    • Experience with end-to-end solution architecture for data capabilities, including:
    • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
    • Ability to produce high-quality work products under pressure and within deadlines, with specific references
    • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
    • 5+ years working in large multi-vendor environments with multiple teams as part of a project
    • 5+ years working with complex Big Data environments
    • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
    Preferred Skills And Education
    Master's degree in Computer Science or related field
    Certification in Azure platform
    Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
    More About Perficient
    Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
    Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
    Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
    Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
    Select work authorization questions to ask when applicants apply
    • Are you legally authorized to work in the United States?
    • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
    Perficient, Inc.
    • Philadelphia, PA
    At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
    We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
    Perficient currently has a career opportunity for a Senior MapR Solutions Architect.
    Job Overview
    One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics.
    A Senior Solutions Architect is expected to be knowledgeable in two or more technologies within a given Solutions/Practice area. The Solutions Architect may or may not have a programming background, but will have expert infrastructure architecture, client presales/presentation, team management, and thought leadership skills.
    You will provide best-fit architectural solutions for one or more projects; you will assist in defining scope and sizing of work; and anchor Proof of Concept developments. You will provide solution architecture for the business problem, platform integration with third party services, designing and developing complex features for clients' business needs. You will collaborate with some of the best talent in the industry to create and implement innovative high quality solutions, participate in Sales and various pursuits focused on our clients' business needs.
    You will also contribute in a variety of roles in thought leadership, mentorship, systems analysis, architecture, design, configuration, testing, debugging, and documentation. You will challenge your leading edge solutions, consultative and business skills through the diversity of work in multiple industry domains. This role is considered part of the Business Unit Senior Leadership team and may mentor junior architects and other delivery team members.
    Responsibilities
    • Provide vision and leadership to define the core technologies necessary to meet client needs including: development tools and methodologies, package solutions, systems architecture, security techniques, and emerging technologies
    • Hands-on architect with very strong MapR, HBase, and Hive skills
    • Ability to architect and design end-to-end data architecture (ingestion to the semantic layer); identify the best ways to export data to the reporting/analytics layer
    • Recommend best practices and approaches for distributed architecture (does not have to be MapR-specific)
    • Most recent project/job should have been as the architect of an end-to-end Big Data implementation that is deployed
    • Able to articulate best practices for building frameworks for the data layer (ingestion, curation), aggregation layer, and reporting layer
    • Understand and articulate DW principles on the Hadoop landscape (not just the data lake)
    • Performed data model design based on HBase and Hive
    • Background in database design for DW on RDBMS is preferred
    • Ability to review the end-to-end design and suggest physical design remediation on Hadoop
    • Ability to design solutions for different use cases
    • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
    Qualifications
    • Apache framework (Kafka, Spark, Hive, HBase)
    • MapR or similar distribution (optional)
    • Java
    • Data formats (Parquet, Avro, JSON, XML, etc.)
    • Microservices
    Responsibilities
    • 10+ years of experience designing, architecting, and implementing large-scale data processing, data storage, and data distribution systems
    • 3+ years of experience working on large projects, including the most recent project on the MapR platform
    • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
    • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
    • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
    • Experience with end-to-end solution architecture for data capabilities, including:
    • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
    • Ability to produce high-quality work products under pressure and within deadlines, with specific references
    • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
    • 5+ years working in large multi-vendor environments with multiple teams as part of a project
    • 5+ years working with complex Big Data environments
    • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
    Preferred Skills And Education
    Master's degree in Computer Science or related field
    Certification in Azure platform
    Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
    More About Perficient
    Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
    Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
    Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
    Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
    Select work authorization questions to ask when applicants apply
    • Are you legally authorized to work in the United States?
    • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
    Perficient, Inc.
    • Salt Lake City, UT
    At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.
    We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.
    Perficient currently has a career opportunity for a Big Data Engineer (Microservices Developer).
    Job Overview
    One of our large clients has made a strategic decision to move all order management and sales data from their existing EDW into the MapR platform. The focus is fast ingestion and streaming analytics. This is a multiyear roadmap with many components that will piece into a larger Data Management Platform. A Perficient subject matter expert will work with the client team to move this data into the new environment in a fashion that meets requirements for applications and analytics. As the lead developer, you will be responsible for microservices development.
    Responsibilities
    • Worked on containerized solutions (Kubernetes, Spring Boot, and Docker)
    • Provide end-to-end vision and hands-on experience with the MapR Platform, especially best practices around Hive and HBase
    • Should be a Rockstar in HBase and Hive best practices
    • Ability to focus on frameworks for DevOps, ingestion, and reading/writing into HDFS
    • Worked with different data formats (Parquet, Avro, JSON, XML, etc.)
    • Translate, load, and present disparate data sets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data
    • Lead workshops with many teams to define data ingestion, validation, transformation, data engineering, and data modeling
    • Performance-tune Hive and HBase jobs with a focus on ingestion
    • Design and develop open source platform components using Spark, Sqoop, Java, Oozie, Kafka, Python, and other components
    • Lead the technical planning and requirements gathering phases, including estimating, developing, testing, managing, architecting, and delivering complex projects
    • Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates
    • Contribute to thought capital through the creation of executive presentations and architecture documents, and articulate them to executives through presentations
    Qualifications
    • Spring, Spring Boot, Docker, Hibernate, JPA, Pivotal, Kafka, NoSQL, Hadoop, and containers
    • 3+ years of experience working on large projects, including the most recent project on the MapR platform
    • 5+ years of hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
    • Experience designing service management, orchestration, monitoring, and management requirements for cloud platforms
    • Hands-on experience with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase (at least 2 years)
    • Experience with end-to-end solution architecture for data capabilities, including:
    • Experience with ELT/ETL development, patterns, and tooling (Informatica, Talend)
    • Ability to produce high-quality work products under pressure and within deadlines, with specific references
    • Very strong communication, solutioning, and client-facing skills, especially with non-technical business users
    • 5+ years working in large multi-vendor environments with multiple teams as part of a project
    • 5+ years working with complex Big Data environments
    • 5+ years of experience with Team Foundation Server/JIRA/GitHub and other code management toolsets
    Preferred Skills And Education
    Master's degree in Computer Science or related field
    Certification in Azure platform
    Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.
    More About Perficient
    Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution and value with outstanding digital experience, business optimization and industry solutions.
    Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index.
    Perficient is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
    Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.
    Select work authorization questions to ask when applicants apply
    • Are you legally authorized to work in the United States?
    • Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?