OnlyDataJobs.com

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The purpose of this role is to enable and support Citizen Data Scientists (CDS) in developing analytical workflows, and to manage the adoption and implementation of the latest innovations within ConocoPhillips' preferred analytics tools for Citizen Data Science.
This position will enable analytics tools and solutions for customers, including facilitation of the solution roadmap, adoption of new analytics functionality, integration between applications based on value-driven workflows, and support and training of users on new capabilities.
Responsibilities May Include
  • Work with customers to enable the latest data analytics capabilities
  • Understand and help implement the latest innovations available within ConocoPhillips' preferred analytics platforms, including Spotfire, Statistica, ArcGIS Big Data (Spatial Analytics), Teradata and Python
  • Help users with the implementation of analytics workflows through integration of the analytics applications
  • Manage analytics solutions roadmap and implementation timeline enabling geoscience customers to take advantage of the latest features or new functionality
  • Communicate with vendors and COP community on analytics technology functionality upgrades, prioritized enhancements and adoption
  • Test and verify that existing analytics workflows are supported within the latest version of the technology
  • Guide users on how to enhance their current workflows with the latest analytics technology
  • Facilitate problem solving with analytics solutions
  • Work with other AICOE teams to validate and implement new technology or version upgrades into production
Specific Responsibilities May Include
    • Provide architectural guidance for building integrated analytical solutions
    • Understand analytics product roadmaps, product development and the implementation of new features
    • Promote new analytics product features within the customer base and demonstrate how they enable analytics workflows
    • Manage the COP analytics product adoption roadmap
    • Capture the product enhancement list and coordinate prioritization with the vendor
    • Test new capabilities and map them to COP business workflows
    • Coordinate with the AICOE team the timely upgrade of new features
    • Provide support to CDS for:
    • analytics modelling best practices
    • implementation of analytics workflows based on new technology
  • Liaise with the AICOE Infrastructure team for timely technology upgrades
  • Work on day-to-day end-user support activities for Citizen Data Science tools: Advanced Spotfire, Statistica, GIS Big Data
  • Provide technical consulting and guidance to Citizen Data Scientists on the design and development of complex analytics workflows
  • Communicate the analytics technology roadmap to end users
  • Communicate and demonstrate the value of new features to the COP business
  • Train and mentor Citizen Data Scientists on analytics solutions
Basic/Required
  • Legally authorized to work in the United States
  • Bachelor's degree in Information Technology, Computer Sciences, Geoscience, Engineering, Statistics or related field
  • 5+ years of experience in oil & gas and geoscience data and workflows
  • 3+ years of experience with Tibco Spotfire
  • 3+ years of experience with Teradata or other SQL databases
  • 1+ years of experience with ArcGIS spatial analytics tools
  • Advanced knowledge of and experience with integration platforms
Preferred
  • Master's degree in Analytics or related field
  • 1+ years of experience with Tibco Statistica or equivalent statistics-based analytics package
  • Prior experience in implementing and supporting visual, prescriptive and predictive analytics
  • In-depth understanding of the analytics applications and integration points
  • Experience implementing data science workflows in Oil & Gas
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 27, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 13, 2019, 4:51:37 PM
ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Complete end-to-end delivery of data analytics solutions to the end user
  • Interact closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gather business and technical specifications to support analytic, report and database development
  • Collect, analyze and translate user requirements into effective solutions
  • Build report and analytic prototypes based on initial business requirements
  • Provide status on the issues and progress of key business projects
  • Provide regular reporting on the performance of data analytics solutions
  • Deliver regular updates and maintenance on data analytics solutions
  • Champion the data analytics solutions and technologies at ConocoPhillips
  • Integrate data for data models used by the customers
  • Deliver data visualizations used for data-driven decision making
  • Provide strategic technology direction while supporting the needs of the business
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of related IT experience
  • 5+ years of Structured Query Language experience (ANSI SQL, T-SQL, PL/SQL)
  • 3+ years of hands-on experience delivering solutions with analytics tools (e.g., Spotfire, SSRS, Power BI, Tableau, Business Objects)
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years of Oil and Gas Industry experience
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions-oriented individual who learns quickly, understands complex problems, and applies useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 20, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 13, 2019, 4:56:49 PM
Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end-users, yielding high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Raleigh, NC
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day, and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of highly collaborative technology experts taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end-users, yielding high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum of 2+ years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Experience designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
R121 LLC
  • Philadelphia, PA

R121's direct end-user client is looking for a Salesforce Developer with expertise in custom object development, knowledge of standard modules such as Sales Cloud and Service Cloud, as well as AppExchange/App Cloud application integration. This is a 6-month contract located outside Philadelphia, PA. This is a good opportunity to work on several applications and spread your wings as a developer!


US citizens and those authorized to work in the US are encouraged to apply, as we are unable to transfer or sponsor H-1B candidates at this time.


Qualifications

  • 5-7 years of Salesforce development experience
  • Development experience with C# and .NET (WCF, Web API, MVC, Entity Framework)
  • Good experience with Visualforce and Apex
  • Healthcare Industry experience a plus
  • Experience working with applications developed by other users
  • Excellent communication skills are required


R121 Company Description

R121 is an internationally recognized IT consulting services firm, established in 2003, with a specialty in SAP, Cloud Solutions & Big Data. We understand the marketplace and pride ourselves on serving IT candidates as individuals, not commodities. We recognize a candidate's personalized skills and can match them with both direct end clients and select consulting partners.


Additional Information


If you are interested, please respond to this ad with an updated resume and a summary of your skills. We look forward to hearing from you soon.

All your information will be kept confidential according to EEO guidelines.

*SAP is the trademark of SAP AG in Germany and in several other countries

Signify Health
  • Dallas, TX

Position Overview:

Signify Health is looking for a savvy Data Engineer to join our growing team of deep learning specialists. This position would be responsible for evolving and optimizing data and data pipeline architectures, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will support software developers, database architects, data analysts, and data scientists. The ideal candidate would be self-directed, passionate about optimizing data, and comfortable supporting the data wrangling needs of multiple teams, systems and products.

If you enjoy providing expert-level IT technical services, including the direction, evaluation, selection, configuration, implementation, and integration of new and existing technologies and tools, while working closely with IT team members, data scientists, and data engineers to build our next generation of AI-driven solutions, we will give you the opportunity to grow personally and professionally in a dynamic environment. Our projects are built on cooperation and teamwork, and you will find yourself working together with other talented, passionate and dedicated team members, all working towards a shared goal.

Essential Job Responsibilities:

  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing data models for greater scalability, etc.
  • Leverage Azure for extraction, transformation, and loading of data from a wide variety of data sources in support of AI/ML Initiatives
  • Design and implement high performance data pipelines for distributed systems and data analytics for deep learning teams
  • Create tool-chains for analytics and data scientist team members that assist them in building and optimizing AI workflows
  • Work with data and machine learning experts to strive for greater functionality in our data and model life cycle management capabilities
  • Communicate results and ideas to key decision makers in a concise manner
  • Comply with applicable legal requirements, standards, policies and procedures including, but not limited to, compliance requirements and HIPAA


Qualifications:
Education/Licensing Requirements:
  • High school diploma or equivalent.
  • Bachelor's degree in Computer Science, Electrical Engineering, Statistics, Informatics, Information Systems, or another quantitative field, or equivalent work experience.


Experience Requirements:
  • 5+ years of experience in a Data Engineer role.
  • Experience using the following software/tools preferred:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with AWS or Azure cloud services.
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C#, etc.
  • Strong work ethic; able to work both collaboratively and independently without much direct supervision; solid problem-solving skills
  • Must have strong communication skills (written and verbal), and possess good one-on-one interpersonal skills.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable big data stores.
  • 2 years of experience in data modeling, ETL development, and Data warehousing
 

Essential Skills:

  • Fluently speak, read, and write English
  • Fantastic motivator and leader of teams with a demonstrated track record of mentoring and developing staff members
  • Strong point of view on who to hire and why
  • Passion for solving complex system and data challenges and desire to thrive in a constantly innovating and changing environment
  • Excellent interpersonal skills, including teamwork and negotiation
  • Excellent leadership skills
  • Superior analytical abilities, problem solving skills, technical judgment, risk assessment abilities and negotiation skills
  • Proven ability to prioritize and multi-task
  • Advanced skills in MS Office

Essential Values:

  • In Leadership: Do what's right, even if it's tough
  • In Collaboration: Leverage our collective genius, be a team
  • In Transparency: Be real
  • In Accountability: Recognize that if it is to be, it's up to me
  • In Passion: Show commitment in heart and mind
  • In Advocacy: Earn trust and business
  • In Quality: Ensure what we do, we do well
Working Conditions:
  • Fast-paced environment
  • Requires working at a desk and use of a telephone and computer
  • Normal sight and hearing ability
  • Use office equipment and machinery effectively
  • Ability to ambulate to various parts of the building
  • Ability to bend, stoop
  • Work effectively with frequent interruptions
  • May require occasional overtime to meet project deadlines
  • Lifting requirements of
Ultra Tendency
  • Riga, Latvia

Are you a developer who also loves taking a look at infrastructure? Are you a systems engineer who likes to write code? Ultra Tendency is looking for you!


Your Responsibilities:



  • Support our customers and development teams in transitioning capabilities from development and testing to operations

  • Deploy and extend large-scale server clusters for our clients

  • Fine-tune and optimize our clusters to process millions of records every day 

  • Learn something new every day and enjoy solving complex problems


Job Requirements:



  • You know Linux like the back of your hand

  • You love to automate all the things – SaltStack, Ansible, Terraform and Puppet are your daily business

  • You can write code in Python, Java, Ruby or similar languages.

  • You are driven by high quality standards and attention to detail

  • Understanding of the Hadoop ecosystem and knowledge of Docker is a plus


We offer:



  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • Work in the open-source community and become a contributor. Learn from open-source enthusiasts whom you will find nowhere else in Germany!

  • Work in an English-speaking, international environment

  • Work with cutting edge equipment and tools

Ultra Tendency
  • Berlin, Germany

Do you love writing high-quality code? Do you enjoy designing algorithms for large-scale Hadoop clusters? Is Spark your daily business? We have new challenges for you!


Your Responsibilities:



  • Solve Big Data problems for our customers in all phases of the project life cycle

  • Build program code, test and deploy to various environments (Cloudera, Hortonworks, etc.)

  • Enjoy being challenged and solve complex data problems on a daily basis

  • Be part of our newly formed team in Berlin and help drive its culture and work attitude


Job Requirements



  • Strong experience developing software using Java or a comparable language

  • At least 2 years of experience with data ingestion, analysis, integration, and design of Big Data applications using Apache open-source technologies

  • Strong background in developing on Linux

  • Solid computer science fundamentals (algorithms, data structures and programming skills in distributed systems)

  • Sound knowledge of SQL, relational concepts and RDBMS systems is a plus

  • Degree in Computer Science (or an equivalent degree) preferred, or comparable years of experience

  • Ability to work in an English-speaking, international environment 


We offer:



  • Fascinating tasks and unique Big Data challenges in various industries

  • Benefit from 10 years of delivering excellence to our customers

  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • Work in the open-source community and become a contributor

  • Fair pay and bonuses

  • Work with cutting edge equipment and tools

  • Enjoy our additional benefits such as a free BVG ticket and fresh fruits in the office

Comcast
  • Englewood, CO

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Summary:

Software engineering skills combined with the demands of a high volume, highly-visible analytics platform make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly motivated team environment?

As a software engineer in the Data Experience (DX) team, you will research, develop, support, and deploy solutions in real-time distributed computing architectures. The DX big data team is a fast-moving team of world-class experts who are innovating in providing user-driven, self-service tools for making sense of, and making decisions with, high volumes of data. We are a team that thrives on big challenges, results, quality, and agility.

Who does the data engineer work with?

Big Data software engineering is a diverse collection of professionals who work with a variety of teams: other software engineering teams whose software integrates with analytics services, service delivery engineers who provide support for our product, testers, operational stakeholders with all manner of information needs, and executives who rely on big data for data-backed decision making.

What are some interesting problems you'll be working on?

Develop systems capable of processing millions of events per second and multi-billions of events per day, providing both a real-time and historical view into the operation of our wide array of systems. Design collection and enrichment system components for quality, timeliness, scale and reliability. Work on high-performance real-time data stores and a massive historical data store using best-of-breed and industry-leading technology.
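As a rough illustration of the collection-and-enrichment work described above, here is a minimal sketch; the device metadata, field names, and `enrich` helper are all hypothetical examples, not Comcast's actual pipeline.

```python
from datetime import datetime, timezone

# Hypothetical reference metadata a raw event would be joined against
# before being written to the real-time and historical stores.
DEVICE_METADATA = {"stb-42": {"market": "Denver", "model": "X1"}}

def enrich(event: dict) -> dict:
    """Attach reference metadata and an ingest timestamp to a raw event."""
    meta = DEVICE_METADATA.get(event["device_id"], {})
    return {
        **event,
        **meta,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

enriched = enrich({"device_id": "stb-42", "signal": -3.2})
print(enriched["market"])  # Denver
```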

Where can you make an impact?

Comcast DX is building the core components needed to drive the next generation of data platforms and data processing capability. Running this infrastructure, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust big data architecture capable of providing insights that would otherwise be drowned in an ocean of data.

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

  • Develop solutions to big data problems utilizing common tools found in the ecosystem.
  • Develop solutions to real-time and offline event collecting from various systems.
  • Develop, maintain, and perform analysis within a real-time architecture supporting large amounts of data from various sources.
  • Analyze massive amounts of data and help drive prototype ideas for new tools and products.
  • Design, build and support APIs and services that are exposed to other internal teams
  • Employ rigorous continuous delivery practices managed under an agile software development approach
  • Ensure a quality transition to production and solid production operation of the software

Skills & Requirements:

  • 5+ years programming experience
  • Bachelors or Masters in Computer Science, Statistics or related discipline
  • Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem.
  • Experience working on big data platforms in the cloud or on traditional Hadoop platforms
  • AWS Core
    • Kinesis
    • IAM
    • S3/Glacier
    • Glue
    • DynamoDB
    • SQS
    • Step Functions
    • Lambda
    • API Gateway
    • Cognito
    • EMR
    • RDS/Aurora
    • CloudFormation
    • CloudWatch
  • Languages
    • Python
    • Scala/Java
  • Spark
    • Batch, Streaming, ML
    • Performance tuning at scale
  • Hadoop
    • Hive
    • HiveQL
    • YARN
    • Pig
    • Sqoop
    • Ranger
  • Real-time Streaming
    • Kafka
    • Kinesis
  • Data File Formats
    • Avro, Parquet, JSON, ORC, CSV, XML
  • NoSQL / SQL
  • Microservice development
  • RESTful API development
  • CI/CD pipelines
    • Jenkins / GoCD
    • AWS
      • CodeCommit
      • CodeBuild
      • CodeDeploy
      • CodePipeline
  • Containers
    • Docker / Kubernetes
    • AWS
      • Lambda
      • Fargate
      • EKS
  • Analytics
    • Presto / Athena
    • QuickSight
    • Tableau
  • Test-driven development/test automation, continuous integration, and deployment automation
  • Enjoy working with data: data analysis, data quality, reporting, and visualization
  • Good communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly.
  • Great design and problem solving skills, with a strong bias for architecting at scale.
  • Adaptable, proactive and willing to take ownership.
  • Keen attention to detail and high level of commitment.
  • Good understanding of any of: advanced mathematics, statistics, and probability.
  • Experience working in agile/iterative development and delivery environments, and comfort working in such an environment: requirements change quickly and our team needs to constantly adapt to moving targets.
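Two of the file formats listed above (JSON and CSV) can be converted between with nothing but the standard library; the records and field names here are invented for illustration.

```python
import csv
import io
import json

# A batch of invented event records, serialized as JSON.
records_json = '[{"id": 1, "event": "play"}, {"id": 2, "event": "pause"}]'
records = json.loads(records_json)

# Re-serialize the same records as CSV with a header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "event"])
writer.writeheader()
writer.writerows(records)

csv_text = buf.getvalue()
print(csv_text.splitlines()[0])  # id,event
```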

About Comcast DX (Data Experience):

Data Experience (DX) is a results-driven, data platform research and engineering team responsible for the delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. The mission of DX is to gather, organize, and make sense of Comcast data, and to make it universally accessible in order to empower, enable, and transform Comcast into an insight-driven organization. Members of the DX team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, and research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Pythian
  • Dallas, TX

Google Cloud Solutions Architect (Pre Sales)

United States | Canada | Remote | Work from Home

Why You?

Are you a US or Canada based Cloud Solutions Architect who likes to operate with a high degree of autonomy and have diverse responsibilities that require strong leadership, deep technology skills and a dedication to customer service? Do you have Big Data and data-centric skills? Do you want to take part in the strategic planning of an organization's data estate, with a focus on fulfilling business requirements around cost, scalability and flexibility of the platform? Can you draft technology roadmaps and document best-practice gaps with precise steps for how to close them? Can you implement the details of the backlogs you have helped build? Do you demonstrate consistent best practices and deliver strong customer satisfaction? Do you enjoy pre-sales? Can you demonstrate adoption of new technologies and frameworks through the development of proofs of concept?

If you have a passion for solving complex problems and for pre-sales, then this could be the job for you!

What Will You Be Doing?  

  • Collaborating with and supporting Pythian sales teams in the pre-sales & account management process from the technical perspective, remotely and on-site (approx 75%).
  • Defining solutions for current and future customers that efficiently address their needs. Leading through example and influence, as a master of applying technology solutions to solve business problems.
  • Developing Proofs of Concept (PoCs) in order to demonstrate feasibility and value to Pythian's customers (approx 25%).
  • Identifying then executing solutions with a commitment to excellent customer service
  • Collaborating with others in refining solutions presented to customers
  • Conducting technical audits of existing architectures (Infrastructure, Performance, Security, Scalability and more); documenting best practices and recommendations
  • Providing component or site-wide performance optimizations and capacity planning
  • Recommending best practices & improvements to current operational processes
  • Communicating status and planning activities to customers and team members
  • Participating in periodic overtime (occasionally on short notice) and travelling up to approx. 50%.

What Do We Need From You?

While we realise you might not have everything on the list, as the successful candidate for the Solutions Architect job you will likely have at least 10 years' experience in a variety of positions in IT. The position requires specialized knowledge and experience in performing the following:

  • Undergraduate degree in computer science, computer engineering, information technology or related field or relevant experience.
  • Systems design experience
  • Understanding and experience with Cloud architectures specifically: Google Cloud Platform (GCP) or Microsoft Azure
  • In-depth knowledge of popular database and data warehouse technologies from Microsoft, Amazon and/or Google (Big Data & Conventional RDBMS), Microsoft Azure SQL Data Warehouse, Teradata, Redshift,  BigQuery, Snowflake etc.
  • Fluency in a few languages, preferably Java and Python; familiarity with Scala and Go would be a plus.
  • Proficient in SQL. (Experience with Hive and Impala would be great)
  • Proven ability to work with software engineering teams and understand complex development systems, environments and patterns.
  • Experience presenting to high level executives (VPs, C Suite)
  • This is a North American based opportunity and it is preferred that the candidate live on the West Coast, ideally in San Francisco or the Silicon Valley area but strong candidates may be considered from anywhere in the US or Canada.
  • Ability to travel and work across North America frequently (occasionally on short notice) up to 50% with some international travel also expected.

Nice-to-Haves:

  • Experience Architecting Big Data platforms using Apache Hadoop, Cloudera, Hortonworks and MapR distributions.
  • Knowledge of real-time Hadoop query engines like Dremel, Cloudera Impala, Facebook Presto or Berkeley Spark/Shark.
  • Experience with BI platforms, reporting tools, data visualization products, ETL engines.
  • Experience with any MPP (Oracle Exadata/DW, Teradata, Netezza, etc)
  • Understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc)
  • Prior experience working as/with Machine Learning Engineers, Data Engineers, or Data Scientists.
  • A certification such as Google Cloud Professional Cloud Architect, Google Professional Data Engineer or related AWS Certified Solutions Architect / Big Data or Microsoft Azure Architect
  • Experience with or strong interest in people management, in a player-coach style of leadership, would be great longer term.

What Do You Get in Return?

  • Competitive total rewards package
  • Flexible work environment: Why commute? Work remotely from your home; there's no daily travel requirement to the office!
  • Outstanding people: Collaborate with the industrys top minds.
  • Substantial training allowance: Hone your skills or learn new ones; participate in professional development days, attend conferences, become certified, whatever you like!
  • Amazing time off: Start with a minimum of 3 weeks' vacation, 7 sick days, and 2 professional development days!
  • Office allowance: Choose a device and personalise your work environment!  
  • Fun, fun, fun: Blog during work hours; take a day off and volunteer for your favorite charity.
Applied Resource Group
  • Atlanta, GA

Applied Resource Group is seeking a talented and experienced Data Engineer for our client, an emerging leader in the transit solutions space. As an experienced Data Engineer on the Data Services team, you will lead the design, development and maintenance of comprehensible data pipelines and distributed systems for data extraction, analysis, transformation, modelling and visualization. They're looking for independent thinkers that are passionate about technology and building solutions that continually improve the customer experience. Excellent communication skills and the ability to work collaboratively with teams is critical.
 

Job Duties/Responsibilities:

    • Building a unified data services platform from scratch, leveraging the most suitable Big Data tools following technical requirements and needs
    • Exploring and working with cutting edge data processing technologies
    • Work with distributed, scalable cloud-based technologies
    • Collaborating with a talented team of Software Engineers working on product development
    • Designing and delivering BI solutions to meet a wide range of reporting needs across the organization
    • Providing and maintaining up to date documentation to enable a clear outline of solutions
    • Managing task lists and communicating updates to stakeholders and team members following Agile Scrum methodology
    • Working as a key member of the core team to support the timely and efficient delivery of critical data solutions

 
Experience Needed:
 

    • Experience with AWS technologies is desired, especially those used for Data Analytics, including some of these: EMR, Glue, Data Pipelines, Lambda, Redshift, Athena, Kinesis, Elasticache, Aurora
    • Minimum of 5 years working in developing and building data solutions
    • Experience as an ETL/Data warehouse developer with knowledge in design, development and delivery of end-to-end data integration processes
    • Deep understanding of data storage technologies for structured and unstructured data
    • Background in programming and knowledge of programming languages such as Java, Scala, Node.js, Python.
    • Familiarity with cloud services (AWS, Azure, Google Cloud)
    • Experience using Linux as a primary development environment
    • Knowledge of Big Data systems - Hadoop, Pig, Hive, Spark/Shark, etc. - a big plus.
    • Knowledge of BI platforms such as Tableau, Jaspersoft etc.
    • Strong communication and analytical skills
    • Capable of working independently under the direction of the Head of Data Services
    • Excellent communication, analytical and problem-solving skills
    • Ability to initially take direction and then work on own initiative
    • Experience working in AGILE

 
Nice-to-have experience and skills:

    • Master's in Computer Science, Computer Engineering or equivalent  
    • Building data pipelines to perform real-time data processing using Spark Streaming and Kafka, or similar technologies.
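The Spark Streaming/Kafka pattern mentioned above boils down to bucketing events into fixed time windows; here is a dependency-free sketch of that idea, with timestamps and route names invented for the example.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # fixed tumbling-window size, chosen for illustration

def window_counts(events):
    """Count events per (window_start, route) bucket; field names hypothetical."""
    counts = defaultdict(int)
    for ts, route in events:
        # Align each event timestamp to the start of its window.
        window_start = ts - (ts % WINDOW_SECONDS)
        counts[(window_start, route)] += 1
    return dict(counts)

# A tiny simulated event stream of (timestamp_seconds, route_id) pairs.
stream = [(10, "route-7"), (55, "route-7"), (70, "route-9")]
print(window_counts(stream))  # {(0, 'route-7'): 2, (60, 'route-9'): 1}
```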
Booz Allen Hamilton - Tagged
  • San Diego, CA
Job Description
Job Number: R0042382
Data Scientist, Mid
The Challenge
Are you excited at the prospect of unlocking the secrets held by a data set? Are you fascinated by the possibilities presented by machine learning, artificial intelligence advances, and IoT? In an increasingly connected world, massive amounts of structured and unstructured data open up new opportunities. As a data scientist, you can turn these complex data sets into useful information to solve global challenges. Across private and public sectors, from fraud detection to cancer research to national intelligence, you know the answers are in the data.
We have an opportunity for you to use your analytical skills to improve the DoD and federal agencies. You'll work closely with your customer to understand their questions and needs, then dig into their data-rich environment to find the pieces of their information puzzle. You'll develop algorithms, write scripts, build predictive analytics, use automation, and apply machine learning to turn disparate data points into objective answers that help our nation's services and leaders make data-driven decisions. You'll provide your customer with a deep understanding of their data, what it all means, and how they can use it. Join us as we use data science for good in the DoD and federal agencies.
Empower change with us.
Build Your Career
At Booz Allen, we know the power of data science and machine intelligence and we're dedicated to helping you grow as a data scientist. When you join Booz Allen, you can expect:
  • access to online and onsite training in data analysis and presentation methodologies, and tools like Hortonworks, Docker, Tableau, Splunk, and other open source and emerging tools
  • a chance to change the world with the Data Science Bowl, the world's premier data science for social good competition
  • participation in partnerships with data science leaders, like our partnership with NVIDIA to deliver Deep Learning Institute (DLI) training to the federal government
You'll have access to a wealth of training resources through our Analytics University, an online learning portal specifically geared towards data science and analytics skills, where you can access more than 5,000 functional and technical courses, certifications, and books. Build your technical skills through hands-on training on the latest tools and state-of-the-art tech from our in-house experts. Pursuing certifications? Take advantage of our tuition assistance, on-site bootcamps, certification training, academic programs, vendor relationships, and a network of professionals who can give you helpful tips. We'll help you develop the career you want as you chart your own course for success.
You Have
  • Experience with one or more statistical analytical programming languages, including Python or R
  • Experience with source control and dependency management software, including Git or Maven
  • Experience with using relational databases, including MySQL
  • Experience with identifying analytic insight in data, developing visualizations, and presenting findings to stakeholders
  • Knowledge of object-oriented programming, including Java and C++
  • Knowledge of various machine learning algorithms and their designs, capabilities, and limitations
  • Knowledge of statistical analysis techniques
  • Ability to build complex extraction, transformation, and loading (ETL) pipelines to clean and fuse data together
  • Ability to obtain a security clearance
  • BA or BS degree
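The "clean and fuse" ETL ability listed above can be sketched in a few lines; the datasets and the `clean_and_fuse` helper are hypothetical, invented purely for illustration.

```python
# Two small "disconnected" datasets sharing an id key.
people = [{"id": "a1", "name": " Ada "}, {"id": "b2", "name": "Grace"}]
scores = {"a1": 0.91, "b2": 0.87}

def clean_and_fuse(rows, lookup):
    """Trim names (the cleaning step) and join in the score by id (the fusing step)."""
    return [
        {"id": r["id"], "name": r["name"].strip(), "score": lookup[r["id"]]}
        for r in rows
        if r["id"] in lookup  # drop rows with no matching score
    ]

fused = clean_and_fuse(people, scores)
print(fused[0])  # {'id': 'a1', 'name': 'Ada', 'score': 0.91}
```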
Nice If You Have
  • Experience with designing and implementing custom machine learning algorithms
  • Experience with graph algorithms and semantic Web
  • Experience with designing and setting up relational databases
  • Experience with Big Data computing environments, including Hadoop
  • Experience with Navy mission systems
  • MA degree in Mathematics, CS, or a related quantitative field
Clearance
Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information.
We're an EOE that empowers our people, no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, or veteran status, to fearlessly drive change.
phData, Inc.
  • Minneapolis, MN

Title: Big Data Solutions Architect (Minneapolis or US Remote)


Join the Game-Changers in Big Data  


Are you inspired by innovation, hard work and a passion for data?    


If so, this may be the ideal opportunity to leverage your background in Big Data and Software Engineering, Data Engineering or Data Analytics experience to design, develop and innovate big data solutions for a diverse set of clients.  


As a Solution Architect on our Big Data Consulting team, your responsibilities include:


    • Design, develop, and innovate Big Data solutions; partner with our internal Managed Services Architects and Data Engineers to build creative solutions to solve tough big data problems.  
    • Determine the project road map, select the best tools, assign tasks and priorities, and assume general project management oversight for performance, data integration, ecosystem integration, and security of big data solutions
    • Work across a broad range of technologies from infrastructure to applications to ensure the ideal Big Data solution is implemented and optimized
    • Integrate data from a variety of data sources (data warehouse, data marts) utilizing on-prem or cloud-based data structures (AWS); determine new and existing data sources
    • Design and implement streaming, data lake, and analytics big data solutions

    • Create and direct testing strategies including unit, integration, and full end-to-end tests of data pipelines

    • Select the right storage solution for a project - comparing Kudu, HBase, HDFS, and relational databases based on their strengths

    • Utilize ETL processes to build data repositories; integrate data into Hadoop data lake using Sqoop (batch ingest), Kafka (streaming), Spark, Hive or Impala (transformation)

    • Partner with our Managed Services team to design and install on prem or cloud based infrastructure including networking, virtual machines, containers, and software

    • Determine and select best tools to ensure optimized data performance; perform Data Analysis utilizing Spark, Hive, and Impala

    • Mentor and coach Developers and Data Engineers. Provide guidance with project creation, application structure, automation, code style, testing, and code reviews
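The Sqoop-style batch ingest mentioned above typically lands rows in date-partitioned directories of the data lake; this sketch mimics that layout in memory, with all rows and the partition scheme invented for illustration.

```python
from collections import defaultdict

# Invented source rows, as if read from a relational source in one batch.
source_rows = [
    {"order_id": 1, "order_date": "2019-01-01", "total": 10.0},
    {"order_id": 2, "order_date": "2019-01-01", "total": 5.5},
    {"order_id": 3, "order_date": "2019-01-02", "total": 7.25},
]

# Group rows under Hive-style partition keys, e.g. a directory named
# order_date=2019-01-01 under the table's root path.
partitions = defaultdict(list)
for row in source_rows:
    partitions[f"order_date={row['order_date']}"].append(row)

print(sorted(partitions))  # ['order_date=2019-01-01', 'order_date=2019-01-02']
```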

Qualifications

  • 5+ years' previous experience as a Software Engineer, Data Engineer, or Data Analyst, combined with expertise in Hadoop technologies and Java programming
  • Technical Leadership experience leading/mentoring junior software/data engineers, as well as scoping activities on large scale, complex technology projects
  • Expertise in core Hadoop technologies including HDFS, Hive and YARN.  
  • Deep experience in one or more ecosystem products/languages such as HBase, Spark, Impala, Solr, Kudu, etc
  • Expert programming experience in Java, Scala, or other statically typed programming language
  • Strong working knowledge of SQL and the ability to write, debug, and optimize distributed SQL queries
  • Excellent communication skills including proven experience working with key stakeholders and customers
  • Ability to translate big picture business requirements and use cases into a Hadoop solution, including ingestion of many data sources, ETL processing, data access and consumption, as well as custom analytics
  • Customer relationship management including project escalations, and participating in executive steering meetings
  • Ability to learn new technologies in a quickly changing field
Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build robust and scalable micro-services

  • End to end ownership of backend services: Ideate, review design, build, code-review, test, load-test, launch, monitor performance

  • Identify opportunities to optimize the ad delivery algorithm: measure and monitor ad-break utilization for ad count and ad duration.

  • Work with product team to translate requirements into well-defined technical implementation

  • Define technical and operational KPIs to measure ad delivery health

  • Build Functional and Qualitative Test frameworks for ad server

  • Challenge our team and software to be even better
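The ad-break utilization metric mentioned above can be sketched as two simple ratios; the `break_utilization` helper, slot limit, and durations are illustrative assumptions, not Hulu's actual formula.

```python
def break_utilization(ads, max_ads, break_seconds):
    """Return (count utilization, duration utilization) for one ad break."""
    total_duration = sum(a["duration"] for a in ads)
    return len(ads) / max_ads, total_duration / break_seconds

# A hypothetical 90-second break with room for 4 ads, of which 2 were served.
ads_served = [{"duration": 30}, {"duration": 15}]
count_util, duration_util = break_utilization(ads_served, max_ads=4, break_seconds=90)
print(count_util, duration_util)  # 0.5 0.5
```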


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 7+ years of relevant software engineering experience

  • Strong analytical skills

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with non-relational database technologies (MongoDB, Cassandra, DynamoDB)

  • Experience with Redis and/or MemCache

  • Experience with Apache Kafka and/or Kinesis

  • AWS

  • Big Data technologies and data warehouses – Spark, Hadoop, Redshift

eBay
  • Berlin, Germany
(Sr) Software Engineer – Scala / Java



This is an exciting opportunity to work on a new, innovative product and we are looking for you as a (Senior) Software Engineer – Scala / Java (f/m).

This role sits within the Data- and Tech- organization and directly partners with Data Scientists and Product Managers to help create valuable products for our users.

We’re looking for a highly skilled engineer who is passionate about turning data into valuable products for our customers.

 

Description


  • Successfully partner with Data Scientists to build advanced services and architecture to provide scalable data products for a high-traffic web platform

  • Successfully partner with system operators to design data generation and provisioning architectures.

  • Design scalable architectures for real-time and batch data processing

  • Research opportunities for data acquisition and new uses for existing data

  • Help to build a healthy, sustainable and scalable data infrastructure


What we expect from you:


  • Solid experience as a Scala (and preferably Java) developer with deep understanding of the surrounding technology ecosystem

  • Strong analytical skills and ability to produce clean, well-maintainable code

  • Experience with building microservices and/or distributed systems and an understanding of web tech

  • Experience with Big Data related technologies (e.g. Elasticsearch, Kafka, Spark, Hadoop), with NoSQL databases (e.g. Mongo, Cassandra) and SQL databases

  • Knowledge of streaming architecture is a big plus

  • Ideally you are interested in machine learning algorithms, data structures and related libraries

  • Desire to quickly learn new technologies and the pragmatism to use the tool that fits a problem best

  • Able to work self-driven, but also operate as part of a strong, cohesive team

  • Strong communication skills and fluency in English


 You have a degree in Computer Science or a comparable field.

 

What you can expect from us: 


  • A harmonious, informal, international and playful work environment; 

  • Work with cool modern technologies, processes and consumer facing products; 

  • Access to tools and resources to do your job;  

  • Ability to join multiple internal interest groups in eBay in trending topics like Data Science, Mobile Development, Customer Experience and more.


What we offer:


  • Dynamic team with exciting personalities, passion for e-commerce and professionalism

  • International career opportunities throughout eBay Inc.

  • Work with advanced techniques and analytical standards.

  • We are early adopters of new analytical trends

  • We offer continuous development through active training, workshops and conferences.

  • An excellent working environment with flat hierarchies and lots of flexibility

  • Challenging work with a diverse, highly skilled and humorous team

  • Competitive salary and great benefits e.g. conference & education budget (language courses, soft skill training and mentoring), company pension scheme, employee stock plans, mobile phone for business and personal use.

  • Home office and working in our Motor-Talk office located in Berlin-Friedrichshain are possible depending on schedule

Antuit
  • Dallas, TX

Location: Dallas, TX or Chicago, IL. Open to talk to candidates from across locations


Antuit seeks a Data Scientist/Senior Data Scientist to develop machine learning algorithms in the Supply Chain and Forecasting domain with data science toolkits that include Python, R or SAS. This role also participates in the design process and is responsible for implementation. The ideal candidate will view the role as an excellent opportunity to master and support solving world-class data science problems.

 Data Scientist responsibilities and duties:

    • Develop machine learning algorithms in the Supply Chain and Forecasting domain with data science toolkits that include Python, R or SAS
    • Further design processes and implement them
    • Research and develop efficient and robust machine learning algorithms
    • Collaborate and work closely with cross-functional Antuit teams and domain experts to identify gaps and structure problems
    • Create meaningful presentations and analyses that tell a story, focused on insights, to communicate results and ideas to key decision makers at Antuit and client companies
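As a toy illustration of the forecasting work this posting describes, here is a seasonal-naive baseline in pandas; the demand series, season length, and error metric are invented for the sketch:

```python
import pandas as pd

# Invented weekly demand series with a roughly repeating 4-week pattern.
demand = pd.Series([10, 12, 9, 11, 10, 13, 9, 12],
                   index=pd.date_range("2019-01-07", periods=8, freq="W"))

# Seasonal-naive forecast: next value = value one season (4 weeks) ago.
season = 4
forecast = demand.shift(season).dropna()
actual = demand[forecast.index]

# Mean absolute percentage error of the baseline.
mape = ((actual - forecast).abs() / actual).mean() * 100
print(f"MAPE: {mape:.1f}%")
```

A seasonal-naive baseline like this is a common yardstick that more sophisticated supply-chain forecasting models are expected to beat.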

Data Scientist qualifications and skills:

  • Experience / Education. Master's or PhD in Computer Science, Computer Engineering, Electrical Engineering, Statistics, Applied Math or another related field. 4-10 years of work experience involving quantitative data analyses for problem solving (work experience negotiable for recent PhDs with relevant research experience). Experience working with a cloud Big Data stack to orchestrate data gathering, cleansing, preparation and modelling. Additional experience with forecasting and optimization problems, and implementing data analytics solutions with Python, R or SAS
  • Knowledge. Exceptionally skilled in machine learning, data analytics, pattern recognition and predictive modelling
  • Strong communication and presentation skills. Effective communication and story-telling skills
  • Energy and enthusiasm. Passion for learning and contributing to development
  • A true team player. Collaborative mindset for effective communication across teams

EEOC

Antuit is an at-will, equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law. 

Perficient, Inc.
  • Dallas, TX

At Perficient, you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.

We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.

About Our Data Governance Practice:


We provide exceptional data integration services in the ETL, Data Catalog, Data Quality, Data Warehouse, Master Data Management (MDM), Metadata Management & Governance space.

Perficient currently has a career opportunity for a Python Developer who resides in the vicinity of Jersey City, NJ or Dallas, TX.

Job Overview:

As a Python developer, you will participate in all aspects of the software development lifecycle, which includes estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients. As a member working in a team environment, you will take direction from solution architects and leads on development activities.


Required skills:

  • 6+ years of experience in architecting, building and maintaining software platforms and large-scale data infrastructures in a commercial or open source environment
  • Excellent knowledge of Python
  • Good knowledge of and hands on experience working with quant/data Python libraries (pandas/numpy etc)
  • Good knowledge of and hands on experience designing APIs in Python (using Django/Flask etc)
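As a minimal sketch of the pandas "quant" work the required skills reference, here is a small returns-and-volatility computation; the price series is invented for illustration:

```python
import pandas as pd

# Hypothetical daily price series standing in for client market data.
prices = pd.Series([100.0, 102.0, 101.0, 105.0, 107.0],
                   index=pd.date_range("2019-01-01", periods=5))

# Daily simple returns via pandas' vectorized pct_change.
returns = prices.pct_change().dropna()

# Rolling 3-day volatility: standard deviation of returns.
vol = returns.rolling(window=3).std()

print(returns.round(4).tolist())  # [0.02, -0.0098, 0.0396, 0.019]
```

Vectorized operations like `pct_change` and `rolling` are the idiomatic pandas alternative to hand-written loops over rows.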

Nice to have skills (in the order of priority):

  • Comfortable and hands-on with AWS cloud (S3, EC2, EMR, Lambda, Athena, QuickSight etc.) and EMR tools (Hive, Zeppelin etc)
  • Experience building and optimizing big data pipelines, architectures and data sets.
  • Hands on experience in Hadoop MapReduce or other big data technologies and pipelines (Hadoop, Spark/pyspark, MapReduce, etc.)
  • Bash Scripting
  • Understanding of Machine Learning and Data Science processes and techniques
  • Experience in Java / Scala


Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities, and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues with great benefits are just part of what makes Perficient a great place to work.

Computer Staff
  • Fort Worth, TX

We have been retained by our client located in Fort Worth, Texas (south Fort Worth area), to deliver a Risk Modeler on a regular full-time basis. We prefer SAS experience but are interviewing candidates with R, SPSS, WPS, MATLAB or similar statistical package experience if the candidate has experience in the financial loan credit risk analysis industry. Enjoy all the resources of a big company with none of the problems that small companies have; this company has doubled in size in 3 years. We have a keen interest in finding a business-minded statistical modeling candidate with some credit risk experience to build statistical models within the marketing and direct mail areas of financial services and lending. We are seeking a candidate with statistical modeling and data analysis skills, interested in creating better ways to solve problems in order to increase loan originations, decrease loan defaults, and more. Our client is in business to find prospective borrowers and to originate, provide, service, and process loans and collect loan payments. The team works with third-party data vendors, credit reporting agencies and data service providers on data augmentation, address standardization, fraud detection, decision sciences and analytics, and this position includes the creation of statistical models. They support one of the largest decision management operations in the US.  


We require experience with statistical analysis tools such as SAS, MATLAB, R, WPS, SPSS, or Python if used for statistical analysis. This is a statistical modeling, risk modeling, model building, decision science, data analysis and statistical analysis type of role requiring SQL and/or SQL Server experience and critical thinking skills to solve problems. We prefer candidates with experience in data analysis, SQL queries, joins (left, inner, outer, right), and reporting from data warehouses with tools such as Tableau, Cognos, Looker, or Business Objects. We prefer candidates with financial and loan experience, especially knowledge of loan originations, borrower profiles or demographics, modeling loan defaults, and statistical analysis, i.e. Gini coefficients and the K-S (Kolmogorov-Smirnov) test for credit scoring and default prediction.


Primarily, however, critical thinking, statistical modeling, and math/statistics skills are needed to fulfill the tasks of this interesting and important role, which includes growing your skills within this small risk/modeling team. Take on challenges in the creation and use of statistical models. There is no use for Hadoop or NoSQL databases in this position; this is not a "big data" role, and no machine learning or artificial intelligence is needed. Your role is to create and use statistical models: build models for direct mail in the financial lending space to reach the right customers with the right profiles, demographics, and credit ratings. Take credit risk, credit analysis, and loan data and build a new model, validate the existing model, recalibrate it, or rebuild it completely. The models focus on delivering answers and solutions within these areas of financial lending: risk analysis, credit analysis, direct marketing, direct mail, and defaults. Expect logistic regression in SAS or Knowledge Studio, and some light use of Looker as the BI tool on top of SQL Server data. Deliver solutions that help the business improve in these areas and become more profitable: seek answers to questions, create models, dig into the data, and explore data sources and opportunities to get better models in place. Use critical thinking to solve problems.
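The credit-scoring evaluation metrics this posting names, the Gini coefficient and the Kolmogorov-Smirnov (K-S) statistic, can be sketched in plain numpy; the score distributions below are invented stand-ins for the output of a fitted default model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic model scores: defaulters tend to score higher than non-defaulters.
scores_default = rng.normal(0.7, 0.1, 500)
scores_good = rng.normal(0.4, 0.15, 2000)

def auc(pos, neg):
    # Probability that a random defaulter outscores a random non-defaulter
    # (the Mann-Whitney formulation of AUC).
    return (pos[:, None] > neg[None, :]).mean()

def ks_stat(pos, neg):
    # Maximum vertical gap between the two empirical CDFs,
    # evaluated over all observed score thresholds.
    grid = np.sort(np.concatenate([pos, neg]))
    cdf_pos = np.searchsorted(np.sort(pos), grid, side="right") / len(pos)
    cdf_neg = np.searchsorted(np.sort(neg), grid, side="right") / len(neg)
    return np.abs(cdf_pos - cdf_neg).max()

# Gini coefficient derived from AUC: Gini = 2*AUC - 1.
gini = 2 * auc(scores_default, scores_good) - 1
ks = ks_stat(scores_default, scores_good)
print(f"Gini: {gini:.3f}  K-S: {ks:.3f}")
```

Both metrics measure how well a score separates defaulters from non-defaulters, which is why they recur in credit-risk model validation.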


Answer questions or solve problems such as:

What statistical models are needed to produce answers to risk analysis and credit analysis problems?

Which customer profiles have the best demographics or credit risk for loans, to target with direct mail marketing pieces?

Why are loan defaults increasing or decreasing? What is impacting the increase or decrease of loan defaults?  



Required Skills

Bachelor's degree in Statistics, Finance, Economics, Management Information Systems, Math, Quantitative Business Analysis, Analytics, or another related math, science, or finance field. Some loan/lending business domain work experience.

Master's degree preferred, but not required.

Critical thinking skills.

Must have SQL skills (any database: SQL Server, MS Access, Oracle, PostgreSQL) and the ability to write queries and joins (inner, left, right, outer). SQL Server is highly preferred.

Any statistical analysis systems/packages experience including statistical modeling experience, and excellent math skills: SAS, MATLAB, R, WPS, SPSS, or Python if used for statistical analysis. Must have significant statistical modeling skills and experience.
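The join types called out in the SQL requirement can be illustrated with pandas' `merge`, which mirrors SQL join semantics; the loan and default tables below are invented:

```python
import pandas as pd

loans = pd.DataFrame({"borrower_id": [1, 2, 3], "amount": [5000, 7500, 3000]})
defaults = pd.DataFrame({"borrower_id": [2, 4], "defaulted": [True, True]})

# Left join: keep every loan, attach default info where it exists.
left = loans.merge(defaults, on="borrower_id", how="left")

# Inner join: only borrowers present in both tables.
inner = loans.merge(defaults, on="borrower_id", how="inner")

# Outer join: union of borrowers from both tables.
outer = loans.merge(defaults, on="borrower_id", how="outer")

print(len(left), len(inner), len(outer))  # 3 1 4
```

`how="left"` keeps every row of the left table, `how="inner"` keeps only matches, and `how="outer"` keeps the union of both sides, just as in SQL.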



Preferred Skills:
Loan credit analysis highly preferred. SAS highly preferred.
Experience with Tableau, Cognos, Business Objects, Looker or similar data warehouse reporting tools; creating reports from data warehouse data. SQL Server SSAS, but only to pull reports. Direct marketing, direct mail marketing, and loan/lending to somewhat higher-risk borrowers.



Employment Type:   Regular Full-Time

Salary Range: $85,000 to $130,000 / year

Benefits: health/medical, dental, and vision coverage cost the employee only about $100 per month.
401(k) with 4% matching after 1 year, bonus structure, paid vacation, paid holidays, paid sick days.

Relocation assistance is an option that can be provided, for a very well qualified candidate. Local candidates are preferred.

Location: Fort Worth, Texas
(area south of downtown Fort Worth, Texas)

Immigration: US citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor H-1B candidates at this time.

Please apply with your resume (MS Word format preferred) or with your LinkedIn profile via the buttons at the bottom of this job posting page:  

http://www.computerstaff.com/?jobIdDescription=314  


Please call 817-424-1411 or send a text to 817-601-7238 to inquire or to follow up on your application. We recommend you call to leave a message, or at least send a text with your name. Thank you for your attention and efforts.

IT People Corporation
  • Raleigh, NC

Senior Big Data Platform Architect w/Data Migration- Direct Hire- Raleigh, NC

Want to take your career to the next level and work for a company that truly cares about their employees and the community around them?

We have a great direct-hire career opportunity for a Senior Big Data Platform Architect with Data Migration expertise.

Our client is one of the most revolutionary and trusted resources for IT and information services. They play a vital role in supporting business processes and provide business intelligence that their clients can truly rely upon to increase productivity and achieve better operational efficiency.

With a generous benefits package- our client is one of the best places to work in the area.  They offer:
Competitive Compensation, Annual Review and Bonus, Employee Assistance Program, On-Site Workout Facility, Recreational Activities, Flexible Work Arrangements, Ergonomic Work Stations, Medical Coverage, Dental Coverage, Vision Coverage, 401(k) Retirement Program with matching, 12 paid holidays, Generous allowance for Vacation and Sick Days, Flexible Spending Accounts, Dependent Care, Life Insurance, Short-Term and Long-Term Disability Insurance, and Supplemental Long-Term Disability Insurance.

Position Summary:

The Senior Big Data Platform Architect will provide thought leadership and technical direction for the data engineering team and work with the lead of the advanced analytics capability to develop technical strategies and mature the technical stack toward improving operational outcomes and usability, as well as keeping current with new and emerging technologies. Will lead project teams through POC efforts related to new technologies or new uses of existing technologies.  

Minimum Requirements

  • Extensive experience troubleshooting issues in complex, distributed systems
  • 5+ years of experience architecting, developing, releasing, and maintaining large-scale enterprise data platforms, both on-premise and in the cloud. 5+ years of experience analyzing data with SQL and implementing large-scale RDBMS. 5+ years of experience designing software for performance, reliability and scalability.
  • 5+ years of programming proficiency in a subset of Python, R, Java, and Scala.
  • 2+ years of experience with building solutions leveraging NoSQL and highly distributed databases such as HBase and Cassandra.
  • 2+ years of experience implementing cloud-based systems (AWS/Azure/GCP)
  • 3+ years proficiency in configuring and deploying applications on Linux-based systems
  • 5+ years of experience implementing data pipelines in large-scale data analysis systems such as Hadoop or MPP databases. 3+ years of experience with Spark or similar engines. 5+ years of experience in data flow and systems integration. 3+ years of experience operationalizing and integrating analytics models and solutions within products and applications
  • Experience of hands-on platform architecture and solutions design and implementation (5+ years).
  • Deep understanding of algorithms, data structures, performance optimization techniques, and design patterns for building highly scalable Big Data Solutions and distributed applications
  • Machine Learning is a big plus
  • Experience collaborating with business and IT counterparts, as well as summarizing and presenting complex technical architectures and solutions to a wide variety of stakeholders
  • Ability to manage multiple activities in a deadline-oriented environment
  • Superior problem-solving skills
  • Ability to work independently in unstructured environments in a self-directed way, with accuracy and attention to detail. Ability to take a leadership role on engagements and with customers.
  • Strong teamwork skills and ability to work effectively with multiple internal customers
  • Ability to provide technical expertise to others and explain concepts to technical staff and leadership team
  • Ability to quickly learn and master new technologies and various business applications
  • Ability to build business acumen and understand business domain. Experience mentoring other technical resources and leading technical implementations.  

Education

Bachelor's degree in Computer Science or an equivalent field and 10+ years of technical experience, or Master's degree in Computer Science or an equivalent field and 7+ years of technical experience

Responsibilities


Responsible for assisting product managers and the analytics teams in translating business requirements into solutions that meet business value objectives and are aligned with best practices and industry standards. Document architectural decisions through the depiction of concepts, relationships, constraints, and operations

 

Salary range is negotiable and is contingent upon level of expertise and years of experience.


For immediate consideration for this opportunity, please submit your resume as an attachment to Dianne Lancaster, Technical Recruiter at IT People: dianne.lancaster@itpeoplecorp.com.

NO 3rd parties please!


NO Sponsorship available at this time