OnlyDataJobs.com

Freeport-McMoRan
  • Phoenix, AZ

Provides management and leadership to the Big Data project teams. Directs initiatives in the Freeport-McMoRan Big Data program. Provides analytical direction, expertise and support for the Big Data program, including project leadership for initiatives, coordination with business subject matter experts and travel to mine sites. This is a global role that coordinates with site and corporate stakeholders to ensure global alignment on service and project delivery. The role also works with business operations management to ensure the program is focusing on the areas most beneficial to the company.


  • Work closely with business, engineering and technology teams to develop solutions to data-intensive business problems
  • Supervise internal and external science teams
  • Perform quality control of deliverables
  • Prepare reports and presentations, and communicate with Executives
  • Provide thought leadership in algorithmic and process innovations, and creativity in solving unconventional problems
  • Use statistical and programming tools such as R and Python to analyze data and develop machine-learning models
  • Perform other duties as required


Minimum Qualifications


  • Bachelor's degree in an analytical field (statistics, mathematics, etc.) and eight (8) years of relevant work experience, OR
  • Master's degree in an analytical field (statistics, mathematics, etc.) and six (6) years of relevant work experience
  • Proven track record of collaborating with business partners to translate business problems and needs into data-based analytical solutions
  • Proficient in predictive modeling (a brief sketch follows this list):
  • Linear and logistic regression
  • Tree-based techniques (CART, Random Forest, Gradient Boosting)
  • Time-series analysis
  • Anomaly detection
  • Survival analysis
  • Strong experience with SQL/Hive environments
  • Skilled with R and/or Python analysis environments
  • Experience with Big Data tools for machine learning, R, Hive, Python
  • Good communication skills
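
A minimal sketch of the modeling workflow these qualifications describe, assuming Python with scikit-learn; the dataset here is a synthetic stand-in, not Freeport-McMoRan data:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for tabular operational data.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Fit a linear baseline and a tree-based ensemble, then compare AUC.
    for model in (LogisticRegression(max_iter=1000), GradientBoostingClassifier()):
        model.fit(X_train, y_train)
        score = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(type(model).__name__, round(score, 3))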


Preferred Qualifications


  • Doctorate degree in an analytical field
  • Willing and able to travel 20-30% or more


Criteria/Conditions


  • Ability to understand and apply verbal and written work and safety-related instructions and procedures given in English
  • Ability to communicate in English with respect to job assignments, job procedures, and applicable safety standards
  • Must be able to work in a potentially stressful environment
  • Position is in a busy, non-smoking office located in downtown Phoenix, AZ
  • Location requires mobility in an office environment; each floor is accessible by elevator
  • Occasionally work will be performed in a mine, outdoor or manufacturing plant setting
  • Must be able to frequently sit, stand and walk
  • Must be able to frequently lift and carry up to ten (10) pounds
  • Personal protective equipment is required when performing work in a mine, outdoor, manufacturing or plant environment, including hard hat, hearing protection, safety glasses, safety footwear, and as needed, respirator, rubber steel-toe boots, protective clothing, gloves and any other protective equipment as required
  • Freeport-McMoRan promotes a drug/alcohol-free work environment through the use of mandatory pre-employment drug testing and ongoing random drug testing as allowed by applicable state laws


Freeport-McMoRan has reviewed the jobs at its various office and operating sites and determined that many of these jobs require employees to perform essential job functions that pose a direct threat to the safety or health of the employees performing these tasks or others. Accordingly, the Company has designated the following positions as safety-sensitive:


  • Site-based positions, or positions which require unescorted access to site-based operational areas, which are held by employees who are required to receive MSHA, OSHA, DOT, HAZWOPER and/or Hazard Recognition Training; or
  • Positions which are held by employees who operate equipment, machinery or motor vehicles in furtherance of performing the essential functions of their job duties, including operating motor vehicles while on Company business or travel (for this purpose motor vehicles includes Company owned or leased motor vehicles and personal motor vehicles used by employees in furtherance of Company business or while on Company travel); or
  • Positions which Freeport-McMoRan has designated as safety sensitive positions in the applicable job or position description and which upon further review continue to be designated as safety-sensitive based on an individualized assessment of the actual duties performed by a specifically identified employee.


Equal Opportunity Employer/Protected Veteran/Disability


Requisition ID
1900606 

Freeport-McMoRan
  • Phoenix, AZ

Supports the activities for all Freeport-McMoRan Big Data programs. Provides analytical support and expertise for the Big Data program; this includes coordination with business subject matter experts and travel to mine sites. The role will provide analyses and statistical models as part of Big Data projects, and may be the project lead on analytics initiatives. The role will also provide visualizations and descriptive results of the analysis. This will be a global role that will coordinate with site and corporate stakeholders to ensure alignment on project delivery.


    • Work closely with business, engineering and technology teams to analyze data-intensive business problems
    • Research and develop appropriate statistical methodology to translate these business problems into analytics solutions
    • Perform quality control of deliverables
    • Develop visualizations of results and prepare deliverable reports and presentations, and communicate with business partners
    • Provide thought leadership in algorithmic and process innovations, and creativity in solving unconventional problems
    • Develop, implement and maintain analytical solutions in the Big Data environment
    • Work with onshore and offshore resources to implement and maintain analytical solutions
    • Perform variable selection and other standard modeling tasks
    • Produce model performance metrics (see the sketch after this list)
    • Use statistical and programming tools such as R and Python to analyze data and develop machine-learning models
    • Perform other duties as requested
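
A minimal sketch of the "model performance metrics" task above, assuming Python with scikit-learn; the labels and scores are dummy values for illustration:

    from sklearn.metrics import (accuracy_score, precision_score,
                                 recall_score, roc_auc_score)

    def performance_report(y_true, y_pred, y_score):
        """Collect headline metrics for a binary classifier."""
        return {
            "accuracy": accuracy_score(y_true, y_pred),
            "precision": precision_score(y_true, y_pred),
            "recall": recall_score(y_true, y_pred),
            "auc": roc_auc_score(y_true, y_score),
        }

    # Dummy predictions, purely to show the shape of the report.
    print(performance_report([0, 1, 1, 0], [0, 1, 0, 0], [0.2, 0.9, 0.4, 0.1]))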


Minimum Qualifications


  • Bachelor's degree in an analytical field (statistics, mathematics, etc.) and five (5) years of relevant work experience, OR
  • Master's degree in an analytical field (statistics, mathematics, etc.) and three (3) years of relevant work experience
  • Proven track record of collaborating with business partners to translate operational problems and needs into data-based analytical solutions
  • Proficient in predictive modeling:
  • Linear and logistic regression
  • Tree-based techniques (CART, Random Forest, Gradient Boosting)
  • Time-series analysis
  • Anomaly detection
  • Survival analysis
  • Strong experience with SQL/Hive environments
  • Skilled with R and/or Python analysis environments
  • Experience with Big Data tools for machine learning, R, Hive, Python
  • Good communication skills


Preferred Qualifications


  • Master's degree in an analytical field
  • Willing and able to travel 20-30% or more


Criteria/Conditions


  • Ability to understand and apply verbal and written work and safety-related instructions and procedures given in English
  • Ability to communicate in English with respect to job assignments, job procedures, and applicable safety standards
  • Must be able to work in a potentially stressful environment
  • Position is in a busy, non-smoking office located in Phoenix, AZ
  • Location requires mobility in an office environment; each floor is accessible by elevator and internal staircase
  • Occasionally work may be performed in a mine, outdoor or manufacturing plant setting
  • Must be able to frequently sit, stand and walk
  • Must be able to frequently lift and carry up to ten (10) pounds
  • Personal protective equipment is required when performing work in a mine, outdoor, manufacturing or plant environment, including hard hat, hearing protection, safety glasses, safety footwear, and as needed, respirator, rubber steel-toe boots, protective clothing, gloves and any other protective equipment as required
  • Freeport-McMoRan promotes a drug/alcohol-free work environment through the use of mandatory pre-employment drug testing and ongoing random drug testing as allowed by applicable state laws


Freeport-McMoRan has reviewed the jobs at its various office and operating sites and determined that many of these jobs require employees to perform essential job functions that pose a direct threat to the safety or health of the employees performing these tasks or others. Accordingly, the Company has designated the following positions as safety-sensitive:


  • Site-based positions, or positions which require unescorted access to site-based operational areas, which are held by employees who are required to receive MSHA, OSHA, DOT, HAZWOPER and/or Hazard Recognition Training; or
  • Positions which are held by employees who operate equipment, machinery or motor vehicles in furtherance of performing the essential functions of their job duties, including operating motor vehicles while on Company business or travel (for this purpose motor vehicles includes Company owned or leased motor vehicles and personal motor vehicles used by employees in furtherance of Company business or while on Company travel); or
  • Positions which Freeport-McMoRan has designated as safety sensitive positions in the applicable job or position description and which upon further review continue to be designated as safety-sensitive based on an individualized assessment of the actual duties performed by a specifically identified employee.


Equal Opportunity Employer/Protected Veteran/Disability


Requisition ID
1900604 

Limelight Networks
  • Phoenix, AZ

Job Purpose:

The Sr. Data Services Engineer assists in maintaining the operational aspects of Limelight Networks platforms, provides guidance to the Operations group and acts as an escalation point for advanced troubleshooting of systems issues. The Sr. Data Services Engineer assists in the execution of tactical and strategic operational infrastructure initiatives by building and managing complex computing systems and processes that facilitate the introduction of new products and services while allowing existing services to scale.


Qualifications: Experience and Education (minimums)

  • Bachelor's degree or equivalent experience.
  • 2+ years of experience working with MySQL (or other database technologies such as MongoDB, Cassandra, Hadoop, etc.) in a large-scale enterprise environment.
  • 2+ years of Linux systems administration experience.
  • 2+ years of experience with version control, shell scripting, and one or more scripting languages such as Python, Perl, Ruby or PHP.
  • 2+ years of experience with configuration management systems such as Puppet, Chef or Salt.
  • Experienced with MySQL HA/clustering solutions; Corosync, Pacemaker and DRBD preferred (an illustrative replication check follows this list).
  • Experience supporting open-source messaging solutions such as RabbitMQ or ActiveMQ preferred.
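
One illustrative way to monitor the replication health these HA solutions depend on, assuming Python with the PyMySQL driver; the status fields are stock MySQL replication columns, but the host and credentials are placeholders:

    import pymysql

    # Placeholder connection details for a MySQL replica.
    conn = pymysql.connect(host="replica.example.com", user="monitor",
                           password="secret",
                           cursorclass=pymysql.cursors.DictCursor)
    try:
        with conn.cursor() as cur:
            # SHOW SLAVE STATUS reports lag and replication-thread health.
            cur.execute("SHOW SLAVE STATUS")
            status = cur.fetchone()
            if status is None:
                print("not configured as a replica")
            else:
                print("lag:", status["Seconds_Behind_Master"],
                      "IO:", status["Slave_IO_Running"],
                      "SQL:", status["Slave_SQL_Running"])
    finally:
        conn.close()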

Knowledge, Skills & Abilities

  • Collaborative in a fast-paced environment while providing exceptional visibility to management and end-to-end ownership of incidents, projects and tasks.
  • Ability to implement and maintain complex datastores.
  • Knowledge of configuration management and release engineering processes and methodologies.
  • Excellent coordination, planning and written and verbal communication skills.
  • Knowledge of the Agile project management methodologies preferred.
  • Knowledge of a NoSQL/Big Data platform; Hadoop, MongoDB or Cassandra preferred.
  • Ability to participate in a 24/7 on call rotation.
  • Ability to travel when necessary.

Essential Functions:

  • Develop and maintain core competencies of the team in accordance with applicable architectures and standards.
  • Participate in capacity management of services and systems.
  • Maintain plans, processes and procedures necessary for the proper deployment and operation of systems and services.
  • Identify gaps in the operation of products and services and drive enhancements.
  • Evaluate release processes and tools to find areas for improvement.
  • Contribute to the release and change management process by collaborating with the developers and other Engineering groups.
  • Participate in development meetings and implement required changes to the operational architecture, standards, processes or procedures and ensure they are in place prior to release (e.g., monitoring, documentation and metrics).
  • Maintain a positive demeanor and a high level of professionalism at all times.
  • Implement proactive monitoring capabilities that ensure minimal disruption to the user community including: early failure detection mechanisms, log monitoring, session tracing and data capture to aid in the troubleshooting process.
  • Implement HA and DR capabilities to support business requirements.
  • Troubleshoot and investigate database related issues.
  • Maintain migration plans and data refresh mechanisms to keep environments current and in sync with production.
  • Implement backup and recovery procedures utilizing various methods to provide flexible data recovery capabilities.
  • Work with management and security team to assist in implementing and enforcing security policies.
  • Create and manage user and security profiles ensuring application security policies and procedures are followed.

Huntech USA LLC
  • San Diego, CA

Great opportunity to work with the leader in the semiconductor industry, which unveiled the world's first 7-nanometer PC platform, created from the ground up for the next generation of personal computing and bringing new features with thin and light designs that allow for new form factors in the always-on, always-connected category. It features a new octa-core CPU, the fastest the company has ever designed and built, with a larger cache than previous compute platforms, faster multi-tasking and increased productivity for users, disrupting the performance expectations of current thin, light and fanless PC designs. The platform is currently sampling to customers and is expected to begin shipping in commercial devices in Q3 2019.


Staff Data Analyst

You will study the performance of the Global Engineering Grid and design workflows across the engineering grid and provide insights and effective analytics in support of the Grid 2.0 program. You will conduct research, design statistical studies and analyze data in support of Grid 2.0. This job will challenge you to dive deep into the world of the engineering grid and design flows and understand the unique challenges of operating an engineering grid at a scale unrivaled in the industry. You should have experience working in an EDA or manufacturing environment and be comfortable working in an environment where problems are not always well-defined.


Responsibilities:

  • Identify and pursue opportunities to improve the efficiency of the global engineering grid and design workflows.
  • Develop systems to ingest, analyze, and take automated action across real-time feeds of high-volume data.
  • Research and implement new analytics approaches for effective deployment of machine learning and data modeling to solve business problems; identify patterns and trends from large, high-dimensional data sets and manipulate data into digestible and actionable reports (see the sketch after this list).
  • Make business recommendations (e.g. cost-benefit, experiment analysis) with effective presentations of findings to multiple levels of stakeholders through visual displays of quantitative information.
  • Plan effectively to set priorities and manage projects; identify roadblocks and come up with technical options.
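
A sketch of the kind of pattern-finding this role involves, assuming Python with pandas; the grid job-log schema below is hypothetical:

    import pandas as pd

    # Hypothetical engineering-grid job log: one row per grid job.
    jobs = pd.DataFrame({
        "site": ["SD", "SD", "BLR", "BLR", "SD"],
        "queue": ["regress", "synth", "regress", "regress", "synth"],
        "runtime_h": [2.5, 8.0, 3.1, 2.9, 7.5],
    })

    # Summarize runtime by site and queue to spot inefficient workflows.
    print(jobs.groupby(["site", "queue"])["runtime_h"].agg(["count", "mean"]))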


Leverage your 8+ years of experience articulating business questions and using mathematical techniques to arrive at an answer using available data. 3-4 years of advanced Tableau experience is a must. Experience translating analysis results into business recommendations. Experience with statistical software (e.g., R, Python, MATLAB, pandas, Scala) and database languages like SQL. Experience with data warehousing concepts (Hadoop, MapR) and visualization tools (e.g. QlikView, Tableau, Angular, ThoughtSpot). Strong business acumen, critical thinking ability, and attention to detail.


Background in data science, applied mathematics, or computational science; a history of solving difficult problems using a scientific approach; and an MS or BS degree in a quantitative discipline (e.g., Statistics, Applied Mathematics, Operations Research, Computer Science, Electrical Engineering), with an understanding of how to design scientific studies. You should be familiar with the state of the art in machine learning, data modeling, forecasting and optimization techniques in a big data environment.



Data Analytics Software Test Engineer

As a member of the Corporate Engineering Services Group software test team, you will be responsible for testing various cutting edge data analytics products and solutions. You will be working with a dynamic engineering team to develop test plans, execute test plans, automate test cases, and troubleshoot and resolve issues.


Leverage your 1+ years of experience in the following:

  • Testing and systems validation for commercial software systems.
  • Testing of systems deployed in AWS Cloud.
  • Knowledge of SQL and databases.
  • Developing and implementing software and systems test plans.
  • Test automation development using Python or Java.
  • Strong problem solving and troubleshooting skills.
  • Experience in testing web-based and Android applications.
  • Familiar with Qualcomm QXDM and APEX tools.
  • Knowledge of software development in Python.
  • Strong written and oral communication skills.
  • Working knowledge of JIRA and GitHub is preferred.


Education:

  • Required: Bachelor's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering
  • Preferred: Master's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering or equivalent experience


Interested? Please send a resume to our Founder & CEO, Raj Dadlani at raj@huntech.com and he will respond to interested candidates within 24 hours of resume receipt. We are dealing with a highly motivated hiring manager and shortlisting viable candidates by February 22, 2019.

Vector Consulting, Inc
  • Atlanta, GA
 

Our Government client is looking for an experienced ETL Developer on a renewable contract in Atlanta, GA

Position: ETL Developer

The desired candidate will be responsible for the design, development, testing, maintenance and support of complex data extract, transformation and load (ETL) programs for an Enterprise Data Warehouse. An understanding of how complex data should be transformed from the source and loaded into the data warehouse is a critical part of this job (a toy illustration follows the list of responsibilities below).

  • Deep hands-on experience with OBIEE RPD and BIP reporting data models, and development for seamless cross-functional and cross-systems data reporting
  • Expertise and solid experience in BI tools: OBIEE, Oracle Data Visualization and Power BI
  • Strong Informatica technical knowledge in design, development and management of complex Informatica mappings, sessions and workflows using the Informatica Designer components: Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor
  • Strong programming and relational database skills, with expertise in advanced SQL and PL/SQL, indexing and query tuning
  • Experience implementing advanced analytical models in Python or R
  • Experienced in Business Intelligence and Data Warehousing concepts and methodologies
  • Extensive experience in data analysis and root cause analysis, and proven problem solving and analytical thinking capabilities
  • Analytical capability to slice and dice data and display it in reports for the best user experience
  • Demonstrated ability to review business processes and translate them into BI reporting and analysis solutions
  • Ability to follow the Software Development Lifecycle (SDLC) process and to work under any project management methodology used
  • Ability to follow best practices and standards
  • Ability to identify BI application performance bottlenecks and tune them
  • Ability to work quickly and accurately under pressure and project time constraints
  • Ability to prioritize workload and work with minimal supervision
  • Basic understanding of software engineering principles, and skills working on Unix/Linux/Windows operating systems with version control and office software
  • Exposure to data modeling using star/snowflake schema design, data marts, relational and dimensional data modeling, slowly changing dimensions, fact and dimension tables, physical and logical data modeling, and big data technologies
  • Experience with Big Data Lake / Hadoop implementations
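
Informatica does the heavy lifting in this role, but a toy Python version of the source-to-warehouse step makes the idea concrete; SQLite and every table name below are illustrative assumptions, not the client's schema:

    import sqlite3

    # Toy ETL: extract from a source table, transform, load into a fact table.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER, country TEXT)")
    src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(1, 1250, "us"), (2, 990, "de")])

    dw = sqlite3.connect(":memory:")
    dw.execute("CREATE TABLE fact_orders (order_id INTEGER, amount_usd REAL, country_code TEXT)")

    # Transform: cents -> dollars, normalize country codes, then load.
    for oid, cents, country in src.execute("SELECT id, amount_cents, country FROM orders"):
        dw.execute("INSERT INTO fact_orders VALUES (?, ?, ?)",
                   (oid, cents / 100.0, country.upper()))
    dw.commit()
    print(dw.execute("SELECT * FROM fact_orders").fetchall())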

 Required Qualifications:

  • A bachelor's degree in Computer Science or a related field
  • 6 to 10 years of experience working with OBIEE / Data Visualization / Informatica / Python
  • Ability to design and develop complex Informatica mappings, sessions, workflows and identify areas of optimizations
  • Experience with Oracle RDBMS 12c
  • Effective communication skills (both oral and written) and the ability to work effectively in a team environment are required
  • Proven ability and desire to mentor/coach others in a team environment
  • Strong analytical, problem solving and presentation skills.

Preferred Qualifications:

  • Working knowledge with Informatica Change Data Capture installed on DB2 z/OS
  • Working knowledge of Informatica Power Exchange
  • Experience with relational, multidimensional and OLAP techniques and technology
  • Experience with OBIEE tools version 10.X
  • Experience with Visualization tools like MS Power BI, Tableau, Oracle DVD
  • Experience with Python building predictive models

Soft Skills:

  • Strong written and oral communication skills in English
  • Ability to work with Business and communicate technical solution to solve business problems

About Vector:

Vector Consulting, Inc. (headquartered in Atlanta) is an IT Talent Acquisition Solutions firm committed to delivering results. Since our founding in 1990, we have been partnering with our customers, understanding their business, and developing solutions with a commitment to quality, reliability and value. Our continuing growth has been and continues to be built around successful relationships that are based on our organization's operating philosophy and commitment to People, Partnerships, Purpose and Performance - THE VECTOR WAY.

Brighter Brain
  • Atlanta, GA

Brighter Brain is seeking a skilled professional to serve as an internal resource for our consulting firm in the field of Data Science Development. Brighter Brain provides Fortune 500 clients throughout the United States with IT consultants in a wide-ranging technical sphere.

In order to fully support our incoming nationwide and international hires, we will be hiring a Senior Data Science SME (ML) with practical experience to relocate to Atlanta and coach/mentor our incoming classes of consultants. If you have a strong passion for the Data Science platform and are looking to join a seasoned team of IT professionals, this could be an advantageous next step.

Brighter Brain is an IT Management & Consulting firm providing a unique take on IT consulting. We currently offer expertise to US clients in the fields of Mobile Development (iOS and Android), Hadoop, Microsoft SharePoint, and Exchange/Office 365. We are currently seeking a highly skilled professional to serve as an internal resource for our company in the field of Data Science, with expertise in Machine Learning (ML).

The ideal candidate will be responsible for establishing our Data Science practice. The responsibilities include creating a comprehensive training program and training, mentoring, and supporting ideal candidates as they progress toward building their careers in Data Science consulting. This position is based out of our head office in Atlanta, GA.


The Senior Data Science SME will take on the following responsibilities:

-       Design, develop and maintain Data Science training material, focused on ML; knowledge of DL, NN and NLP is a plus.

-       Interview potential candidates to ensure that they will be successful in the Data Science domain and training.

-       Train, guide and mentor junior to mid-level Data Science developers.

-       Prepare mock interviews to enhance the learning process provided by the company.

-       Prepare and support consultants for interviews for specific assignments involving development and implementation of Data Science.

-       Act as a primary resource for individuals working on a variety of projects throughout the US.

-       Interact with our Executive and Sales team to ensure that projects and employees are appropriately matched.

The ideal candidate will not only possess solid knowledge of the realm, but must also be fluent in the following areas:

-       Hands-on expertise in applying Data Science and building machine learning and deep learning models

-       Statistics and data modeling experience

-       Strong understanding of data sciences

-       Understanding of Big Data

-       Understanding of AWS and/or Azure

-       Understand the differences between TensorFlow, MXNet, etc.

Skills Include:

  • Master's degree in computer science or mathematics
  • 10+ years of professional experience in the IT industry, in the AI realm
  • Strong understanding of MongoDB, Scala, Node.js, AWS, & cognitive applications
  • Excellent knowledge of Python, Scala, JavaScript and its libraries, Node.js, R, MATLAB, C/C++, Lua or any proficient AI language of choice
  • NoSQL databases, bot frameworks, data streaming and integrating unstructured data; rules engines (e.g. Drools) and ESBs (e.g. MuleSoft)
  • Computer Vision, Recommendation Systems, Pattern Recognition, Large Scale Data Mining or Artificial Intelligence, Neural Networks
  • Deep Learning frameworks like TensorFlow, Torch, Caffe, Theano, CNTK; scikit-learn, numpy, scipy
  • Working knowledge of ML techniques such as Naïve Bayes Classification, Ordinary Least Squares Regression, Logistic Regression, Support Vector Machines, Ensemble Methods, Clustering Algorithms, Principal Component Analysis, Singular Value Decomposition, and Independent Component Analysis (a short sketch follows this list)
  • Natural Language Processing (NLP) concepts such as topic modeling, intents, entities, and NLP frameworks such as SpaCy, NLTK, MeTA, gensim or other toolkits for Natural Language Understanding (NLU)
  • Experience with data profiling, data cleansing, data wrangling/munging, ETL
  • Familiarity with Spark MLlib and Mahout; Google, Bing, and IBM Watson APIs
  • Hands-on experience, as needed, with training a variety of consultants
  • Analytical and problem-solving skills
  • Knowledge of the IoT space
  • Understand academic Data Science vs. corporate Data Science
  • Knowledge of the consulting/sales structure
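
A short sketch of one technique named above (Naïve Bayes classification on text), assuming Python with scikit-learn; the corpus and labels are invented purely for illustration:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    # Tiny invented corpus for an intent-classification demo.
    texts = ["reset my password", "forgot password help",
             "cancel my order", "order cancellation request"]
    labels = ["account", "account", "orders", "orders"]

    # Bag-of-words features feeding a multinomial Naive Bayes classifier.
    vec = CountVectorizer()
    clf = MultinomialNB().fit(vec.fit_transform(texts), labels)
    print(clf.predict(vec.transform(["please cancel order"])))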

Additional details about the position:

-       Able to relocate to Atlanta, GA (relocation package available)

-       Work schedule of 9 AM to 6 PM EST

Questions: Send your resume to Ansel Butler at Brighter Brain; make sure that there is a valid phone number and Skype ID either on the resume, or in the body of the email.

Ansel Essic Butler

EMAIL: ANSEL.BUTLER@BRIGHTERBRAIN.COM

404 791 5128

SKYPE: ANSEL.BUTLER@OUTLOOK.COM

Senior Corporate Recruiter

Brighter Brain LLC.

1785 The Exchange, Suite 200

Atlanta, GA 30339

Expedia, Inc.
  • Bellevue, WA

What is the first thing you do while planning your travel? Do you want to work on a team that helps travelers like you go places and make our world more connected?

Expedia Flights team is the traffic powerhouse for the Expedia group and our flights shopping platform is one of the largest in the world serving over 150 million queries a day and powering some of the strongest brands in the industry like Orbitz, Expedia, Travelocity, Wotif, Hotwire and ebookers. 

Our technology operations are global, with representation in US, Mexico, Australia and India.


What makes Flights technology unique?



  • We are one of the few companies in the world that develop a proprietary flight search engine which is used by millions of users every single day

  • We are moving one of the world’s biggest flights platforms to AWS

  • We handle several hundred thousand booking transactions daily and connect with all major GDS partners you can think of in the world

  • We collect terabytes of flight data and are actively looking to use ML to show the right content to our customers


Expedia is looking for an extraordinary Distinguished Engineer to join the Flight Search Team.  Best Fare Search, Expedia’s proprietary flight search and pricing engine, performs complex manipulations on massive and highly volatile datasets to power airline flight shopping for millions of customers every single day.


You will have the opportunity to understand and shape the marketplace. This role will pursue extremely hard problems, craft solutions and make design decisions which can have a large impact across the company. The systems you design and implement will be expected to meet the levels of scalability and robustness needed for this high-volume and high-visibility product.


Bring your programming smarts, problem solving skills, and passion for software engineering and join us as we solidify and grow our position as the leaders in the travel industry.


What you’ll do: 



  • Lead, influence, and be a contributor across our entire technology team while acting as an area expert for your team and flight search services

  • Primary designer and architect for the continued evolution of Best Fare Search and flight search services for Expedia Group

  • Design high-performance, highly scalable, and reliable server applications in our data center and the cloud

  • Produce production quality code and have a strong eye for the operational aspects of the platform such as performance tuning, monitoring, and fault-tolerance

  • Design, interpret, analyze and work with large amounts of data to identify issues and patterns

  • Contribute to advancing the team’s design methodology and quality programming practices

  • Technical ownership of critical flight search systems and services from inception through operating in production


Who you are: 



  • Functional Expertise

  • At least 15 years of industry experience in a variety of contexts, during which you’ve built remarkably scalable, robust, and fault-tolerant systems

  • Expertise in solving large-scale flight search problems is a significant plus

  • Exceptional coding skills in C#, C++ or Java and proficiency with XML and SQL

  • Experience working in a cloud or virtual environment

  • Expertise with continuous integration/delivery and leveraging a dev ops mindset

  • Previous experience delivering data insights by querying datasets in a big data environment (Hadoop, SQL, AWS Aurora, S3, etc.) and performing real-time streaming analytics

  • Production focus: previous history of being hands on in solving critical production issues that affect our valued customers and drive those insights back into the product in true dev ops style

  • Knowledge of airline and/or global distribution systems (GDS) preferred


People Leadership

  • Inspiring and approachable as a leader
  • Create an environment where people can realize their full potential
  • Be humble and lead with open, candid relationships
  • Inspire peripheral relationships across Expedia Group
  • Passionate about engaging and developing talent; attract, develop, engage and retain talented individuals with a compelling, unifying vision that steers and motivates
  • Strong people skills and ability to successfully lead up, down, and across the organization
  • Demonstrated ability to mentor and grow more junior developers into strong, leading engineers
  • Proven capacity to establish trusted, effective relationships across diverse sets of partners


Additional Competencies



  • Natural bar-raiser: curious and passionate, with a desire to continuously learn more, which you use to understand basic business operations and the organizational levers that drive profitable growth

  • Bias to action, being familiar with methods and approaches needed to get things done in a collaborative, lean and fast-moving environment

  • Respond effectively to complex and ambiguous problems and situations

  • Lead mostly with questions rather than opinions, thriving on the opportunity to own, innovate, create, and constantly re-evaluate

  • Comfortable making recommendations across competing and equally critical business needs

  • Simplify, clearly and succinctly convey complex information and ideas to individuals at all levels of the organization

  • Motivated by goal achievement and continuous improvement, with the enthusiasm and drive to motivate your team and the wider organization



Why join us:
Expedia Group recognizes our success is dependent on the success of our people.  We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business.  Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so. 


Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach.  We relentlessly strive for better, but not at the cost of the customer.  We act with humility and optimism, respecting ideas big and small.  We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground, so we can act fast and stay simple.  Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.


If you have a hunger to make a difference with one of the most loved consumer brands in the world and to work in the dynamic travel industry, this is the job for you.


Our family of travel brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, lastminute.com.au®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, CarRentals.com™, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.



Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

Ultra Tendency
  • Berlin, Deutschland

Big Data Software Engineer


Lead your own development team and our customers to success! Ultra Tendency is looking for someone who convinces not just by writing excellent code, but also through strong presence and leadership. 


At Ultra Tendency you would:



  • Work in our office in Berlin/Magdeburg and on-site at our customers' offices

  • Make Big Data useful (build program code, test and deploy to various environments, design and optimize data processing algorithms for our customers)

  • Develop outstanding Big Data applications following the latest trends and methodologies

  • Be a role model and strong leader for your team and oversee the big picture

  • Prioritize tasks efficiently, evaluating and balancing the needs of all stakeholders


Ideally you have:



  • Strong experience in developing software using Python, Scala or a comparable language

  • Proven experience with data ingestion, analysis, integration, and design of Big Data applications using Apache open-source technologies

  • Profound knowledge of data engineering technologies, e.g. Kafka, Spark, HBase, Kubernetes

  • Strong background in developing on Linux

  • Solid computer science fundamentals (algorithms, data structures and programming skills in distributed systems)

  • Languages: fluent English; German is a plus


We offer:



  • Fascinating tasks and unique Big Data challenges of major players from various industries (automotive, insurance, telecommunication, etc.)

  • Fair pay and bonuses

  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • International diverse team

  • Possibility to work with the open-source community and become a contributor

  • Work with cutting edge equipment and tools


Confidentiality guaranteed

Webtrekk GmbH
  • Berlin, Deutschland
Your responsibilities:

In this role, you will build up your own full-fledged research and development team of developers and data science engineers. You will evaluate and choose appropriate technologies and develop products that are powered by Artificial Intelligence and Machine Learning.



  • Fast-paced development of experimental prototypes, POCs and products for our >400 customers

  • Manage fast feedback cycles, incorporate learnings and feedback, and ultimately deliver AI-powered products

  • You will develop new components and optimise existing ones, always with an eye on scalability, performance and maintenance

  • Organize and lead team planning meetings and provide advice, clarification and guidance during the execution of sprints

  • Lead your team's technical vision and drive the design and development of new innovative products and services from the technical side

  • Lead discussions with the team and management to define best practices and approaches

  • Set goals, objectives and priorities. Mentor team members and provide guidance by regular performance reviews.




The assets you bring to the team:


  • Hands on experience in agile software development on all levels based on profound technical understanding

  • Relevant experience in managing a team of software developers in an agile environment

  • At least 3 years of hands-on experience with developing in Frontend Technologies like Angular or React

  • Knowledge of backend technologies such as Java, Python or Scala is a big plus

  • Experience with distributed systems based on RESTful services

  • DevOps mentality and practical experience with tools for build and deployment automation (like Maven, Jenkins, Ansible, Docker)

  • Team and project-oriented leader with excellent problem solving and interpersonal skills

  • Excellent communication, coaching and conflict management skills as well as strong assertiveness

  • Strong analytical capability, discipline, commitment and enthusiasm

  • Fluent in English, German language skills are a big plus




What we offer:


  • Prospect: We are a continuously growing team with experts in the most future-oriented fields of customer intelligence. We are dealing with real big data scenarios and data from various business models and industries. Apart from interesting tasks we offer you considerable freedom for your ideas and perspectives for the development of your professional and management skills.

  • Team-oriented atmosphere: Our culture embraces integrity, teamwork and innovation. Our employees value the friendly atmosphere that is the most powerful driver within our company.

  • Goodies: Individual trainings, company tickets, team events, table soccer, fresh fruits and a sunny roof terrace.

  • TechCulture: Work with experienced developers who share the ambition for well-written and clean code. Choose your hardware, OS and IDE. Bring in your own ideas, work with open source and have fun at product demos, hackathons and meetups.

Citizens Advice
  • London, UK
  • Salary: £40k - 45k

As a Database engineer in the DevOps team here at Citizens Advice you will help us develop and implement our data strategy. You will have the opportunity to work with both core database technologies and big data solutions.


Past


Starting from scratch, we have built a deep tech-stack with AWS services at its core. We created a new CRM system, migrated a huge amount of data to AWS Aurora PG and used AWS RDS to run some of our business critical databases.


You will have gained a solid background and in-depth knowledge of AWS RDS and SQL/administration against DBMSs such as PostgreSQL / MySQL / SQL Server and DynamoDB / Aurora. You will have dealt with data warehousing, ETL, DB mirroring/replication, and DB security mechanisms and techniques.


Present


We use AWS RDS including Aurora as the standard DB implementation for our applications. We parse data in S3 using Spark jobs and we are planning to implement a data lake solution in AWS.


Our tools and technologies include:



  • Postgres on AWS RDS

  • SQL Server for our Data Warehouse

  • Liquibase for managing the DW schema

  • Jenkins 2 for task automation

  • Spark / Parquet / AWS Glue for parsing raw data (a minimal sketch follows this list)

  • Docker / docker-compose for local testing


You will be developing, supporting and maintaining automation tools to drive database, reporting and maintenance tasks.


As part of our internal engineering platform offering, R&D time will give you the opportunity to develop POC solutions to integrate with the rest of the business.


Future


You will seek continuous improvement and implement solutions to help Citizens Advice deliver digital products better and quicker.


You will be helping us implement a data lake solution to improve operations and to offer innovative services.


You will have dedicated investment time at Citizens Advice to learn new skills, technologies, research topics or work on tools that make this possible.

Man AHL
  • London, UK

The Role


As a Quant Platform Developer at AHL you will be building the tools, frameworks, libraries and applications which power our Quantitative Research and Systematic Trading. This includes responsibility for the continued success of “Raptor”, our in-house Quant Platform, next generation Data Engineering, and evolution of our production Trading System as we continually expand the markets and types of assets we trade, and the styles in which we trade them. Your challenges will be varied and might involve building new high performance data acquisition and processing pipelines, cluster-computing solutions, numerical algorithms, position management systems, visualisation and reporting tools, operational user interfaces, continuous build systems and other developer productivity tools.


The Team


Quant Platform Developers at AHL are all part of our broader technology team, members of a group of over sixty individuals representing eighteen nationalities. We have varied backgrounds including Computer Science, Mathematics, Physics, Engineering – even Classics - but what unifies us is a passion for technology and writing high-quality code.



Our developers are organised into small cross-functional teams, with our engineering roles broadly of two kinds: “Quant Platform Developers” otherwise known as our “Core Techs”, and “Quant Developers” which we often refer to as “Sector Techs”. We use the term “Sector Tech” because some of our teams are aligned with a particular asset class or market sector. People often rotate teams in order to learn more about our system, as well as find the position that best matches their interests.


Our Technology


Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. For storage, we rely heavily on MongoDB and Oracle.



We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, OpenStack for our private cloud, Ansible for architecture automation, and HipChat for internal communication. But our technology list is never static: we constantly evaluate new tools and libraries.
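
To give a flavour of the workflow tooling, here is a minimal illustrative Airflow DAG; the task and DAG IDs are invented, not our production pipeline, and it assumes Airflow 2.x:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_market_data():
        # Placeholder for a data-acquisition step.
        print("fetching market data...")

    # One daily task; purely a sketch of the DAG structure.
    with DAG(dag_id="market_data_daily",
             start_date=datetime(2019, 1, 1),
             schedule_interval="@daily",
             catchup=False) as dag:
        PythonOperator(task_id="load_market_data",
                       python_callable=load_market_data)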


Working Here


AHL has a small company, no-attitude feel. It is flat structured, open, transparent and collaborative, and you will have plenty of opportunity to grow and have enormous impact on what we do.  We are actively engaged with the broader technology community.



  • We host and sponsor London’s PyData and Machine Learning Meetups

  • We open-source some of our technology. See https://github.com/manahl

  • We regularly talk at leading industry conferences, and tweet about relevant technology and how we’re using it. See @manahltech



We’re fortunate enough to have a fantastic open-plan office overlooking the River Thames, and continually strive to make our environment a great place in which to work.



  • We organise regular social events, everything from photography through climbing, karting, wine tasting and monthly team lunches

  • We have annual away days and off-sites for the whole team

  • We have a canteen with a daily allowance for breakfast and lunch, and an on-site bar for in the evening

  • As well as PC’s and Macs, in our office you’ll also find numerous pieces of cool tech such as light cubes and 3D printers, guitars, ping-pong and table-football, and a piano.



We offer competitive compensation, a generous holiday allowance, various health and other flexible benefits. We are also committed to continuous learning and development via coaching, mentoring, regular conference attendance and sponsoring academic and professional qualifications.


Technology and Business Skills


At AHL we strive to hire only the brightest and best and most highly skilled and passionate technologists.



Essential



  • Exceptional technology skills; recognised by your peers as an expert in your domain

  • A proponent of strong collaborative software engineering techniques and methods: agile development, continuous integration, code review, unit testing, refactoring and related approaches

  • Expert knowledge in one or more programming languages, preferably Python, Java and/or C/C++

  • Proficient on Linux platforms with knowledge of various scripting languages

  • Strong knowledge of one or more relevant database technologies e.g. Oracle, MongoDB

  • Proficient with a range of open source frameworks and development tools e.g. NumPy/SciPy/Pandas, Pyramid, AngularJS, React

  • Familiarity with a variety of programming styles (e.g. OO, functional) and in-depth knowledge of design patterns.



Advantageous



  • An excellent understanding of financial markets and instruments

  • Experience of front office software and/or trading systems development e.g. in a hedge fund or investment bank

  • Expertise in building distributed systems with service-based or event-driven architectures, and concurrent processing

  • A knowledge of modern practices for data engineering and stream processing

  • An understanding of financial market data collection and processing

  • Experience of web based development and visualisation technology for portraying large and complex data sets and relationships

  • Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms.


Personal Attributes



  • Strong academic record and a degree with high mathematical and computing content e.g. Computer Science, Mathematics, Engineering or Physics from a leading university

  • Craftsman-like approach to building software; takes pride in engineering excellence and instils these values in others

  • Demonstrable passion for technology e.g. personal projects, open-source involvement

  • Intellectually robust with a keenly analytic approach to problem solving

  • Self-organised with the ability to effectively manage time across multiple projects and with competing business demands and priorities

  • Focused on delivering value to the business with relentless efforts to improve process

  • Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, traders and senior business people alike

  • Confident communicator; able to argue a point concisely and deal positively with conflicting views.

FELFEL
  • Zürich, Switzerland

FULL-STACK PYTHON DEVELOPER ANALYTICS


Start: As soon as possible


Location: Zürich, 100%


THE FELFEL TECHNOLOGY TEAM & YOUR IMPACT


FELFEL is a Start-up based in Zürich, Switzerland. The company has revolutionized how people eat at work with its intelligent technology or ‘the FELFEL fridge’. The company is among the fastest growing start-ups in Switzerland with over 60 employees.


FELFEL's technology team of seven engineers has built the heart of FELFEL's product and is responsible for the company's success as a leading European foodtech company. The team consists of front-end developers, back-end developers, data scientists, and a mobile developer.


As a team member of the technology team, you will work very directly with our founders, the CTO, our other teams (product, growth, sales, support, etc.) to improve the experience for our end-customers eating at work every day, as well as for our internal users of our technology.


WHAT WE OFFER:




  • Lots of good (like really good ;)) food: just steps away from your desk – we make all food dreams come true…
  • Modern way of working: we use the latest technologies and offer lots of time flexibility to individual team members (what you do matters, not when in the day you do it).
  • Beautiful office in Zürich, Switzerland: a large open space on the top floor with lots of room to hide or to socialize with other team members. And of course, there is free lunch every day at our long, wooden table. P.S. there is a Lausanne office too, where you can escape every now and then if you like…
  • International start-up atmosphere: the company language is English – our team is very diverse and consists of over 10 nationalities.
  • Work with direct impact: exciting backend challenges to solve that will have a very direct impact on our product.
  • Very little red tape: if you like to get things done using the latest in tech rather than talking about it, you'll feel right at home. You work directly with the CTO and other decision makers in the team (very little hierarchy).
  • Great place to work: we are a family business with strong values & beliefs – a great, warm-hearted team is waiting for you.
  • Relocation support if you are based outside of Switzerland: we know that moving can be a challenge in terms of housing & documentation – we will do our best to make it as smooth as possible.
  • Great team: we are very selective – and only hire the best chess players, cooks or video gamers ;)




YOUR ROLE:



  • You will develop and own part of the technical infrastructure for our data scientists to maintain high reliability and up-time

  • You will build and maintain live data visualizations and business intelligence tools to build a single source of truth for all of our teams to use on a daily basis

  • You will build and maintain customer e-mail engines to enhance engagement with our end-customers (a minimal sketch follows)
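
A minimal sketch of such an e-mail engine using only the Python standard library; the SMTP host, sender address and subject line are placeholders:

    import smtplib
    from email.message import EmailMessage

    def send_engagement_email(to_addr: str, body: str) -> None:
        """Send one engagement e-mail via a placeholder SMTP relay."""
        msg = EmailMessage()
        msg["Subject"] = "Your weekly FELFEL favourites"
        msg["From"] = "hello@example.com"
        msg["To"] = to_addr
        msg.set_content(body)
        with smtplib.SMTP("smtp.example.com", 587) as smtp:
            smtp.starttls()
            smtp.send_message(msg)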


WHO YOU ARE:



  • You are an experienced full-stack developer with broad programming skills - meaning you are able to build production-ready software that meets the needs of our data science team

  • Python has been your language of choice for a long time

  • You can architect and implement robust software that embraces change, and you understand distributed systems (microservice) architecture and APIs

  • You enjoy building tools for data visualization and reporting

  • You are highly interested in Machine Learning and/or Big Data

  • You possess a degree in Computer Science or related fields

  • You thrive in a collaborative environment involving different stakeholders and subject matter experts

  • We like to learn new things, fail fast and explore new territories - so do you

  • Your Attitude: ‘The Answer is Yes’

  • The Company Language is English - you are fluent in written and spoken English. If you speak German and/or French - even better. 

  • Authentic Love & Passion for good food

  • You never smell like cigarettes and don’t smoke during the day: FELFEL is about healthy living.

  • EU/EFTA or Swiss work permit.


HOW YOU APPLY:


Interested? Then please send your CV to hr@felfel.ch including a short motivation letter why you are the person who should take on this role.


Please make sure to mention the job title 'FULL-STACK PYTHON DEVELOPER ANALYTICS’ in the subject line!


WHO WE ARE, THE FELFEL COMPANY


FELFEL revolutionizes how people eat at work with smart technology.  The intelligent fridge 'FELFEL' makes it possible - good food all day long for employees at work by the best local chefs.


Over 30'000 employees in Switzerland already benefit from a FELFEL fridge at work in over 250 companies. 


FELFEL is a family-owned company and supports small, local producers in Switzerland. Sustainability is a core company value. The company has won several prestigious awards, among others the Swiss Economic Forum Award in 2017. 


The company was founded in 2013, and counts over 60 employees today. Friendship, respect and 'eating good food together' are key elements of its company culture. You can find more insights on our company and our team on our LinkedIn, Facebook and Instagram pages and of course our website.

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcastdx, you will research, model, develop, and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis, to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed and industry leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data in both real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualizations. Write-ups are due within a week of when a process is created and must be updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions meshing disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments
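
To make the streaming responsibility above concrete, here is a minimal sketch (not Comcast's actual code) of a PySpark Structured Streaming job that consumes a Kafka topic and maintains a windowed count; the broker address and topic name are hypothetical placeholders:

    # Minimal Kafka -> Spark Structured Streaming sketch; endpoints are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("event-pipeline").getOrCreate()

    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
              .option("subscribe", "events")                     # hypothetical topic
              .load())

    # Count events per one-minute window, using the timestamp Kafka attaches.
    counts = events.groupBy(F.window(F.col("timestamp"), "1 minute")).count()

    # Console sink for illustration; a real job would write to S3 or Elasticsearch.
    query = counts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()

In practice such a job also needs checkpointing and a watermark before it can write append-mode output to a durable sink like S3.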

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-5-8 years of Java experience; Scala and Python experience a plus

-3+ years of experience as an analyst, data scientist, or related quantitative role.

-3+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science or a related discipline; Master's degree preferred.

-Experience in software development of large-scale distributed systems, including a proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., machine learning)

-Distinctive problem-solving and analysis skills and impeccable business judgment.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcastdx:

Comcast dx is a results-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention of revealing business and operational insight, discovering actionable intelligence, enabling experimentation, empowering users, and delighting our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Comcast
  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As a Data Science Engineer in Comcast dx, you will research, model, develop, and support data pipelines and deliver insights for key strategic initiatives. You will develop or utilize complex programmatic and quantitative methods to find patterns and relationships in data sets; lead statistical modeling, or other data-driven problem-solving analysis to address novel or abstract business operation questions; and incorporate insights and findings into a range of products.

Assist in the design and development of collection and enrichment components focused on quality, timeliness, scale and reliability. Work on real-time data stores and a massive historical data store using best-of-breed, industry-leading technology.

Responsibilities:

-Develop and support data pipelines

-Analyze massive amounts of data in both real-time and batch processing utilizing Spark, Kafka, and AWS technologies such as Kinesis, S3, Elasticsearch, and Lambda (a short Lambda sketch follows this list)

-Create detailed write-ups of the processes, logic, and methodologies used for creation, validation, analysis, and visualizations. Write-ups are due within a week of a process being created and must be updated in writing when changes occur.

-Prototype ideas for new ML/AI tools, products and services

-Centralize data collection and synthesis, including survey data, enabling strategic and predictive analytics to guide business decisions

-Provide expert and professional data analysis to implement effective and innovative solutions that mesh disparate data types to discover insights and trends.

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Support DevOps practices to deploy and operate our systems

-Automate and streamline our operations and processes

-Troubleshoot and resolve issues in our development, test and production environments
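
As a rough illustration of the AWS side of the pipeline work above, here is a minimal sketch (not Comcast's actual code) of a Python Lambda handler consuming records from a Kinesis stream; the payload fields and the latency threshold are hypothetical:

    # Sketch of an AWS Lambda handler for a Kinesis event source.
    # The payload schema and threshold are illustrative assumptions.
    import base64
    import json

    def handler(event, context):
        anomalies = []
        for record in event.get("Records", []):
            # Kinesis delivers each payload base64-encoded.
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            if payload.get("latency_ms", 0) > 500:  # hypothetical threshold
                anomalies.append(payload)
        # A real pipeline might forward anomalies to S3 or Elasticsearch here.
        return {"anomalies_found": len(anomalies)}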

Here are some of the specific technologies and concepts we use:

-Spark Core and Spark Streaming

-Machine learning techniques and algorithms

-Java, Scala, Python, R

-Artificial Intelligence

-AWS services including EMR, S3, Lambda, ElasticSearch

-Predictive Analytics

-Tableau, Kibana

-Git, Maven, Jenkins

-Linux

-Kafka

-Hadoop (HDFS, YARN)

Skills & Requirements:

-3-5 years of Java experience; Scala and Python experience a plus

-2+ years of experience as an analyst, data scientist, or related quantitative role.

-2+ years of relevant quantitative and qualitative research and analytics experience. Solid knowledge of statistical techniques.

-Bachelor's in Statistics, Math, Engineering, Computer Science or a related discipline; Master's degree preferred.

-Experience in software development of large-scale distributed systems, including a proven track record of delivering backend systems that participate in a complex ecosystem

-Experience with more advanced modeling techniques (e.g., machine learning)

-Distinctive problem-solving and analysis skills and impeccable business judgment.

-Experience working with imperfect data sets that, at times, will require improvements to process, definition and collection

-Experience with real-time data pipelines and components including Kafka, Spark Streaming

-Proficient in Unix/Linux environments

-Test-driven development/test automation, continuous integration, and deployment automation

-Excellent communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly

-Team player is a must

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Attention to detail and high level of commitment

-Thrives in a fast-paced agile environment

About Comcastdx:

Comcast dx is a results-driven big data engineering team responsible for delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. dx has an overarching objective to gather, organize, and make sense of Comcast data with the intention of revealing business and operational insight, discovering actionable intelligence, enabling experimentation, empowering users, and delighting our stakeholders. Members of the dx team define and leverage industry best practices, work on large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Accenture
  • San Diego, CA
Job Title: Accenture Digital, Accenture Analytics, Data Science, Consultant
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.
People in our Client & Market career track drive profitable growth by developing market-relevant insights to increase market share or create new markets. They progress through required promotion into market-facing roles that have a direct impact on sales.
Accenture Analytics
Accenture Analytics delivers insight-driven outcomes at scale to help organizations improve performance. Our extensive capabilities range from accessing and reporting on data to advanced modeling, forecasting, and sophisticated statistical analysis. Specifically, we...
    • Help companies better understand consumers and how to connect with them across markets and channels, using data and analytics
    • Design and develop reusable analytical assets using advanced statistical and computational methods
    • Proactively monitor and analyze complex systems to understand, diagnose, and continuously improve key performance indicators
    • Pilot sophisticated advanced analytics & innovative analytics solutions to prove value
    • Work closely with business leaders to understand needs and assist with architecting and deploying innovative solutions
    • Operate at the frontier of innovative analytics; introduce and implement newest market developments & trends in analytics
    • Optimize Resources processes and integrate across the enterprise in order to maximize the opportunities of data, analytics, and outcomes
    • Build, deploy, maintain and scale advanced analytic solutions that reduce complexity and cost
    • Partner and team with technology solution providers to deliver the best solution meeting the needs of our clients
    • Offer our clients end to end solutions and services
Job Description
The Consultant is responsible for delivery on Resources analytics projects. This work requires the use of advanced statistical analysis in the Resources industry.
The Key Responsibilities Include
    • Client-facing interaction including providing analyses, recommendations, presentations and advice to clients.
    • Adapts existing methods and procedures to create possible alternative solutions to moderately complex problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Understands client use cases / user stories and maps them to solutions based on best practice.
    • Uses considerable judgment to determine solution and seeks guidance on complex problems.
    • Primary upward interaction is with direct supervisor. May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Project-based analytics including but not limited to: Machine Learning, Predictive Analytics, Comparative Effectiveness Analysis, Failure Analysis, Big Data Analytics, Optimization, Demand Forecasting, Customer Segmentation, Customer Analytic Record.
Basic Qualifications
    • Minimum of Bachelor's Degree required in related field; strong preference for fields of study in the data science, mathematics, economics, statistics, engineering and information management
    • Minimum of 3 years of delivery experience in an advanced modeling environment: strong understanding of statistical concepts and predictive modeling (e.g., AI neural networks, multi-scalar dimensional models, logistic regression techniques, machine-based learning, big data platforms, SQL, etc.; a toy modeling sketch follows this list)
    • Minimum 3 years experience with predictive analytics tools, including at least two of the following: R, SAS, Alteryx, Python, Spark, and Tableau
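As a toy illustration of the logistic-regression modeling named in the qualifications above (synthetic data standing in for a client response or churn model), a minimal scikit-learn sketch might look like:

    # Toy predictive-modeling sketch; the data is synthetic, not client data.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")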
Preferred Qualifications
    • Minimum of 2 years of experience with consulting or implementing transformational change
    • Experience in the analysis of marketing databases using SAS or other statistical modeling tools.
    • Experience in the following areas: Applied Statistics/Econometrics, Statistical Programming, Database Management & Operations, Digital, Comparative Effectiveness Research
    • Possess a blend of marketing acumen, consulting expertise and analytical capabilities that can create value and insights for our clients.
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Infosys
  • Houston, TX
Responsibilities

-Hands-on experience with Big Data systems, building ETL pipelines, data processing, and analytics tools
-Understanding of data structures & common methods in data transformation.
-Familiar with the concepts of dimensional modeling.
-Sound knowledge of one programming language - Python or Java
-Programming experience using tools such as Hadoop and Spark.
-Strong proficiency in using query languages such as SQL, Hive and SparkSQL (a short Spark SQL sketch follows this list)
-Experience in Kafka & Scala would be a plus
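
For a concrete picture of the query-language work above, here is a minimal Spark SQL sketch over a simple dimensional model; the fact and dimension table names are hypothetical:

    # Spark SQL over a toy star schema; table names are placeholders.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("dw-sketch")
             .enableHiveSupport()   # assumes a Hive metastore is available
             .getOrCreate())

    revenue_by_category = spark.sql("""
        SELECT d.category, SUM(f.amount) AS revenue
        FROM fact_sales f
        JOIN dim_product d ON f.product_id = d.product_id
        GROUP BY d.category
    """)
    revenue_by_category.show()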

Impetus
  • Phoenix, AZ

      Multiple Positions | Multiple Locations: Phoenix, AZ / Richmond, VA / Tampa, FL

      Employment Type :: Full time || Contract


      As a Big Data Engineer, you will have the opportunity to make a significant impact, both to our organization and those of our Fortune 500 clients. You will work directly with clients at the intersection of business and technology. You will leverage your experience with Hadoop and software engineering to help our clients use insights gleaned from their data to drive value.


      You will also be given substantial opportunities to develop and expand your technical skillset with emerging Big Data technologies so you can continually innovate, learn, and hit the gas pedal on your career.



Required:
  • 4+ years of IT experience
  • Very good experience in Hadoop, Hive, and Spark batch processing (streaming experience is good to have; a short batch-job sketch follows this list)
  • Good to have: experience with one NoSQL store (HBase/Cassandra)
  • Experience with Java/J2EE & Web Services; Scala/Python is good to have
  • AWS experience (ETL implementation with AWS on Hadoop) is good to have
  • Writing utilities/programs to enhance product capabilities and fulfill specific customer requirements
  • Learning new technologies/solutions to solve customer problems
  • Providing feedback/learnings to the product team
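
As a small example of the Hadoop/Spark batch work named above (paths and column names are hypothetical), a Spark batch job that cleans raw HDFS data and writes partitioned Parquet might look like:

    # Minimal Spark batch sketch; HDFS paths and column names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("batch-sketch").getOrCreate()

    raw = spark.read.csv("hdfs:///data/raw/events", header=True, inferSchema=True)
    cleaned = raw.dropna(subset=["event_date"])  # hypothetical required column

    (cleaned.write
            .mode("overwrite")
            .partitionBy("event_date")
            .parquet("hdfs:///data/curated/events"))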


Soft Skills:

    A team player who understands the roles and responsibilities of all the team members and facilitates a one-team culture
    Strong communication skills, both verbal and written
    A quick learner who can work independently on assigned tasks after initial hand-holding
Migo
  • Taipei, Taiwan

  • Responsibilities

    • Collaborate with data scientists to phase statistical, predictive machine learning and AI models into production at scale and continuously optimize performance.

    • Design, build, optimize, launch and support new and existing data models and ETL processes in production based on data products and stakeholder needs.

    • Define and manage SLAs and accuracy for all data sets in allocated areas of ownership.

    • Design and continuously improve data infrastructure; identify infrastructure issues and drive them to resolution.

    • Support the software development team in building and maintaining data collectors in the Migo application ecosystem based on data warehouse and analytics user requirements.





  • Basic Qualification:

    • Bachelor's degree in Computer Science, Information Management or related field.

    • 2+ years hands-on experience in the data warehouse space, custom ETL design, implementation and maintenance.

    • 2+ years hands-on experience in SQL or similar languages and development experience in at least one scripting language (Python preferred).

    • Strong data architecture, data modeling, schema design and effective project management skills.

    • Excellent communication skills and proven experience in leading data driven projects from definition through interpretation and execution.

    • Experience with large data sets and data profiling techniques.

    • Ability to initiate and drive projects, and communicate data warehouse plans to internal clients/stakeholders.





  • Preferred Qualification:

    • Experience with big data and distributed computing technology such as Hive, Spark, Presto, Parquet

    • Experience building and maintaining production-level data lakes with a Hadoop cluster or AWS S3.

    • Experience with batch processing and streaming data pipeline/architecture design patterns such as lambda architecture or kappa architecture (a toy serving-layer sketch follows).
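
To illustrate the lambda-architecture idea in the last bullet, here is a toy serving-layer sketch: a query merges a precomputed batch view with the speed layer's recent increments. The dictionaries are hypothetical stand-ins for real batch and streaming stores:

    # Toy lambda-architecture serving layer; the dicts are hypothetical stores.
    batch_view = {"user_42": 1000}   # counts up to the last batch recompute
    speed_layer = {"user_42": 7}     # counts arrived since that recompute

    def serve(key):
        # Serving merges the batch view with the real-time increment.
        return batch_view.get(key, 0) + speed_layer.get(key, 0)

    print(serve("user_42"))  # -> 1007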










Subsurface Consultants & Associates, LLC
  • Houston, TX

An oil and gas company is looking for a Machine Learning R&D Engineer to join their Agile team in west Houston. The team is working to create a prototype prediction tool, and the Machine Learning Engineer will be responsible for developing the code for the tool and training other team members on Machine Learning principles and practices.


Candidates should fulfill the following minimum qualifications (a toy modeling sketch follows the list):


  • Ph.D. in Computer Science or a related field (Mathematics, Statistics, Physics, or Electrical and/or Computer Engineering) with a focus on machine learning (deep learning)
  • 5 years of experience in Machine Learning
  • 15 years of experience in systems development and programming
  • Expert level Python 3 development skills
  • Advanced visualization using Python libraries
  • Experience with Big Data technologies and utilities
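As a toy sketch of the deep-learning prototyping this role involves (synthetic arrays standing in for subsurface measurements; the architecture is illustrative, not the team's actual model):

    # Minimal Keras regression sketch on synthetic data.
    import numpy as np
    from tensorflow import keras

    X = np.random.rand(256, 10).astype("float32")   # stand-in features
    y = X.sum(axis=1, keepdims=True)                # toy prediction target

    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu", input_shape=(10,)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, verbose=0)
    print(model.predict(X[:3]))
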
Catapult Systems
  • Houston, TX

High-performing team members. Challenging projects. A stable and profitable company. And a great place to work! This is what you can expect if you join the Catapult Systems team. Founded in 1993 and headquartered in Austin, Texas, Catapult is an award-winning Microsoft National Solution Provider and was recently named Microsoft Partner of the Year (U.S.) and Microsoft Partner of the Year Finalist in Cloud Productivity.


What do we attribute our award-winning success to? The people we hire, of course! We provide you the tools and leadership that you need to be successful, and then let you do what you do best. We enable you to make the decisions that you feel are in the best interest of our clients, and we trust your judgment. This type of ownership and independence, and an ongoing commitment to solving real business problems through the innovative use of Microsoft technologies, has resulted in Catapult being voted one of the best places to work year after year!


It is a very exciting time of growth for Catapult Systems, and we are currently hiring a Data Analytics Developer to provide technical leadership for our expanding team.

What will my role be?


As a Data Analytics Developer you will work with customers to identify opportunities and scenarios where Power BI and Azure Data Services can benefit their business. You will deliver short- and long-term projects utilizing strong business, technical and data modeling skills.

Responsibilities will include:

    • Working with customers to analyze business requirements and define functional specifications
    • Facilitating client meetings and strategy sessions
    • Providing pre-sales technical support by attending sales calls and creating demos for customers
    • Support and implementation of Data Analytics projects

What's required?

    • First and foremost, you should enjoy what you do and enjoy working in teams!
    • Ability to engage in customer settings and discern client business needs
    • Strong working knowledge and track record of Data Analytics development
    • 5+ years of experience with data sourcing, star schema & relational data modeling, ETL and processing
    • Expert-level knowledge of SSIS, SSAS, SSRS, Power BI and tools such as SSMS and SSDT
    • Experience supporting large scale analytical platforms
    • Experience designing automated processing, data validation, error checks and alerts, and performance testing techniques
    • Experience working with SQL Azure and cloud data solutions
    • 5+ years of experience with Microsoft SQL Server and proficiency in T-SQL (a small query sketch follows this list)
    • 1+ years of experience in migrating from on-prem to cloud (PaaS or IaaS)
    • Excellent presentation, verbal and written communication, and time management skills
    • Ability to travel up to 25%
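
For a concrete flavor of the SQL Server/T-SQL work above, here is a minimal Python sketch using pyodbc; the connection string and star-schema table names are placeholders, not a real environment:

    # Sketch: querying SQL Server with T-SQL from Python via pyodbc.
    # Connection details and table names are hypothetical.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example.database.windows.net;DATABASE=demo;"  # placeholder
        "UID=user;PWD=password"                               # placeholder
    )
    cursor = conn.cursor()
    cursor.execute("""
        SELECT d.Region, SUM(f.SalesAmount) AS Revenue
        FROM FactSales f
        JOIN DimGeography d ON f.GeographyKey = d.GeographyKey
        GROUP BY d.Region
    """)
    for region, revenue in cursor.fetchall():
        print(region, revenue)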

What else would make me stand out?

    • Previous consulting experience
    • Knowledge of database optimization techniques
    • Experience with Python and/or R
    • Proficiency in MDX and/or DAX queries
    • Experience with Microsoft Office 365 and cloud data solutions
    • Reporting experience with SharePoint and CRM
    • Relevant Microsoft Certifications and Non-Microsoft data platform certifications
    • Ability to work with cloud and hybrid environments
    • Good understanding of statistics
    • Knowledge in government analytics and policy objectives
    • Experience working with Big Data technologies and NoSQL
    • Multidimensional and Tabular Cube design, development, performance tuning and troubleshooting
    • Experience working with Data Visualization, Auditing, Data Validation, and Data Mining

So what are you waiting for? If you are passionate about being a leader and want to work with smart people who are committed to accomplishing great things, then apply today!

Catapult offers an outstanding benefits package including 401(k) match, paid time off, flex spending accounts, identity theft protection, and medical, dental, and life insurance just to name a few.

Catapult was recently named a Texas Monthly magazine Best Place to Work!