OnlyDataJobs.com

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The purpose of this role is to enable and support Citizen Data Scientists (CDS) to develop analytical workflows and to manage the adoption and implementation of the latest innovations within ConocoPhillips' preferred analytics tools for Citizen Data Science.
This position will enable analytics tools and solutions for customers, including: facilitation of the solution roadmap, adoption of new analytics functionality, integration between applications based on value-driven workflows, and support and training of users on new capabilities.
Responsibilities May Include
  • Work with customers to enable the latest data analytics capabilities
  • Understand and help implement the latest innovations available within ConocoPhillips' preferred analytics platforms, including Spotfire, Statistica, ArcGIS Big Data (Spatial Analytics), Teradata and Python
  • Help users with the implementation of analytics workflows through integration of the analytics applications
  • Manage analytics solutions roadmap and implementation timeline enabling geoscience customers to take advantage of the latest features or new functionality
  • Communicate with vendors and COP community on analytics technology functionality upgrades, prioritized enhancements and adoption
  • Test and verify that existing analytics workflows are supported within the latest version of the technology
  • Guide users on how to enhance their current workflows with the latest analytics technology
  • Facilitate problem solving with analytics solutions
  • Work with other AICOE teams to validate and implement new technology or version upgrades into production
Specific Responsibilities May Include
  • Provide architectural guidance for building integrated analytical solutions
  • Understand analytics product roadmaps, product development and the implementation of new features
  • Promote new analytics product features within the customer base and demonstrate how they enable analytics workflows
  • Manage the COP analytics product adoption roadmap
  • Capture the product enhancement list and coordinate prioritization with the vendor
  • Test new capabilities and map them to COP business workflows
  • Coordinate with the AICOE team the timely upgrades of new features
  • Provide support to CDS for:
    • analytics modelling best practices
    • know-how for implementation of analytics workflows based on new technology
  • Liaise with the AICOE Infrastructure team for timely technology upgrades
  • Work on day-to-day end-user support activities for Citizen Data Science tools: Advanced Spotfire, Statistica, GIS Big Data
  • Provides technical consulting and guidance to Citizen Data Scientist for the design and development of complex analytics workflows
  • Communicates analytics technology roadmap to end users
  • Communicates and demonstrates the value of new features to COP business
  • Train and mentor Citizen Data Scientists on analytics solutions
Basic/Required
  • Legally authorized to work in the United States
  • Bachelor's degree in Information Technology, Computer Sciences, Geoscience, Engineering, Statistics or related field
  • 5+ years of experience in oil & gas and geoscience data and workflows
  • 3+ years of experience with Tibco Spotfire
  • 3+ years of experience with Teradata or other SQL databases
  • 1+ years of experience with ArcGIS spatial analytics tools
  • Advanced knowledge of and experience with integration platforms
Preferred
  • Master's degree in Analytics or related field
  • 1+ years of experience with Tibco Statistica or equivalent statistics-based analytics package
  • Prior experience in implementing and supporting visual, prescriptive and predictive analytics
  • In-depth understanding of the analytics applications and integration points
  • Experience implementing data science workflows in Oil & Gas
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 27, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 13, 2019, 4:51:37 PM
ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Complete end-to-end delivery of data analytics solutions to the end user
  • Interacting closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gather business and technical specifications to support analytic, report and database development
  • Collect, analyze and translate user requirements into effective solutions
  • Build report and analytic prototypes based on initial business requirements
  • Provide status on the issues and progress of key business projects
  • Providing regular reporting on the performance of data analytics solutions
  • Delivering regular updates and maintenance on data analytics solutions
  • Championing the data analytics solutions and technologies at ConocoPhillips
  • Integrate data for data models used by the customers
  • Deliver data visualizations used for data-driven decision making
  • Provide strategic technology direction while supporting the needs of the business
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of related IT experience
  • 5+ years of Structured Query Language experience (ANSI SQL, T-SQL, PL/SQL)
  • 3+ years of hands-on experience delivering solutions with analytics tools (e.g., Spotfire, SSRS, Power BI, Tableau, Business Objects)
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years of Oil and Gas Industry experience
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions-oriented individual: learns quickly, understands complex problems, and applies useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience in analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 20, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 13, 2019, 4:56:49 PM
Burtch Works
  • Atlanta, GA

Our client in the Atlanta area is seeking a Director of Statistical Modeling to lead a team that will be developing scores and analytic products for their clients in the insurance industry. This group will work closely with other internal teams to develop best-in-class products and deliver statistical analytics of the highest caliber. You will also work closely with the sales organization to deliver results and present analytics as necessary. Strong leadership skills and business acumen are crucial to the success of this role.

Requirements:

Graduate degree in a quantitative field.

At least 10 years of experience in analytics.

At least 2 years of people management experience.

Experience with SAS, R, Python or equivalent analytic software.

Previous experience in an analytical leadership role as well as the ability to think strategically.

Thorough understanding of statistical methodologies including linear regression, logistic regression, CHAID/CART and neural networks.

Salary range up to the mid-$100Ks base, plus bonus. Great benefits.

KEYWORDS: statistical modeling, SAS, analytics, insurance, predictive modeling, regression, risk, R, Python, logistic regression, linear regression, neural networks, CHAID/CART

Accenture
  • Atlanta, GA
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology experts who are highly collaborative, taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end-users to produce high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum 2 plus years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Accenture
  • Raleigh, NC
Are you ready to step up to the New and take your technology expertise to the next level?
As part of our Data Business Group, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a highly collaborative and growing network of technology and data experts, who are taking on today's biggest, most complex business challenges using the latest data and analytics technologies. We will nurture your talent in an inclusive culture that values diversity. You will have an opportunity to work in roles such as Data Scientist, Data Engineer, or Chief Data Officer covering all aspects of Data including Data Management, Data Governance, Data Intelligence, Knowledge Graphs, and IoT. Come grow your career in Technology at Accenture!
Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture, and make delivering innovative work part of your extraordinary career.
People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.
As part of our Advanced Technology & Architecture (AT&A) practice, you will lead technology innovation for our clients through robust delivery of world-class solutions. You will build better software better! There will never be a typical day and that's why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing technology landscape. You will be part of a growing network of technology experts who are highly collaborative, taking on today's biggest, most complex business challenges. We will nurture your talent in an inclusive culture that values diversity. Come grow your career in technology at Accenture!
Job Description
Data and Analytics professionals define strategies, develop and deliver solutions that enable the collection, processing and management of information from one or more sources, and the subsequent delivery of information to audiences in support of key business processes.
    • Produce clean, standards-based, modern code with an emphasis on advocacy toward end-users to produce high-quality software designs that are well-documented.
    • Demonstrate an understanding of technology and digital frameworks in the context of data integration.
    • Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
Big Data professionals develop deep next generation Analytics skills to support Accenture's data and analytics agendas, including skills such as Data Modeling, Business Intelligence, and Data Management.
A professional at this position level within Accenture has the following responsibilities:
    • Utilizes existing methods and procedures to create designs within the proposed solution to solve business problems.
    • Understands the strategic direction set by senior management as it relates to team goals.
    • Contributes to design of solution, executes development of design, and seeks guidance on complex technical challenges where necessary.
    • Primary upward interaction is with direct supervisor.
    • May interact with peers, client counterparts and/or management levels within Accenture.
    • Understands methods and procedures on new assignments and executes deliverables with guidance as needed.
    • May interact with peers and/or management levels at a client and/or within Accenture.
    • Determines methods and procedures on new assignments with guidance.
    • Decisions often impact the team in which they reside.
    • Manages small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Basic Qualifications
    • Minimum 2 plus years of hands-on technical experience implementing or supporting Big Data solutions utilizing Hadoop.
    • Experience developing solutions utilizing at least two of the following:
      • Kafka-based streaming services
      • R Studio
      • Cassandra, MongoDB
      • MapReduce, Pig, Hive
      • Scala, Spark
      • Knowledge of Jenkins, Chef, Puppet
Preferred Qualifications
    • Hands-on technical experience utilizing Python
    • Full life cycle development experience
    • Experience with delivering Big Data Solutions in the cloud with AWS or Azure
    • Ability to configure and support API and OpenSource integrations
    • Experience administering Hadoop or other Data Science and Analytics platforms using the technologies above
    • Experience with DevOps support
    • Designing ingestion, low-latency, and visualization clusters to sustain data loads and meet business and IT expectations
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
State Farm
  • Atlanta, GA

WHAT ARE THE DUTIES AND RESPONSIBILITIES OF THIS POSITION?

    • Performs improved visual representation of data to allow clearer communication, viewer engagement and faster/better decision-making
    • Investigates, recommends, and initiates acquisition of new data resources from internal and external sources
    • Works with IT teams to support data collection, integration, and retention requirements based on business need
    • Identifies critical and emerging technologies that will support and extend quantitative analytic capabilities
    • Manages work efforts which require the use of sophisticated project planning techniques
    • Applies a wide application of complex principles, theories and concepts in a specific field to provide solutions to a wide range of difficult problems
    • Develops and maintains an effective network of both scientific and business contacts/knowledge, obtaining relevant information and intelligence around the market and emergent opportunities
    • Contributes data to State Farm's internal and external publications, writes articles for leading journals and participates in academic and industry conferences
    • Collaborates with business subject matter experts to select relevant sources of information
    • Develop breadth of knowledge in programming (R, Python); descriptive, inferential, and experimental design statistics; advanced mathematics; and database functionality (SQL, Hadoop)
    • Develop expertise with multiple machine learning algorithms and data science techniques, such as exploratory data analysis, generative and discriminative predictive modeling, graph theory, recommender systems, text analytics, computer vision, deep learning, optimization and validation
    • Develop expertise with State Farm datasets, data repositories, and data movement processes
    • Assists on projects/requests and may lead specific tasks within the project scope
    • Prepares and manipulates data for use in development of statistical models
    • Develops fundamental understanding of insurance and financial services operations and uses this knowledge in decision making


Additional Details:

For over 95 years, data has been key to State Farm. As a member of our data science team within the Enterprise Data & Analytics department under our Chief Data & Analytics Officer, you will work across the organization to solve business problems and help achieve business strategies. You will employ sophisticated statistical approaches and state-of-the-art technology. You will build and refine our tools/techniques and engage with internal stakeholders across the organization to improve our products & services.


Implementing solutions is critical for success. You will do problem identification, solution proposal & presentation to a wide variety of management & technical audiences. This challenging career requires you to work on multiple concurrent projects in a community setting, developing yourself and others, and advancing data science both at State Farm and externally.


Skills & Professional Experience

·        Develop hypotheses, design experiments, and test feasibility of proposed actions to determine probable outcomes using a variety of tools & technologies

·        Master's or other advanced degree, or five years' experience in an analytical field such as data science, quantitative marketing, statistics, operations research, management science, industrial engineering, economics, etc., or equivalent practical experience preferred.

·        Experience with SQL, Python, R, Java, SAS or MapReduce, Spark

·        Experience with unstructured data sets: text analytics, image recognition etc.

·        Experience working with numerous large data sets/data warehouses & the ability to pull from such data sets using relevant programs & coding, including files, RDBMS & Hadoop-based storage systems

·        Knowledge of machine learning methods, including at least one of the following: time series analysis, Hierarchical Bayes, or learning techniques such as Decision Trees, Boosting, and Random Forests.

·        Excellent communication skills and the ability to manage multiple diverse stakeholders across businesses & leadership levels.

·        Exercise sound judgment to diagnose & resolve problems within area of expertise

·        Familiarity with CI/CD development methods, Git and Docker a plus


Multiple location opportunity. Locations offered are: Atlanta, GA; Bloomington, IL; Dallas, TX; and Phoenix, AZ.


Remote work option is not available.


There is no sponsorship for an employment visa for the position at this time.


Competencies desired:
Critical Thinking
Leadership
Initiative
Resourcefulness
Relationship Building
State Farm
  • Dallas, TX

WHAT ARE THE DUTIES AND RESPONSIBILITIES OF THIS POSITION?

    • Performs improved visual representation of data to allow clearer communication, viewer engagement and faster/better decision-making
    • Investigates, recommends, and initiates acquisition of new data resources from internal and external sources
    • Works with IT teams to support data collection, integration, and retention requirements based on business need
    • Identifies critical and emerging technologies that will support and extend quantitative analytic capabilities
    • Manages work efforts which require the use of sophisticated project planning techniques
    • Applies a wide application of complex principles, theories and concepts in a specific field to provide solutions to a wide range of difficult problems
    • Develops and maintains an effective network of both scientific and business contacts/knowledge, obtaining relevant information and intelligence around the market and emergent opportunities
    • Contributes data to State Farm's internal and external publications, writes articles for leading journals and participates in academic and industry conferences
    • Collaborates with business subject matter experts to select relevant sources of information
    • Develop breadth of knowledge in programming (R, Python); descriptive, inferential, and experimental design statistics; advanced mathematics; and database functionality (SQL, Hadoop)
    • Develop expertise with multiple machine learning algorithms and data science techniques, such as exploratory data analysis, generative and discriminative predictive modeling, graph theory, recommender systems, text analytics, computer vision, deep learning, optimization and validation
    • Develop expertise with State Farm datasets, data repositories, and data movement processes
    • Assists on projects/requests and may lead specific tasks within the project scope
    • Prepares and manipulates data for use in development of statistical models
    • Develops fundamental understanding of insurance and financial services operations and uses this knowledge in decision making


Additional Details:

For over 95 years, data has been key to State Farm. As a member of our data science team within the Enterprise Data & Analytics department under our Chief Data & Analytics Officer, you will work across the organization to solve business problems and help achieve business strategies. You will employ sophisticated statistical approaches and state-of-the-art technology. You will build and refine our tools/techniques and engage with internal stakeholders across the organization to improve our products & services.


Implementing solutions is critical for success. You will do problem identification, solution proposal & presentation to a wide variety of management & technical audiences. This challenging career requires you to work on multiple concurrent projects in a community setting, developing yourself and others, and advancing data science both at State Farm and externally.


Skills & Professional Experience

·        Develop hypotheses, design experiments, and test feasibility of proposed actions to determine probable outcomes using a variety of tools & technologies

·        Master's or other advanced degree, or five years' experience in an analytical field such as data science, quantitative marketing, statistics, operations research, management science, industrial engineering, economics, etc., or equivalent practical experience preferred.

·        Experience with SQL, Python, R, Java, SAS or MapReduce, Spark

·        Experience with unstructured data sets: text analytics, image recognition etc.

·        Experience working with numerous large data sets/data warehouses & the ability to pull from such data sets using relevant programs & coding, including files, RDBMS & Hadoop-based storage systems

·        Knowledge of machine learning methods, including at least one of the following: time series analysis, Hierarchical Bayes, or learning techniques such as Decision Trees, Boosting, and Random Forests.

·        Excellent communication skills and the ability to manage multiple diverse stakeholders across businesses & leadership levels.

·        Exercise sound judgment to diagnose & resolve problems within area of expertise

·        Familiarity with CI/CD development methods, Git and Docker a plus


Multiple location opportunity. Locations offered are: Atlanta, GA; Bloomington, IL; Dallas, TX; and Phoenix, AZ.


Remote work option is not available.


There is no sponsorship for an employment visa for the position at this time.


Competencies desired:
Critical Thinking
Leadership
Initiative
Resourcefulness
Relationship Building
Riccione Resources
  • Dallas, TX

Sr. Data Engineer: Hadoop, Spark, Data Pipelines, Growing Company

One of our clients is looking for a Sr. Data Engineer in the Fort Worth, TX area! Build your data expertise with projects centering on large Data Warehouses and new data models! Think outside the box to solve challenging problems! Thrive in the variety of technologies you will use in this role!

Why should I apply here?

    • Culture built on creativity and respect for engineering expertise
    • Nominated as one of the Best Places to Work in DFW
    • Entrepreneurial environment, growing portfolio and revenue stream
    • One of the fastest growing mid-size tech companies in DFW
    • Executive management with past successes in building firms
    • Leader of its technology niche, setting the standards
    • A robust, fast-paced work environment
    • Great technical challenges for top-notch engineers
    • Potential for career growth, emphasis on work/life balance
    • A remodeled office with a bistro, lounge, and foosball

What will I be doing?

    • Building data expertise and owning data quality for the transfer pipelines that you create to transform and move data to the company's large Data Warehouse
    • Architecting, constructing, and launching new data models that provide intuitive analytics to customers
    • Designing and developing new systems and tools to enable clients to optimize and track advertising campaigns
    • Using your expert skills across a number of platforms and tools such as Ruby, SQL, Linux shell scripting, Git, and Chef
    • Working across multiple teams in high visibility roles and owning the solution end-to-end
    • Providing support for existing production systems
    • Broadly influencing the company's clients and internal analysts

What skills/experiences do I need?

    • B.S. or M.S. degree in Computer Science or a related technical field
    • 5+ years of experience working with Hadoop and Spark
    • 5+ years of experience with Python or Ruby development
    • 5+ years of experience with efficient SQL (Postgres, Vertica, Oracle, etc.)
    • 5+ years of experience building and supporting applications on Linux-based systems
    • Background in engineering Spark data pipelines
    • Understanding of distributed systems

What will make my résumé stand out?

    • Ability to customize an ETL or ELT
    • Experience building an actual data warehouse schema

Location: Fort Worth, TX

Citizenship: U.S. citizens and those authorized to work in the U.S. are encouraged to apply. This company is currently unable to provide sponsorship (e.g., H1B).

Salary: $115-130K + 401(k) match

---------------------------------------------------


~SW1317~

Apex Clearing Corporation
  • Dallas, TX

Business Analyst  

Dallas, TX


Apex Clearing was brought to life by the idea that the development of integrative technology can improve how business is done. The passion for that idea powers our firm still. As a Business Analyst at Apex you are free to explore unique solutions and try fresh ideas that may benefit our financial services business. You'll collaborate with Apex's sharpest minds to use data to drive business decision making. Working at Apex Clearing means that you'll always be presented with a variety of new possibilities as you continue to enhance your skills.

Were looking for someone who:

  • Is passionate. You have a genuine passion for data, modeling and analysis to drive decision making across all levels of the business. You love using technology differently to maximize insight and business impact and you have a way of bringing out that same fire in the people you work with
  • Is motivated. You're driven to be the best, whether that's decreasing onboarding time or making an innovative change to how it's always been done. You challenge yourself by setting goals and exceeding them
  • Is collaborative. You're excited to work with fellow data junkies and big thinkers. You know how to collaborate across the organization and communicate with less technical team members to find a solution to pain points
  • Wants to make an impact. You're looking to do amazing work. You're all about using technology to improve efficiency and affect the company's bottom line.

What you'll do all day:

  • Defining and analyzing data that will generate insights and help support our core business
  • Building state of the art models
  • Acting as a thought leader for the firm with your ideas and projects, enabling data-driven decision making
  • Helping to define best practices within our analytics team to increase efficiency and thus create even more business impact.
  • Collaborating with Analytics team members and people across the business to get the best out of the collective team

A few reasons why you might love us:

  • We're a leader in the space. Apex is recognized for disrupting the financial services industry, enabling fintech standouts like Stash, Robinhood and Betterment. We've got an amazing track record of success and we foster ongoing innovation. So you get all the benefits of a proven, growing company, while enjoying a very entrepreneurial culture
  • We see data analytics differently. Youll work with people who apply innovative approaches to analysis and modeling. We are passionate engineers dedicated to finding new and different ways to use technology and data to drive the business forward.
  • Your work will have immediate impact. You'll be able to see your direct impact on our business. You won't be just another talented analyst chained to a desk or ticket queue.

And a few reasons why you may not like working for us:

  • You don't like change. This is not a job for someone who likes the predictable. We embrace and drive the business toward needed change for our future success.
  • You're not the collaborative type. We work together to ensure the best possible solutions for the business. We think two brains are better than one, so we do most of our work together and often with other departments. Teamwork makes the dream work on this team.
  • You're not a people person. We work with people across the organization to understand the data from a business perspective and ensure our results will be actionable by Apex.

The skills you'll need to succeed:

  • 2-4 years working as a Business Analyst, Quantitative Analyst or comparable experience
  • Experience with MS Excel
  • Experience with SQL and Python or R
  • Experience with relational databases
  • Experience with statistical analysis techniques
  • Familiarity with Sisense, Tableau or similar Data visualization tools
Signify Health
  • Dallas, TX

Position Overview:

Signify Health is looking for a savvy Data Engineer to join our growing team of deep learning specialists. This position would be responsible for evolving and optimizing data and data pipeline architectures, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will support software developers, database architects, data analysts, and data scientists. The ideal candidate would be self-directed, passionate about optimizing data, and comfortable supporting the data wrangling needs of multiple teams, systems and products.

If you enjoy providing expert-level IT technical services, including the direction, evaluation, selection, configuration, implementation, and integration of new and existing technologies and tools while working closely with IT team members, data scientists, and data engineers to build our next generation of AI-driven solutions, we will give you the opportunity to grow personally and professionally in a dynamic environment. Our projects are built on cooperation and teamwork, and you will find yourself working together with other talented, passionate and dedicated team members, all working towards a shared goal.

Essential Job Responsibilities:

  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing data models for greater scalability, etc.
  • Leverage Azure for extraction, transformation, and loading of data from a wide variety of data sources in support of AI/ML Initiatives
  • Design and implement high-performance data pipelines for distributed systems and data analytics for deep learning teams (see the sketch after this list)
  • Create tool-chains for analytics and data scientist team members that assist them in building and optimizing AI workflows
  • Work with data and machine learning experts to strive for greater functionality in our data and model life cycle management capabilities
  • Communicate results and ideas to key decision makers in a concise manner
  • Comply with applicable legal requirements, standards, policies and procedures including, but not limited to, compliance requirements and HIPAA.
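
As a rough illustration of the pipeline work described in the list above, here is a minimal sketch of one extract-transform-load step written in PySpark. The storage paths, column names, and cleansing rules are assumptions made up for illustration; they are not Signify Health's actual data model or architecture.

```python
# Hypothetical ETL step: land raw JSON, normalize it, write curated Parquet.
# Paths and column names are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: raw visit records dropped by an upstream ingestion job (assumed Azure Data Lake path).
raw = spark.read.json("abfss://raw@example.dfs.core.windows.net/visits/")

# Transform: normalize types, drop duplicates, filter incomplete records.
visits = (
    raw.withColumn("visit_date", F.to_date("visit_ts"))
       .dropDuplicates(["visit_id"])
       .filter(F.col("member_id").isNotNull())
)

# Load: write partitioned Parquet for downstream analytics and ML feature builds.
(visits.write
       .mode("overwrite")
       .partitionBy("visit_date")
       .parquet("abfss://curated@example.dfs.core.windows.net/visits/"))
```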


Qualifications:Education/Licensing Requirements:
  • High school diploma or equivalent.
  • Bachelor's degree in Computer Science, Electrical Engineering, Statistics, Informatics, Information Systems, or another quantitative field, or equivalent work experience.


Experience Requirements:
  • 5+ years of experience in a Data Engineer role.
  • Experience using the following software/tools preferred:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with AWS or Azure cloud services.
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C#, etc.
  • Strong work ethic, able to work both collaboratively and independently without a lot of direct supervision, and solid problem-solving skills
  • Must have strong communication skills (written and verbal), and possess good one-on-one interpersonal skills.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable big data stores.
  • 2 years of experience in data modeling, ETL development, and Data warehousing
 

Essential Skills:

  • Fluently speak, read, and write English
  • Fantastic motivator and leader of teams with a demonstrated track record of mentoring and developing staff members
  • Strong point of view on who to hire and why
  • Passion for solving complex system and data challenges and desire to thrive in a constantly innovating and changing environment
  • Excellent interpersonal skills, including teamwork and negotiation
  • Excellent leadership skills
  • Superior analytical abilities, problem solving skills, technical judgment, risk assessment abilities and negotiation skills
  • Proven ability to prioritize and multi-task
  • Advanced skills in MS Office

Essential Values:

  • In Leadership: Do what's right, even if it's tough
  • In Collaboration: Leverage our collective genius, be a team
  • In Transparency: Be real
  • In Accountability: Recognize that if it is to be, it's up to me
  • In Passion: Show commitment in heart and mind
  • In Advocacy: Earn trust and business
  • In Quality: Ensure what we do, we do well
Working Conditions:
  • Fast-paced environment
  • Requires working at a desk and use of a telephone and computer
  • Normal sight and hearing ability
  • Use office equipment and machinery effectively
  • Ability to ambulate to various parts of the building
  • Ability to bend, stoop
  • Work effectively with frequent interruptions
  • May require occasional overtime to meet project deadlines
  • Lifting requirements of
DISYS
  • Minneapolis, MN
Client: Banking/Financial Services
Location: 100% Remote
Duration: 12 month contract-to-hire
Position Title: NLU/NLP Predictive Modeling Consultant


***Client requirements will not allow OPT/CPT candidates for this position, or any other visa type requiring sponsorship. 

This is a new team within the organization set up specifically to perform analyses and gain insights into the "voice of the customer" through the following activities:
Review inbound customer emails, phone calls, survey results, etc.
Review data that is unstructured "natural language" text and speech data
Maintain focus on customer complaint identification and routing
Build machine learning models to scan customer communication (emails, voice, etc)
Identify complaints from non-complaints.
Classify complaints into categories
Identify escalated/high-risk complaints, e.g. claims of bias, discrimination, bait/switch, lying, etc...
Ensure routed to appropriate EO for special

Responsible for:
Focused on inbound retail (home mortgage/equity) emails
Email cleansing: removal of extraneous information (disclaimers, signatures, headers, PII)
Modeling: training models using state-of-the-art techniques (see the sketch after this list)
Scoring: "productionalizing" models to be consumed by the business
Governance: model documentation and Q/A with model risk group.
Implementation of model monitoring processes
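
To make the cleanse/train/score loop above concrete, the following is a minimal, hypothetical sketch using pandas and scikit-learn. The file name, column names (`email_body`, `is_complaint`), cleansing rules, and the TF-IDF plus logistic-regression model are illustrative assumptions, not the client's actual data or modeling stack.

```python
# Hypothetical sketch of the cleanse -> train -> score workflow described above.
import re
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn.pipeline import Pipeline

def cleanse(text: str) -> str:
    """Strip boilerplate disclaimers and mask simple PII patterns before modeling."""
    text = re.split(r"(?i)confidentiality notice|this e-?mail and any attachments", text)[0]
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "<SSN>", text)   # mask SSN-like numbers
    text = re.sub(r"\S+@\S+", "<EMAIL>", text)               # mask email addresses
    return text.strip()

# Assumed labeled data: one row per inbound email with a complaint / non-complaint flag.
emails = pd.read_csv("labeled_emails.csv")            # columns: email_body, is_complaint
emails["clean_body"] = emails["email_body"].map(cleanse)

X_train, X_test, y_train, y_test = train_test_split(
    emails["clean_body"], emails["is_complaint"], test_size=0.2, random_state=42
)

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=5)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Scoring ("productionalizing") and monitoring would track metrics like these over time.
print(classification_report(y_test, model.predict(X_test)))
```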

Desired Qualifications:
Real-world experience building/deploying predictive models, any industry (must)
SQL background (must)
Self-starter, able to excel in fast-paced environment w/o a ton of direction (must)
Good communication skills (must)
Experience in text/speech analytics (preferred)
Python, SAS background (preferred)
Linux (nice to have)
Spark (Scala or PySpark) (nice to have)

Sentek Global
  • San Diego, CA

Sentek Global is seeking a Software Engineer to provide support to PMW 150 in San Diego, CA!


Responsibilities
  • Design, build and maintain software, develop software infrastructure and development environments, and transition older products and capabilities to the new architectures.
  • Produce effective and powerful solutions to complex problems in areas such as software engineering, data analytics, automation, and cybersecurity.
  • Perform analysis of existing and emerging operational and functional requirements to support the current and future systems capabilities and requirements.
  • Provide technical expertise, guidance, architecture, development and support in many different technologies directly to government customers.
  • Perform schedule planning and program management tasks as required.
  • Perform Risk Analysis for implementation of program requirements.
  • Assist in the development of requirements documents.
  • Other duties as required.


Qualifications
  • A current active secret clearance is required to be considered for this role.
  • A Bachelor's degree in data science, data analytics, computer science, or a related technical discipline is required.
  • Three to five (3-5) years providing software engineering support to a DoD program office.
  • Experience working with data rich problems through research or programs.
  • Experience with computer programming or user experience/user interface.
  • Demonstrated knowledge completing projects with large or incomplete data and ability to recommend solutions.
  • Experience with Machine Learning algorithms including convolutional neural networks (CNN), regression, classification, clustering, etc.
  • Experience using deep learning frameworks (preferably TensorFlow).
  • Experience designing and developing professional software using Linux, Python, C++, JAVA, etc.
    • Experience applying Deep/Machine Learning technology to solve real-world problems:
    • Selecting features, building and optimizing classifiers using machine learning techniques.
    • Data mining using state-of-the-art methods.
    • Extending the company's data with third-party sources of information when needed.
    • Enhancing data collection procedures to include information that is relevant for building analytic systems.
  • Experience processing, cleansing, and verifying the integrity of data used for analysis.
  • Experience performing ad-hoc analyses and presenting results in a clear manner.
  • Experience creating automated anomaly detection systems and constantly tracking their performance.
  • Must be able to travel one to three (1-3) times per year.
ettain group
  • Raleigh, NC

Role: R/S Network Engineer

Pay: $50-60/hr

Location: Raleigh, NC (some flexibility with remote work after initial ramp-up)

18 month contract


Who You'll Work With:

The POV Services Team (dCloud, CPOC, CXC, etc) provides services, tools and content for Cisco field sales and channel partners, enabling them to highlight Cisco solutions and technologies to customers.

What You'll Do

As a Senior Engineer, you are responsible for the development, delivery, and support of a wide range of Enterprise Networking content and services for Cisco Internal, Partner and Customer audiences.

Content Roadmap, Design and Project Management 25%

  • You will document and scope all projects prior to entering project build phase.
  • Youll work alongside our platform/automation teams to review applicable content to be hosted on Cisco dCloud.
  • You specify and document virtual and hardware components, resources, etc. required for content delivery.
  • You can identify and prioritize all project-related tasks while working with the Project Manager to develop a timeline, with high expectations to meet project deadlines.
  • You will successfully collaborate and work with a globally-dispersed team using collaboration tools, such as email, instant messaging (Cisco Jabber/Spark), and teleconferencing (WebEx and/or TelePresence).

Content Engineering and Documentation 30%

  • Document device connectivity requirements of all components (virtual and HW) and build as part of pre-work.
  • Work with the NetOps team on racking, cabling, imaging, and access required for the content project.
  • As part of the development cycle, the developer will work collaboratively with the business unit technical marketing engineers (TME) and WW EN Sales engineers to configure solution components, including Cisco routers, switches, wireless LAN controllers (WLC), SD-Access, DNA Center, Meraki, SD-WAN (Viptela), etc.
  • Work with BU, WW EN Sales and marketing resources to draft, test and troubleshoot compelling demo/lab/story guides that contribute to the field sales teams and generate high interest and utilization.
  • Work with POV Services Technical Writer to format/edit/publish content and related documents per POV Services standards.
  • Work as the liaison to the operations and support teams to resolve issues identified during the development and testing process, providing technical support and making design recommendations for fixes.
  • Perform resource studies using VMware vCenter to ensure an optimal balance of content performance, efficiency and stability before promoting/publishing production content.

Content Delivery 25%

  • SD-Access POV, SD-WAN POV Presentations, Webex and Video recordings, TOI, SE Certification Proctor, etc.
  • Engage customers at the customer location, at a Cisco office, or remotely to deliver Proof of Value content, and deliver Test Drive and/or Technical Solutions Workshop content at Cisco offices.
  • Deliver training, TOI, and presentations at events (Cisco Live, GSX, SEVT, Partner VT, etc).
  • Work with the POV Services owners, architects, and business development team to market, train, and increase global awareness of new/revised content releases.

Support and Other 20%

  • You provide transfer of information and technical support to Level 1 & 2 support engineers, program managers and others ensuring that content is understood and in working order.
  • You will test and replicate issues, isolate the root cause, and provide timely workarounds and/or short/long term fixes.
  • You will monitor support trends for assigned content, and track and log critical issues effectively using Jira.
  • You provide Level 3 user support directly/indirectly to Cisco and Partner sales engineers while supporting and mentoring peer/junior engineers as required.

Who You Are

  • You are well versed in the use of standard design templates and tools (Microsoft Office including Visio, Word, Excel, PowerPoint, and Project).
  • You bring an uncanny ability to multitask between multiple projects, user support, training, events, etc. and shifting priorities.
  • Demonstrated, in-depth working knowledge/certification of routing, switching, and WLAN design, configuration, and deployment. Cisco certifications including CCNA, CCNP, and/or CCIE (CCIE preferred) in R&S.
  • You possess professional or expert knowledge/experience with Cisco Service Provider solutions.
  • You have associate- or professional-level knowledge of Cisco Security, including Cisco ISE, Stealthwatch, ASA, Firepower, AMP, etc.
  • You have the ability to travel to Cisco internal, partner and customer events, roadshows, etc. to train and raise awareness to drive POV Services adoption and sales. Up to 40% travel.
  • You bring VMware/ESXi experience: building servers, installing VMware, deploying virtual appliances, etc.
  • You have Linux experience or certifications including CompTIA Linux+, Red Hat, etc.
  • You're experienced using Tool Command Language (Tcl), Perl, Python, etc., as well as Cisco and 3rd-party traffic, event, and device generation applications/tools/hardware (IXIA, Sapro, Pagent, etc.); see the device-automation sketch after this list.
  • You've used Cisco and 3rd-party management/monitoring/troubleshooting solutions. Cisco: DNA Center, Cisco Prime, Meraki, Viptela, CMX.
  • 3rd-party solutions: SolarWinds, Zenoss, Splunk, LiveAction, or others to monitor and/or manage an enterprise network.
  • Experience using Wireshark and PCAP files.
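
To ground the scripting item referenced above, the sketch below pulls interface status from a lab router with Python. The Netmiko library, device address, and credentials are assumptions chosen for illustration, since the posting only names the languages and tool categories in general terms.

```python
# Illustrative only: gather interface status from a lab device with Netmiko.
# Host, credentials, and device_type are placeholders, not values from the posting.
from netmiko import ConnectHandler

lab_router = {
    "device_type": "cisco_ios",
    "host": "198.51.100.10",   # TEST-NET address used as a placeholder
    "username": "labuser",
    "password": "labpass",
}

conn = ConnectHandler(**lab_router)
output = conn.send_command("show ip interface brief")
print(output)
conn.disconnect()
```

In practice this kind of script would feed a content build or health check rather than print to the console, but the connect/command/disconnect pattern is the same.
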

Why Cisco

At Cisco, each person brings their unique talents to work as a team and make a difference.

Yes, our technology changes the way the world works, lives, plays and learns, but our edge comes from our people.

  • We connect everything: people, process, data and things, and we use those connections to change our world for the better.
  • We innovate everywhere - From launching a new era of networking that adapts, learns and protects, to building Cisco Services that accelerate businesses and business results. Our technology powers entertainment, retail, healthcare, education and more from Smart Cities to your everyday devices.
  • We benefit everyone - We do all of this while striving for a culture that empowers every person to be the difference, at work and in our communities.
The Select Group
  • San Diego, CA
Python Data Engineer
The Select Group is looking for a senior-level Python Data Engineer in San Diego, CA. This position will leverage AWS and Python-based coding to migrate data from Netezza to Snowflake. If you are looking to work with the newest and greatest technologies in your next role, please apply.
Python Data Engineer Requirements
    • 3-5+ years of experience developing REST APIs in Python on AWS that read data from databases
    • 3-5 years of development experience with data caching techniques such as Redis or Memcached in the AWS cloud for faster responses to API requests
    • Python coding
    • AWS (S3 buckets, SQS, Lambda)

Python Data Engineer Responsibilities
    • This resource will be responsible for building APIs that are consumed by mobile applications. The data is used by end customers, so this person should understand API performance; see the sketch after this list.
    • Read files from S3 buckets using Python and load them into the database
    • Create and maintain optimal data pipelines
    • Assemble large, complex data sets that meet functional and non-functional business requirements.
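
As a sketch of the workflow described in this list, the example below serves data read from an S3 bucket through a small Python API with a Redis cache in front; the bucket name, key layout, framework (Flask), and cache settings are assumptions, not details from the posting.

```python
# Illustrative sketch only: serve data from S3 with a Redis cache in front.
# Bucket, key, and host names are hypothetical.
import json

import boto3
import redis
from flask import Flask, jsonify

app = Flask(__name__)
s3 = boto3.client("s3")
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

BUCKET = "example-mobile-data"   # hypothetical bucket
CACHE_TTL_SECONDS = 300          # keep hot responses for 5 minutes

@app.route("/records/<key>")
def get_records(key: str):
    cached = cache.get(key)
    if cached is not None:       # cache hit: skip the S3 round trip
        return jsonify(json.loads(cached))

    obj = s3.get_object(Bucket=BUCKET, Key=f"{key}.json")
    payload = json.loads(obj["Body"].read())
    cache.setex(key, CACHE_TTL_SECONDS, json.dumps(payload))
    return jsonify(payload)

if __name__ == "__main__":
    app.run()
```

The TTL is the main knob here: a longer TTL means fewer S3 reads but staler responses, which is exactly the API-performance tradeoff the posting alludes to.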

About The Select Group
We are TSG, a fast-growing technical services firm serving the U.S. and Canada. We open doors to diversified prospective employers who respect and value your ambitions, your pursuit of a meaningful career, and your particular skill-set. We offer interview guidance, an impressive referral program, and partner with you to find work that drives you. Learn more about us in our company overview video, or visit us at http://www.selectgroup.com. Sign up to receive weekly job alerts in your inbox by joining the TSG Talent Network.
We have the privilege of impacting lives, so let us impact yours.
The Select Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, genetic information, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
KELZAL (QELZAL CORPORATION)
  • San Diego, CA

Challenge:

As Kelzal's Machine Learning Engineer, you will be part of an innovative team that designs and develops algorithms and software for the next generation of AI-enabled visual systems. You will develop power-efficient machine learning and adaptive signal processing algorithms to solve real-world imaging and video classification problems.


Responsibilities:

  • Develop algorithms for the fast, low-complexity and accurate detection and tracking of objects in real-world environments
  • Develop algorithms for event-based spatio-temporal signal processing
  • Contribute to our machine learning tool sets for curating data and training models
  • Inform sensor decisions for optimal approaches to classification for product requirements
  • Follow and drive research on state-of-the-art approaches in the areas described above, as applied to the problems we're solving


Requirements:

  • Experience in event-based signal processing
  • Experience in continuous-time signal processing techniques
  • Experience with some deep neural network packages (e.g. TensorFlow, NVIDIA Digits, Caffe/Caffe2)
  • Experience with OpenCV; see the detection/tracking sketch after this list
  • Experience with traditional computer vision approaches to image processing
  • Experience developing machine-learning algorithms for multi-modal object detection, scene understanding, semantic classification, face verification, human pose estimation, activity recognition, or anomaly detection
  • Strong experience with classification and regression algorithms
  • Strong coding skills with Python and/or C/C++ in a Linux environment
  • Track record of research excellence and/or experience converting publications into actual implementations
  • Experience with commercial development processes such as continuous integration, deployment, and release management tools a plus
  • Experience launching products containing machine learning algorithms a plus
  • Experience with fixed-point implementation a plus
  • 3+ years of hands-on experience working in industry
  • MS or PhD degree in Computer Science, Electrical Engineering, or a related field
  • Current US work authorization
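
As referenced in the requirements above, here is a minimal sketch of classical detection and tracking with OpenCV, using background subtraction and contour extraction; parameters and thresholds are illustrative assumptions rather than Kelzal's actual approach.

```python
# Hypothetical minimal sketch: classical (non-deep-learning) object detection
# and tracking with OpenCV via background subtraction and contour extraction.
import cv2

def track_moving_objects(video_path: str, min_area: int = 500):
    """Yield bounding boxes of moving objects per frame (illustrative only)."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Remove small noise before extracting contours.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
        yield boxes
    cap.release()
```

A deep-learning detector would replace the background-subtraction stage, but this kind of low-complexity baseline is often what "traditional computer vision approaches" refers to.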

Biswas Information Technology Solutions
  • Herndon, VA

We are seeking a junior- to mid-level Data Science Engineer to analyze large amounts of raw data from different sources and extract valuable business insights that aid better business decision-making. An analytical mind, problem-solving skills, and a passion for machine learning and research are critical for this role. You will be part of a highly passionate development team that is refining our Data Science toolkit, which includes a wide set of predictive, recommendation, and inference models for our AI product, ranging from time-series forecasting, sentiment analysis, custom object detection, and named-entity recognition to text summarization and geometric deep learning.
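
As one hedged illustration of the kind of model in such a toolkit, the sketch below builds a small univariate time-series forecaster with Keras; the window length, layer sizes, and toy data are assumptions for the example, not the team's actual configuration.

```python
# Illustrative univariate time-series forecaster with Keras (toy data).
import numpy as np
from tensorflow import keras

def make_windows(series: np.ndarray, window: int = 24):
    """Turn a 1-D series into (window -> next value) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

def build_forecaster(window: int = 24) -> keras.Model:
    model = keras.Sequential([
        keras.Input(shape=(window, 1)),
        keras.layers.LSTM(32),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Usage: fit on a toy sine wave, then forecast one step ahead.
series = np.sin(np.linspace(0, 60, 1000)).astype("float32")
X, y = make_windows(series)
model = build_forecaster()
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
next_value = model.predict(X[-1:], verbose=0)
```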



Responsibilities



  • Identify valuable data sources and automate collection processes

  • Preprocess structured and unstructured data

  • Discover trends and patterns in large amounts of data

  • Build predictive models and machine-learning algorithms

  • Present information using data visualization techniques

  • Propose solutions and strategies to business challenges

  • Collaborate with engineering and product development teams



Requirements



  • Strong fundamentals in training, evaluating, and benchmarking machine learning models

  • Strong in Python: NumPy, Pandas, Keras (TensorFlow or PyTorch is a plus)

  • Familiar with feature selection and feature extraction (especially for deep learning is a plus)

  • Familiarity with common hyperparameter-optimization techniques for different AI models

  • Experience handling large data sets

  • Familiarity with BI tools (e.g. Tableau) and data frameworks (e.g. Hadoop)

  • Strong math skills (e.g. statistics, algebra)

  • Problem-solving aptitude

  • Excellent communication and presentation skills

  • 3 to 5 years of experience in the above is preferred

Ultra Tendency
  • Riga, Latvia

Are you a developer who loves to work on infrastructure as well? A systems engineer who likes to write code? Ultra Tendency is looking for you!


Your Responsibilities:



  • Support our customers and development teams in transitioning capabilities from development and testing to operations

  • Deploy and extend large-scale server clusters for our clients

  • Fine-tune and optimize our clusters to process millions of records every day 

  • Learn something new every day and enjoy solving complex problems


Job Requirements:



  • You know Linux like the back of your hand

  • You love to automate all the things – SaltStack, Ansible, Terraform and Puppet are your daily business

  • You can write code in Python, Java, Ruby or similar languages.

  • You are driven by high quality standards and attention to detail

  • Understanding of the Hadoop ecosystem and knowledge of Docker is a plus


We offer:



  • Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager

  • Work in the open-source community and become a contributor. Learn from open-source enthusiasts you will find nowhere else in Germany!

  • Work in an English-speaking, international environment

  • Work with cutting edge equipment and tools

Comcast
  • Englewood, CO

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Summary:

Software engineering skills combined with the demands of a high volume, highly-visible analytics platform make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly motivated team environment?

As a software engineer in the Data Experience (DX) team, you will research, develop, support, and deploy solutions in real-time distributed computing architectures. The DX big data team is a fast-moving team of world-class experts who are innovating in providing user-driven, self-service tools for making sense of, and making decisions with, high volumes of data. We are a team that thrives on big challenges, results, quality, and agility.

Who does the data engineer work with?

Big Data software engineering is a diverse collection of professionals who work with a variety of teams: other software engineering teams whose software integrates with analytics services, service delivery engineers who provide support for our product, testers, operational stakeholders with all manner of information needs, and executives who rely on big data for data-backed decisions.

What are some interesting problems you'll be working on?

Develop systems capable of processing millions of events per second and multiple billions of events per day, providing both a real-time and historical view into the operation of our wide array of systems. Design collection and enrichment system components for quality, timeliness, scale, and reliability. Work on high-performance real-time data stores and a massive historical data store using best-of-breed and industry-leading technology.
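
As a rough sketch of this kind of pipeline (not Comcast's actual implementation), the example below uses PySpark Structured Streaming to read JSON events from Kafka and maintain per-minute counts by event type; the topic, broker address, and schema are placeholder assumptions.

```python
# Illustrative sketch: a Structured Streaming job that reads JSON events from
# Kafka and maintains per-minute counts by event type. Topic, broker address,
# and schema are placeholder assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-counts").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "device-events")               # placeholder topic
       .load())

# Kafka delivers bytes; cast to string and parse the JSON payload into columns.
events = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

# Windowed aggregation with a watermark so late events are bounded.
counts = (events
          .withWatermark("event_time", "5 minutes")
          .groupBy(F.window("event_time", "1 minute"), "event_type")
          .count())

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```

The watermark is what keeps state bounded at this event volume: anything arriving more than five minutes late is dropped from the aggregation rather than held in memory indefinitely.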

Where can you make an impact?

Comcast DX is building the core components needed to drive the next generation of data platforms and data processing capability. Running this infrastructure, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust big data architecture capable of providing insights that would otherwise be drowned in an ocean of data.

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

  • Develop solutions to big data problems utilizing common tools found in the ecosystem.
  • Develop solutions to real-time and offline event collecting from various systems.
  • Develop, maintain, and perform analysis within a real-time architecture supporting large amounts of data from various sources.
  • Analyze massive amounts of data and help drive prototype ideas for new tools and products.
  • Design, build and support APIs and services that are exposed to other internal teams
  • Employ rigorous continuous delivery practices managed under an agile software development approach
  • Ensure a quality transition to production and solid production operation of the software

Skills & Requirements:

  • 5+ years programming experience
  • Bachelor's or Master's in Computer Science, Statistics, or a related discipline
  • Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem.
  • Experience working on big data platforms in the cloud or on traditional Hadoop platforms
  • AWS Core
    • Kinesis
    • IAM
    • S3/Glacier
    • Glue
    • DynamoDB
    • SQS
    • Step Functions
    • Lambda
    • API Gateway
    • Cognito
    • EMR
    • RDS/Aurora
    • CloudFormation
    • CloudWatch
  • Languages
    • Python
    • Scala/Java
  • Spark
    • Batch, Streaming, ML
    • Performance tuning at scale
  • Hadoop
    • Hive
    • HiveQL
    • YARN
    • Pig
    • Sqoop
    • Ranger
  • Real-time Streaming
    • Kafka
    • Kinesis
  • Data File Formats
    • Avro, Parquet, JSON, ORC, CSV, XML
  • NoSQL / SQL
  • Microservice development
  • RESTful API development
  • CI/CD pipelines
    • Jenkins / GoCD
    • AWS
      • CodeCommit
      • CodeBuild
      • CodeDeploy
      • CodePipeline
  • Containers
    • Docker / Kubernetes
    • AWS
      • Lambda
      • Fargate
      • EKS
  • Analytics
    • Presto / Athena
    • QuickSight
    • Tableau
  • Test-driven development/test automation, continuous integration, and deployment automation
  • Enjoy working with data: data analysis, data quality, reporting, and visualization
  • Good communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly.
  • Great design and problem solving skills, with a strong bias for architecting at scale.
  • Adaptable, proactive and willing to take ownership.
  • Keen attention to detail and high level of commitment.
  • Good understanding of any of: advanced mathematics, statistics, and probability.
  • Experience and comfort working in agile/iterative development and delivery environments where requirements change quickly and the team constantly adapts to moving targets.

About Comcast DX (Data Experience):

Data Experience (DX) is a results-driven, data platform research and engineering team responsible for the delivery of multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. The mission of DX is to gather, organize, make sense of Comcast data, and make it universally accessible to empower, enable, and transform Comcast into an insight-driven organization. Members of the DX team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, as well as research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

UST Global
  • San Diego, CA

KEY SKILLSETS

- 7+ years experience with Python

- 4+ years experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance; see the sketch after this list
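
As a sketch of the anomaly-detection item referenced above, one common approach is to fit an IsolationForest and track the share of flagged records per batch; the features, contamination rate, and alert threshold below are illustrative assumptions.

```python
# Illustrative anomaly-detection sketch: fit an IsolationForest and track the
# share of flagged records per batch so performance can be monitored over time.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
train = rng.normal(size=(5000, 4))                 # stand-in for historical data

detector = IsolationForest(contamination=0.01, random_state=0).fit(train)

def score_batch(batch: np.ndarray) -> float:
    """Return the fraction of records flagged as anomalies in this batch."""
    flags = detector.predict(batch) == -1          # -1 marks an anomaly
    return float(flags.mean())

# Usage: alert if the flagged share drifts far from the expected contamination.
new_batch = rng.normal(size=(500, 4))
rate = score_batch(new_batch)
if rate > 0.05:                                    # illustrative threshold
    print(f"Anomaly rate {rate:.1%} exceeds threshold; investigate upstream data")
```
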
Skills and Qualifications
- Min 8 yrs of experience
- Hands on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as GGplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.

UST Global
  • Atlanta, GA

KEY SKILLSETS

- 7+ years experience with Python

- 4+ years experience with Java


General Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state of the art methods
- Extending business data with third party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad hoc analysis and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Skills and Qualifications
- Min 8 yrs of experience
- Hands on experience in Python
- Excellent understanding of machine learning techniques and algorithms.
- Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
- Great communication skills
- Experience with data visualization tools, such as GGplot, etc.
- Proficiency in using query languages such as SQL, Hive, Pig
- Experience with NoSQL databases, such as MongoDB
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.