OnlyDataJobs.com

RM Dayton Analytics
  • Dallas, TX
Job Summary & Responsibilities

RM Dayton Analytics' premier retail client has an immediate opening for a Web Analytics & Measurement Analyst.

Overview:

 

The Analytics & Measurement Specialist will identify data-driven insights, inform test design, and measure test results for various pilot test initiatives.

 

Responsibilities:

  • Proposes, executes and communicates discovery analytics to identify new tests or inform test design/sizing
  • Determines relevant metrics to effectively measure the performance of each test
  • Evaluates test performance against KPIs (including deep dives/segment cuts) according to the measurement playbook, helps apply learnings to future tests and assists with making scaling decisions
  • Assists in sizing the annualized impact of winning tests
  • Receives discovery analysis requests and turns them into concrete analysis plans
  • Performs relevant analytics to support test execution results and deep dives
  • Provides accurate and timely reporting of the entire portfolio of active test results

Requirements:

  • At least 2 years of experience in a business analytics role, preferably in testing or web analytics
  • Experience with A/B, multivariate and/or incremental test design and implementation (a brief sketch follows this list)
  • Experience in data manipulation and analysis using SQL, SAS, R, or Python
  • Bachelor's degree in a quantitative discipline (Statistics, Applied Mathematics, Economics, etc.) or sufficient on-the-job use of related skills
  • Exceptional standards for quality and strong attention to detail
  • Experience working in an Agile/Scrum environment
  • Exposure to big data tools such as Hadoop/Hive a plus
  • Experience developing various types of predictive models with a targeted result of increasing revenue is a plus
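
A minimal sketch of the kind of KPI read-out this role describes, assuming a simple two-variant conversion test; the visitor counts, conversion numbers and 5% significance threshold below are invented for illustration.

```python
from math import sqrt
from scipy.stats import norm

# Illustrative counts only: visitors and conversions for control vs. test.
control_n, control_conv = 48_000, 1_920   # 4.0% conversion
test_n, test_conv = 47_500, 2_090         # ~4.4% conversion

p_control = control_conv / control_n
p_test = test_conv / test_n

# Pooled two-proportion z-test for the difference in conversion rate (the KPI).
p_pool = (control_conv + test_conv) / (control_n + test_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / test_n))
z = (p_test - p_control) / se
p_value = 2 * norm.sf(abs(z))  # two-sided

lift = (p_test - p_control) / p_control
print(f"lift: {lift:.1%}, z = {z:.2f}, p = {p_value:.4f}")
print("significant at the 5% level" if p_value < 0.05 else "not significant at the 5% level")
```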

Equal Opportunity Employer. All qualified applicants will receive consideration for employment and will not be discriminated against based on race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability, age, pregnancy, genetic information or any other consideration prohibited by law or contract.

JPMorgan Chase & Co.
  • Houston, TX
Our Corporate Technology team relies on smart, driven people like you to develop applications and provide tech support for all our corporate functions across our network. Your efforts will touch lives all over the financial spectrum and across all our divisions: Global Finance, Corporate Treasury, Risk Management, Human Resources, Compliance, Legal, and within the Corporate Administrative Office. You'll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.
As a Machine Learning Engineer, you will provide high quality technology solutions that address business needs by developing applications within mature technology environments. You will utilize mature (3rd or 4th Generation) programming methodologies and languages and adhere to coding standards, procedures and techniques while contributing to the technical code documentation.
You will participate in project planning sessions with project managers, business analysts and team members to analyze business requirements and outline the proposed IT solution. You will participate in design reviews and provide input to the design recommendations; incorporate security requirements into design; provide input to information/data flow; and understand and comply with the Project Life Cycle Methodology in all planning steps. You will also adhere to IT Control Policies throughout design, development and testing and incorporate Corporate Architectural Standards into application design specifications.
Additionally, you will document the detailed application specifications, translate technical requirements into programmed application modules and develop/enhance software application modules. You will participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications; identify/troubleshoot application code-related issues; and review and provide feedback on the final user documentation.
Key Responsibilities
Provide leadership and direction for the key machine learning initiatives in the Operational Risk domain
Act as machine learning evangelist in the Operational Risk domain
Perform research and proof of concepts to determine ML/AI applicability to potential use cases
Mentor junior data scientists and team members new to machine learning
Display efficient work style with attention to detail, organization, and strong sense of urgency
Design software and produce scalable and resilient technical designs
Create automated unit tests using flexible/open-source frameworks
Digest and understand business requirements and design new modules/functionality which meet the needs of our business partners
Implement model reviews and machine learning governance framework
Manage code quality for total build effort.
Participate in design reviews and provide input to the design recommendations
Essentials
  • Advanced degree in Math, Computer Science or another quantitative field
  • Three to five years of working experience in machine learning, preferably in natural language processing
  • Ability to work in a team as a self-directed contributor with a proven track record of being detail-oriented, innovative, creative, and strategic
  • Strong problem solving and data analytical skills
  • Industry experience building end-to-end Machine Learning systems leveraging Python (Scikit-Learn, Pandas, Numpy, Tensorflow, Keras, NLTK, Gensim et al.) or other similar languages
  • Experience of a variety of machine learning algorithms (classification, clustering, deep learning et al.) and natural language processing applications (topic modeling, sentiment analysis et al.)
  • Experience with NLP techniques to transform unstructured text data into structured data (lemmatization, stemming, bag-of-words, word embedding et al.); a brief sketch follows this list
  • Experience visualizing/presenting data for stakeholders using Tableau, or open-source Python packages such as matplotlib, seaborn et al.
  • Familiar with Hive/Impala to manipulate data and draw insights from Hadoop
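
As a rough illustration of the text-to-structured-data bullet above, here is a minimal bag-of-words/TF-IDF classification sketch using scikit-learn; the toy documents, labels and risk framing are invented for the example, not taken from the posting.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy, invented snippets and labels standing in for unstructured text records.
docs = [
    "wire transfer flagged for manual review after limit breach",
    "customer reported unauthorized access to online account",
    "quarterly reconciliation completed with no exceptions",
    "routine system maintenance finished on schedule",
]
labels = [1, 1, 0, 0]  # 1 = potential risk event, 0 = routine (illustrative)

# TF-IDF / bag-of-words turns the text into a sparse numeric matrix,
# which a linear classifier can then score.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), stop_words="english"),
    LogisticRegression(max_iter=1000),
)
model.fit(docs, labels)

print(model.predict(["unauthorized wire transfer reported by customer"]))
```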
Personal Specification
Demonstrate Continual Improvement in terms of Individual Performance
Strong communication skills
Bright and enthusiastic, self-directed
Excellent analytical and problem solving skills
When you work at JPMorgan Chase & Co., you're not just working at a global financial institution. You're an integral part of one of the world's biggest tech companies. In 14 technology hubs worldwide, our team of 40,000+ technologists design, build and deploy everything from enterprise technology initiatives to big data and mobile solutions, as well as innovations in electronic payments, cybersecurity, machine learning, and cloud development. Our $9.5B+ annual investment in technology enables us to hire people to create innovative solutions that will not only transform the financial services industry, but also change the world.
Apple Inc.
  • Seattle, WA
Job Summary:
The iTunes Store is seeking a Software Engineer to provide new tools in order to support its dynamic growth. Join our exciting engineering team that has been leading the digital distribution industry by constantly developing creative features since its launch in April 2003. The position requires experience with all aspects of the software design cycle and a focus on data modeling and handling large data sets in an inspiring environment.

Key Qualifications:
* Our role requires 5+ years of hands on software engineering experience in a dynamic environment
* You'll have strong development skills in Java, Scala - with a history of architect-level experience.
* Experience crafting and building multi-datacenter distributed systems.
* Experience working with Big Data solutions (e.g. Spark, MapReduce, Hive, etc.)
* Deep understanding of storage solutions and when to use them (e.g. Graph, Cassandra, Solr, relational database etc.)
* Deep understanding of different data formats (e.g. Avro, XML, JSON, Parquet, etc.) and ETL processes (a brief sketch follows this list).
* You will understand graph computation, data search and record mapping/matching algorithms.
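
The posting emphasizes Java and Scala, but as a language-neutral sketch of the data-format and ETL point above, here is a minimal PySpark job that reads JSON, applies a light transformation and writes partitioned Parquet; the paths and column names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("catalog-etl-sketch").getOrCreate()

# Hypothetical input: raw track events landed as JSON (path is a placeholder).
raw = spark.read.json("s3://example-bucket/raw/track_events/")

# Light transformation: derive a date column and keep a few fields.
cleaned = (
    raw.withColumn("event_date", F.to_date(F.col("event_ts")))
       .select("track_id", "store_front", "event_date", "price")
)

# Columnar Parquet output, partitioned for downstream analytical queries.
(
    cleaned.write
           .mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3://example-bucket/curated/track_events/")
)

spark.stop()
```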
 

Description:
Our exciting and growing team is looking for a self-starting, ambitious individual who is not afraid to question assumptions. You will have excellent written and oral skills. You should have several years of experience developing server-side software using Java. You should also be familiar with Big Data patterns and solutions. You will have the ability to effectively work and communicate technical concepts with all levels of an organization including corporate CTOs, CIOs and developers.
 

Education:
BSCS or equivalent.

Apple is an Equal Opportunity Employer that is committed to inclusion and diversity. We also take affirmative action to offer employment and advancement opportunities to all applicants, including minorities, women, protected veterans, and individuals with disabilities. Apple will not discriminate or retaliate against applicants who inquire about, disclose, or discuss their compensation or that of other applicants.

Varen Technologies
  • Annapolis Junction, MD
  • Salary: $90k - 180k

Varen Technologies is seeking an experienced and flexible Cloud Software Engineer to augment the existing platform team for a large analytic cloud repository. A successful candidate for this position has experience working with large Hadoop and Accumulo based clusters and a familiarity with open-source technologies. Additional knowledge of Linux OS development, Prometheus, Grafana, Kafka and CentOS would benefit the candidate.


The Platform Team developers build/package/patch the components (typically open-source products) and put initial monitoring in place to ensure the component is up and running, if applicable. The platform team builds subject matter expertise and the integration team installs. Ideal candidates would have familiarity with open-source products and be willing/able to learn new technologies.


REQUIRED EXPERIENCE:



  • 5 years of experience in Software Engineering

  • 4 years of experience developing software with high-level languages such as Java, C, C++

  • Demonstrated experience working with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, BigTable, etc.

  • Demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. (a brief sketch follows)
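
To illustrate the MapReduce programming model named in the bullet above, a minimal Hadoop Streaming-style word count in Python; the local driver at the bottom only stands in for the distributed shuffle that Hadoop would normally perform.

```python
import sys
from itertools import groupby

def mapper(lines):
    # Map phase: emit (word, 1) for every token in the input split.
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    # Reduce phase: pairs arrive grouped by key after the shuffle/sort.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Local stand-in for the distributed shuffle: map, sort, reduce in one process.
    for word, total in reducer(mapper(sys.stdin)):
        print(f"{word}\t{total}")
```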


DESIRED EXPERIENCE:



  • Hadoop/Cloud Developer Certification

  • Experience developing and deploying analytics within a heterogeneous schema environment.

  • Experience designing and developing automated analytic software, techniques, and algorithms.


 CLEARANCE REQUIREMENT:



  • TS/SCI w/ Polygraph

Epidemic Sound AB
  • Stockholm, Sweden

We are now looking for an experienced Machine Learning Specialist (you’re likely currently a Data Scientist, or perhaps an advanced Insight Analyst who’s had the opportunity to use Machine Learning in a commercial environment). 



Job description


The position as a Machine Learning Specialist will report directly to the CTO, in a fresh new team which functions as a decentralised squad, delivering advanced analysis and machine learning to various departments throughout the company. You'll work alongside Data Engineers and Developers to deploy microservices solving many different business needs. The use cases range from Customer Lifetime Value and churn prediction to building fantastic recommender engines to further personalize Epidemic Sound's offering.


You will be working closely with the backend data team in developing robust, scalable algorithms. You will improve the personalization of the product by:



  • Analysing behaviours of visitors, identifying patterns and outliers which can indicate their likelihood to churn.

  • Developing classification systems through feature extraction on music to identify type & ‘feel’ of any given content.

  • Creating recommender engines so that the music our users see first is relevant to them based on their behaviours (a brief sketch follows this list).

  • Contributing to the automation of previously manual tasks, by leveraging the classification systems you’ve contributed to building.

  • Consulting on appropriate implementation of algorithms in practice – and actively identifying new use cases that can help improve Epidemic Sound!
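
A toy sketch of the recommender idea from the bullets above, assuming tracks are already described by extracted feature vectors; the track names and the tempo/energy/valence features are invented placeholders.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical extracted audio features per track: [tempo, energy, valence], scaled 0-1.
track_ids = ["epic_drums", "calm_piano", "upbeat_pop", "dark_synth"]
features = np.array([
    [0.9, 0.8, 0.4],
    [0.2, 0.1, 0.6],
    [0.8, 0.7, 0.9],
    [0.6, 0.9, 0.2],
])

def recommend(listened_idx, top_k=2):
    # Average the listened tracks into a taste profile, then rank the rest by cosine similarity.
    profile = features[listened_idx].mean(axis=0, keepdims=True)
    scores = cosine_similarity(profile, features)[0]
    ranked = np.argsort(scores)[::-1]
    return [track_ids[i] for i in ranked if i not in listened_idx][:top_k]

print(recommend([0]))  # tracks most similar to "epic_drums"
```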



What are we looking for?


We’re looking for a team member with a “no task is too small” mindset – we are at the beginning of our Machine Learning journey – so we need someone who thinks building something from scratch sounds exciting. 


It would be music to our ears if you have:



  • Deep understanding of machine learning (neural networks, deep learning, classification, regression)

  • Experience with machine learning in production

  • Experience with: TensorFlow, Keras, PyTorch, scikit-learn, SciPy, NumPy, pandas or similar

  • Experience with ML projects in customer value or music information retrieval (MIR)

  • Fluency in Python programming and a passion for production-ready code

  • Experience with Google Cloud and/or AWS

  • MSc in a Quantitative or Computer Science based subject (Machine Learning, Statistics, Applied Mathematics)


Curious about our music? Find our music on Spotify here → https://open.spotify.com/user/epidemicsound


Application


Do you want to be a part of our fantastic team? Please apply by clicking the link below.

Audible
  • Newark, NJ

Provide technical leadership for a team of machine learning engineers and data scientists tasked with empowering Audible's customer experience through model-driven insights

Be the technical expert driving the exploration, selection and implementation of tools and enablers to empower anyone at Audible engaged in data science

Define technical strategy and architecture across Data Engineering, Data Science and Business Intelligence as well as Audible Client Facing platforms

Define iterative and incremental technical strategy to implement a Data Science platform to enable agility, speed and leverage, as well as operational excellence

Define practical and maintainable architectures that can handle large-scale datasets, based on detailed knowledge of relevant technologies in the big data engineering space as it relates to the support of data science

Define technical integration points with Big Data, Business Intelligence and our client facing platforms; resolve complex cross stream/group system architecture and business problems

Hands-on, deep technical expertise: deliver key aspects of the architecture through implementation of proofs of concept, and/or key components of the overall architecture that serve as examples or the foundation for implementation

Lead a team of machine learning engineers as they use tools and develop processes to efficiently train, productionize, monitor and maintain data models at scale

Partner with technical managers whose teams will deliver the strategy and architecture incrementally; hands on mentorship and influence of engineers through design and code reviews

Influence business strategy based on emerging technology capabilities and trends; Provide advice to senior managers across the marketing, product, data science and engineering disciplines




BASIC QUALIFICATIONS


· Bachelor's degree in Computer Science
· 7+ years of experience in software development and deployment of distributed multi-tier applications with high throughput requirements




PREFERRED QUALIFICATIONS


· Demonstrated expertise in a variety of Big Data and Data Science technologies and platforms such as AWS Redshift, Glue, EMR and SageMaker, as well as Hadoop, Spark and Jupyter Notebooks
· Demonstrated experience using machine learning engineering tools and practices to productionize, train, deploy, monitor and maintain data models at scale using libraries such as MLlib, scikit-learn and TensorFlow (a brief sketch follows this list)
· Action-oriented strategic thinker
· Be able to thrive in an ambiguous environment - where change is the only constant
· Detail-oriented to ensure that project success is paramount
· Strong verbal and written communication skills
· Strong analytical skills and an out-of-the-box thinker
· Self-starter with the ability to multi-task and work in a very fast paced environment
· Track record of defining and delivering cross functional solutions that are innovative and extensible
· Be able to disagree, yet align, when dealing with different stakeholders
· Results oriented and with a strong customer focus
· Highly autonomous. Delivers with little guidance
· Strong mentor of peers and subordinates
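
A compact sketch of the train-then-productionize loop referenced in the preferred qualifications above, using scikit-learn and joblib; the synthetic data, model choice and artifact path are illustrative assumptions, not Audible's actual stack.

```python
import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for training data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train and record an offline quality metric before promoting the artifact.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Persist the model; a separate serving process would load the same artifact.
joblib.dump(model, "model_artifact.joblib")
serving_model = joblib.load("model_artifact.joblib")
print("sample prediction:", serving_model.predict(X_test[:1]))
```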

Audible is an Equal Opportunity Employer Minority / Women / Disability / Veteran / Gender Identity / Sexual Orientation / Age

Audible
  • Newark, NJ

ABOUT THE TEAM:

Our Data Technology team owns and develops the technology platform that offers decision makers both performance metrics and analysis as well as the self-service capability to perform independent analysis on a wide array of internal and external datasets in order to identify opportunities, trends and issues, uncover new insights, and fine-tune operations to meet business goals.

In this role, you will be instrumental in revolutionizing the way Audible captures high-quality, reusable data from across our landscape of applications and services. You will partner with other software engineering teams to improve data availability and data instrumentation, and help transform the value of data across Audible by unlocking it so that it is faster, more available, more robust, and more of an actively useful asset for Audible teams to leverage.

KEY RESPONSIBILITIES
· Play a leading role in building and maintaining the infrastructure for Enterprise Data Platforms, using software engineering best practices, data management fundamentals, data storage principles, recent advances in distributed systems and data streaming, and operational excellence best practices.
· Work closely with product owners and engineers across the company to instrument key data elements
· Design, build, and support platforms for monitoring and surfacing data quality issues (a brief sketch follows this list)
· Integrate different technologies to provide data lineage and visibility
· Effectively communicate with various teams and stakeholders, escalate technical and managerial issues at the right time and resolve conflicts.
· Demonstrate passion for quality and productivity by use of efficient development techniques, standards and guidelines
· Peer-review others' work and actively mentor more junior members of the team, improving their skills, their knowledge of our systems and their ability to get things done
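
As a small illustration of the data-quality monitoring responsibility above, a pandas sketch that computes a few basic checks over a batch of records; the column names, sample data and checks are hypothetical.

```python
import pandas as pd

def quality_report(df: pd.DataFrame, key: str, ts_col: str) -> dict:
    """Compute a few simple data-quality signals for a batch of records."""
    return {
        "row_count": len(df),
        "null_rate": df.isna().mean().to_dict(),            # per-column null fraction
        "duplicate_keys": int(df[key].duplicated().sum()),  # should usually be 0
        "latest_event": df[ts_col].max(),                   # freshness check
    }

# Hypothetical event batch with one duplicate key and one missing value.
events = pd.DataFrame({
    "event_id": [1, 2, 2, 4],
    "user_id": ["a", "b", None, "d"],
    "event_ts": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-02", "2024-01-03"]),
})

print(quality_report(events, key="event_id", ts_col="event_ts"))
```
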
HOW DOES AMAZON FIT IN?

We're a part of Amazon: they are our parent company and it's a great partnership. You'll get to play with Amazon's technologies, but it doesn't stop there. Audible is built on a strong foundation of Amazon technology and you'll have insight into the inner workings of the world's leading ecommerce experience. There's a LOT to learn! Your career will benefit from working with teams like Alexa, Search, Kindle, A9, P13N and many more.

If you want to own and solve problems, work with a creative dynamic team, fail fast in a supportive environment whilst growing your career and working on a platform that powers web applications used by millions of customers worldwide we want to hear from you.




BASIC QUALIFICATIONS


· Bachelor's degree or higher in Computer Science or a related field
· 5+ years of professional software development experience
· Experience with a variety of modern programming languages (Java, JavaScript, C/C++) and open-source technologies (Linux, Spring, SOA)




PREFERRED QUALIFICATIONS


· Strong problem-solving skills with the ability to navigate highly complex and ambiguous situations
· Ability to work independently with little supervision and successfully resolve ambiguity
· Willingness to learn, be open minded to new ideas and different opinions yet knowing when to stop, analyze, and reach a decision
· Well-rounded engineering skills; full-stack development experience - web + services - If you've built something in your spare time send us the link, we'd love to hear about it
· Great communication skills - ability to think creatively and adapt the message to the audience. Can provide information to technical and non-technical stakeholders alike and guide them to confidently informed decisions
· Strong data-oriented skills with knowledge of Core Data and database design
· Prior use of AWS technologies at scale in a production environment
· Familiarity with big data technologies (Hadoop, Hive, Spark, etc.)
· Experience working with Agile methodologies

Audible is an Equal Opportunity Employer Minority / Women / Disability / Veteran / Gender Identity / Sexual Orientation / Age

HelloFresh
  • Berlin, Germany

At HelloFresh, we want to change the way people eat. Over the past 5 years we've seen this mission spread beyond our wildest dreams. So, how did we do it? Our weekly recipe boxes full of exciting recipes and lovingly sourced, fresh ingredients have blossomed into a community of inspired, energised home cooks that expands across the globe. Now we're the fastest growing company in Europe, active and growing in 9 different countries across 3 continents.


Our story started in Berlin. As Europe's tech hub, and the home of our global headquarters, it's a dynamic, progressive environment where innovation is nurtured and promoted. Since we started, we've worked exceptionally hard and we've received almost US$300 million in investment, which has allowed us to create an award-winning product.


As a member of HelloTech you’ll be exposed to a modern technology stack and a slick cross functional agile team setup. We have developed a refined product and provide scalability on a global level. Join our HelloTech team and help us to build a fresh food global champion!



About the job


At HelloFresh, Data Science is an interdisciplinary team that designs, implements, and maintains state-of-the-art machine learning models to automate and optimize marketing, operations, logistics and customer experience. With our rapid company expansion, we are currently looking for a Technical Lead Data Scientist who is excited about turning innovative ideas into powerful data-driven products. Working closely with the Data Scientists and Machine Learning Engineers, you will help bring these ideas to reality and improve the experience of millions of customers around the globe.


To succeed in this role, you'll need a hunger for discovering information in massive amounts of data, an appetite for experimentation, fluency in machine learning techniques, and software engineering best practices.



As a Technical Lead Data Scientist you will be responsible for



  • Deriving and maintaining a technical vision for end-to-end machine learning pipelines (a brief sketch follows this list).

  • Playing a key role in ensuring the ongoing alignment and standards of our technology vision for the data science team.

  • Leading by example, from translating business problems into quantitative terms and productionized models to presenting the results in a clear and effective manner through visualization and prototyping.

  • Collaborating with cross-functional teams to discover innovative ways of leveraging vast repositories of user generated data.

  • Turning prototypes into production ready implementation.

  • Measuring performance over time, tuning, and inspecting models.

  • Sharing data solutions and intel through Open Source and promoting a data driven culture within a large international organization.
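
A minimal sketch of what an end-to-end pipeline plus the measure-and-tune loop from the responsibilities above might look like with pandas and scikit-learn; the customer snapshot, columns and model choice are invented for illustration.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Invented customer snapshot: numeric usage features plus a categorical plan column.
df = pd.DataFrame({
    "weekly_orders": [1, 3, 0, 2, 4, 0, 1, 5],
    "weeks_active": [4, 30, 2, 12, 52, 1, 8, 40],
    "plan": ["veggie", "family", "veggie", "classic", "family", "classic", "veggie", "family"],
    "churned": [1, 0, 1, 0, 0, 1, 1, 0],
})
features, target = df.drop(columns="churned"), df["churned"]

# Preprocessing and model live in one pipeline, so the whole thing can be productionized as a unit.
preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["weekly_orders", "weeks_active"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["plan"]),
])
pipeline = Pipeline([
    ("prep", preprocess),
    ("model", GradientBoostingClassifier(random_state=0)),
])

# Cross-validation gives a repeatable performance measurement to track and tune over time.
scores = cross_val_score(pipeline, features, target, cv=4)
print("mean CV accuracy:", scores.mean())
```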



Who we’re looking for



  • Significant experience in the tech industry.

  • Excellent educational background: an MSc in Computer Science, Mathematics, Physics, Computational Linguistics, or a similar quantitative field.

  • Great engineering abilities and the capacity to design resilient distributed systems that are flexible and scalable.

  • A strong understanding of the state of the art in machine learning methods and in-depth techniques for practical application.

  • Great coding skills in Python and knowledge of data libraries such as sklearn and pandas. Knowledge of big data frameworks such as Spark is a plus.

  • Exceptional written and verbal communication skills, with an ability to listen and show empathy.

  • A self-organized individual, with excellent focus and prioritization of workload using business data and metrics.

  • A passionate leader with a proven track record of mentoring team members.



What we offer



  • HelloFresh is a place that lets you implement your own ideas and contribute to our open source repositories.

  • The opportunity to get into one of the most intellectually demanding roles at one of the largest technology companies in Europe.

  • Cutting edge technology, allowing you to work with state-of-the-art tools and software solutions.

  • Competitive compensation and plenty of room for personal growth.

  • Great international exposure and team atmosphere.

  • Work in a modern, spacious office in the heart of Berlin with excellent transport links and employee perks.

Indeed - Tokyo, JP
  • Tokyo, Japan

Your job.



The role of Data Science at Indeed is to follow the data. We log, analyze, visualize, and model terabytes of job search data. Our Data Scientists build and implement machine learning models to make timely decisions. Each of us is a mixture of a statistician, scientist, machine learning expert, and engineer. We have a passion for building and improving Internet-scale products. We seek to understand human behavior via careful experimentation and analysis, to “help people get jobs”.

You're someone who wants to see the impact of your work making a difference every day. You understand how to use data to make decisions and how to train others to do so. You find passion in the craft and are constantly seeking improvement and better ways to solve tough problems. You produce the highest quality Data Science solutions and lead others to do the same.


You understand that the best managers serve their teams by removing roadblocks and giving individual contributors autonomy and ownership. You have high standards and will take pride in Indeed like we do as well as push us to be better. You have delivered challenging technical solutions at scale. You have led Data Science or engineering teams, and earned the respect of talented practitioners. You are equally happy talking about deep learning and statistical inference, as you are brainstorming about practical experimental design and technology career development. You love being in the mix technically while providing leadership to your teams.


About you.


Requirements   



  • Significant prior success as a Data Scientist working on challenging problems at scale

  • 5+ years of industrial Data Science experience, with expertise in machine learning and statistical modeling

  • The ability to guide a team to achieve important goals together

  • Have full stack experience in data collection, aggregation, analysis, visualization, productionization, and monitoring of Data Science products

  • Strong desire to solve tough problems with scientific rigour at scale

  • An understanding of the value derived from getting results early and iterating

  • Strong ability to coach Data Scientists, helping them improve their skills and grow their careers

  • Ph.D. or M.S. in a quantitative field such as Computer Science, Operations Research, Statistics, Econometrics or Mathematics

  • Passion to answer Product/Engineering questions with data

  • Proficiency with the English language 


We get excited about candidates who



  • Can do small data modeling work: R, Python, Julia, Octave

  • Can do big data modeling work: Hadoop, Pig, Scala, Spark

  • Can fish for data: SQL, Pandas, MongoDB (a brief sketch follows this list)

  • Can deploy Data Science solutions: Java, Python, C++

  • Can communicate concisely and persuasively with engineers and product managers
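
A tiny illustration of the "fish for data" bullet above, pulling rows from SQL into pandas for a quick aggregate; the in-memory SQLite table of job-search events is a stand-in for a real warehouse.

```python
import sqlite3
import pandas as pd

# Stand-in warehouse: an in-memory SQLite table of job-search events.
con = sqlite3.connect(":memory:")
pd.DataFrame({
    "search_term": ["data scientist", "data scientist", "nurse", "nurse", "nurse"],
    "clicks": [3, 1, 2, 5, 0],
}).to_sql("search_events", con, index=False)

# Pull with SQL, then aggregate with pandas.
events = pd.read_sql_query("SELECT search_term, clicks FROM search_events", con)
summary = (events.groupby("search_term")["clicks"]
                 .agg(["count", "mean"])
                 .sort_values("mean", ascending=False))
print(summary)
```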



Indeed provides a variety of benefits that help us focus on our mission of helping people get jobs.


View our bounty of perks: http://indeedhi.re/IndeedBenefits  

IBM
  • Austin, TX

IBM Global Business Services: Join a Leader. Consult with us. IBM Global Business Services helps top-tier clients solve their most complex business and technical issues. As the Advanced Analytics Leader, you will deliver innovative business consulting, business process design, systems integration, and application design and management to leading commercial and public-sector organizations in 17 industries worldwide. With access to resources that only a global leader can provide, as a consultant you will learn valuable skills, gain access to a vast and diverse network of talented professionals, and enjoy unparalleled career, training, and educational opportunities.
 

As the Advanced Analytics Leader you will have an analytics background with in-depth knowledge of SAP, HANA, Big Data/Hadoop and machine learning. The responsibilities include delivery on consulting engagements, sales activities, and thought leadership. You will also have strong leadership acumen, an ability to operate in positions requiring significant self-direction and motivation, and a proven track record in consultative selling of analytics solutions to senior business and IT leaders. You will be empowered to manage multiple priorities, be capable of developing strong relationships at assigned accounts, and must be able to:
 

  • Lead and manage data science projects. Support machine learning offerings and be a thought leader in machine learning initiatives across analytics
  • Have a proven track record of drawing insights from data and translating those insights into tangible business outcomes
  • Ability to implement new technologies with cutting-edge machine learning and statistical modeling techniques
  • Establish and maintain deal focused trusted relationships with clients and partners to scope, solution, propose, close and deliver complex projects
  • Identify, notice, validate, and qualify opportunities and help close them on an as-needed basis. Maintain a strong pipeline of opportunities and progress them during the quarter
  • Recruit, motivate, mentor and develop team members

Bottom Line? We outthink ordinary. Discover what you can do at IBM.


Required Professional and Technical Expertise :

  • At least 5 years of experience in professional services consulting and sales at a national or global management consulting firm
  • At least 3 years of experience leading and delivering SAP HANA solutions: calculation view modeling, PAL, SQL scripting and performance tuning
  • At least 3 years of experience working on full life-cycle implementation projects with SAP
  • Strong understanding of SAP data (SAP ERP/CRM/APO) across modules (SD, MM, PP, FI) and SAP processes (O2C, R2R, P2P)
  • At least 2 years of experience working on data science projects with a variety of machine learning and data mining techniques (clustering, decision tree learning, GLM, Bayesian modeling, artificial neural networks, etc.)
  • Expertise using statistical computer languages (R, Python, HANA PAL, etc.) to manipulate data and draw insights from large data sets
  • Knowledge of working with HANA Studio/Web IDE with calculation views, stored procedures, flowgraphs and XS applications

Preferred Professional and Technical Expertise :

  • At least 5 years of experience in applying predictive analytic methodologies in a commercial environment
  • At least 5 years of experience in professional SAP services consulting and sales at a national or global management consulting firm
  • At least 5 years of experience working with SAP HANA solutions in areas of calculation view modeling, PAL, SQL scripting and performance tuning
  • At least 5 years of experience working on full life-cycle implementation projects with SAP
  • Strong understanding of SAP data (SAP ERP/CRM/APO) across modules (SD, MM, PP, FI) and SAP processes (O2C, R2R, P2P)
  • At least 5 years of deployment experience on data science projects with a variety of machine learning and data mining techniques (clustering, decision tree learning, artificial neural networks, etc.)
  • Master's in Mathematics, Statistics, Computer Science or a similar degree

BENEFITS
Health Insurance. Paid time off. Corporate Holidays. Sick leave. Family planning. Financial Guidance. Competitive 401K. Training and Learning. We continue to expand our benefits and programs, offering some of the best support, guidance and coverage for a diverse employee population.
  • http://www-01.ibm.com/employment/us/benefits/
  • https://www-03.ibm.com/press/us/en/pressrelease/50744.wss
 
CAREER GROWTH
Our goal is to be essential to the world, which starts with our people. Company wide we kicked off an internal talent strategy program called Go Organic. At our core, we are committed to believing and investing in our workforce through:
 
  • Skill development: helping our employees grow their foundational skills
  • Finding the dream job at IBM: navigating our company with the potential for many careers by channeling an employee's strengths and career aspirations
  • Diversity of people: Diversity of thought driving collective innovation
 
In 2015, Go Organic filled approximately 50% of our open positions with internal talent who were promoted into the role.


CORPORATE CITIZENSHIP
With an employee population of 375,000 in over 170 countries, amazingly we connect, collaborate, and care. IBMers drive a corporate culture of shared responsibility. We love grand challenges and everyday improvements for our company and for the world. We care about each other, our clients, and the communities we live, work, and play in!
  • http://www.ibm.com/ibm/responsibility/initiatives.html
  • http://www.ibm.com/ibm/responsibility/corporateservicecorps
State Farm
  • Dallas, TX

WHAT ARE THE DUTIES AND RESPONSIBILITIES OF THIS POSITION?

    • Performs improved visual representation of data to allow clearer communication, viewer engagement and faster/better decision-making
    • Investigates, recommends, and initiates acquisition of new data resources from internal and external sources
    • Works with IT teams to support data collection, integration, and retention requirements based on business need
    • Identifies critical and emerging technologies that will support and extend quantitative analytic capabilities
    • Manages work efforts which require the use of sophisticated project planning techniques
    • Applies a wide application of complex principles, theories and concepts in a specific field to provide solutions to a wide range of difficult problems
    • Develops and maintains an effective network of both scientific and business contacts/knowledge, obtaining relevant information and intelligence around the market and emergent opportunities
    • Contributes data to State Farm's internal and external publications, writes articles for leading journals and participates in academic and industry conferences
    • Collaborates with business subject matter experts to select relevant sources of information
    • Develops breadth of knowledge in programming (R, Python), descriptive, inferential, and experimental design statistics, advanced mathematics, and database functionality (SQL, Hadoop)
    • Develops expertise with multiple machine learning algorithms and data science techniques, such as exploratory data analysis, generative and discriminative predictive modeling, graph theory, recommender systems, text analytics, computer vision, deep learning, optimization and validation
    • Develops expertise with State Farm datasets, data repositories, and data movement processes
    • Assists on projects/requests and may lead specific tasks within the project scope
    • Prepares and manipulates data for use in development of statistical models
    • Develops fundamental understanding of insurance and financial services operations and uses this knowledge in decision making


Additional Details:

For over 95 years, data has been key to State Farm.  As a member of our data science team with the Enterprise Data & Analytics department under our Chief Data & Analytics Officer, you will work across the organization to solve business problems and help achieve business strategies.  You will employ sophisticated, statistical approaches and state of the art technology.  You will build and refine our tools/techniques and engage w/internal stakeholders across the organization to improve our products & services.


Implementing solutions is critical for success. You will do problem identification, solution proposal & presentation to a wide variety of management & technical audiences. This challenging career requires you to work on multiple concurrent projects in a community setting, developing yourself and others, and advancing data science both at State Farm and externally.


Skills & Professional Experience

·        Develop hypotheses, design experiments, and test feasibility of proposed actions to determine probable outcomes using a variety of tools & technologies

·        Master's or other advanced degree, or five years of experience in an analytical field such as data science, quantitative marketing, statistics, operations research, management science, industrial engineering, economics, etc., or equivalent practical experience preferred.

·        Experience with SQL, Python, R, Java, SAS or MapReduce, Spark

·        Experience with unstructured data sets: text analytics, image recognition etc.

·        Experience working w/numerous large data sets/data warehouses & ability to pull from such data sets using relevant programs & coding including files, RDBMS & Hadoop based storage systems

·        Knowledge in machine learning methods including at least one of the following: Time series analysis, Hierarchical Bayes; or learning techniques such as Decision Trees, Boosting, Random Forests.

·        Excellent communication skills and the ability to manage multiple diverse stakeholders across businesses & leadership levels.

·        Exercise sound judgment to diagnose & resolve problems within area of expertise

·        Familiarity with CI/CD development methods, Git and Docker a plus


Multiple location opportunity. Locations offered are: Atlanta, GA, Bloomington, IL, Dallas, TX and Phoenix, AZ


Remote work option is not available.


There is no sponsorship for an employment visa for the position at this time.


Competencies desired:
Critical Thinking
Leadership
Initiative
Resourcefulness
Relationship Building
Eliassen Group
  • Atlanta, GA

Machine Learning Engineer

$120-135k+ annual compensation plus bonus potential (flexible for the right person)


We are seeking an experienced Machine Learning Engineer to join our Next Generation software team building an innovative platform that will be utilized worldwide. In this role you will play a pivotal part in innovating new, globally consumed B2B software products.


Qualifications: 

  • Theoretical and practical understanding of data mining and machine learning techniques such as: GLM/Regression, Random Forest, Boosting, Trees, text mining, Deep learning, social network analysis, Optimization, NLP, Probabilistic Inference, Information Retrieval, Recommendation Systems.
  • Strong coding and debugging skills in one or more of the following technologies: Java, Python, PySpark ML, R, H2O, sparklyr, pandas, scikit-learn, Spark MLlib


Please forward resumes to Steve Fritsch (sfritsch@eliassen.com)

Grubhub
  • Philadelphia, PA

More About the Role:

Grubhub is looking for an experienced Director of Data Engineering to build and lead all our Data Warehouse and Data Pipeline efforts. You will be heading multiple teams of passionate and skilled database engineers responsible for building and owning batch and streaming data pipelines that process tens of terabytes of data daily and support all of the analytics, business intelligence, data science and reporting data needs across the organization.

Grubhub's big data platforms are cutting edge and built primarily around a few core technologies: AWS EMR, Hive, Cassandra, S3 and Redshift for data storage; Apache Spark, Python and Scala for data processing; the Presto query engine; and Azkaban for workflow management. You will also be interacting with several other technologies in the ecosystem, primarily from acquisitions, partners or affiliates, including MySQL, Postgres, SQL Server, Salesforce, etc.


The Director of Web Engineering is responsible for the overall planning, organizing, and execution of the consumer-facing web applications. This includes directing all engineering efforts to meet product requirements, support, monitoring, and iterative improvement of existing web applications, strategic development of new application solutions when necessary and the high-level architecture and detailed design on the implementation and delivery of web application components. The role requires proven experience in planning, designing, developing and deploying performant, scalable and resilient web application systems.


Some Challenges You'll Tackle:

  • Understand and manage the data needs of different stakeholders across multiple business verticals including Finance, Marketing, Logistics, Product, etc. Develop the vision and map the strategy to provide proactive solutions and enable stakeholders to extract insights and value from data.
  • Understand end-to-end data management interactions and dependencies across complex data pipelines and data transformations and how they impact business decisions
  • Design best practices for big data processing, data modeling and warehouse development throughout the company
  • Translate from technical to business, and vice versa. You need to be able to speak with the least technically-minded client (internal or external) and make technology make sense to them. Then turn around and do it the other way
  • Evaluate new technologies and solutions to solve business problems

What's Actually Required:

  • 5+ years in a leadership/management capacity around data engineering, building data warehouse, data mart and data pipelines
  • Experience managing multiple stakeholders and managing team through multiple team leads
  • Experience with big data architectures and data modeling to efficiently process large volumes of data
  • Experience developing large data processing pipelines on Apache Spark.
  • Experience with Python or Scala programming languages
  • SQL is virtually as effortless as your native spoken language
  • Background in ETL and data processing, know how to transform data to meet business goals
  • Experience running agile projects
  • Excellent communication, adaptability and collaboration skills

 And Of Course, Perks! 

  • Flexible PTO. It's true, no strings attached and all the time you need to recharge.
  • Better Benefits. Get quality insurance, flex-spending accounts, retirement options and commuter perks.  
  • Free Food. Kitchens are stocked and free Grubhub each week.
  • Stock Up. All of our employees are owners; in fact, they're granted Restricted Stock Units, which means we're all in it to win it.
  • Casual Culture. Catch rays on the rooftop or get comfy on a couch and get to know your coworkers, because work should be a place you want to be.

Grubhub is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics. The EEO is the Law poster is available here: DOL Poster. If you are applying for a job in the U.S. and need a reasonable accommodation for any part of the employment process, please send an e-mail to talentacquisition@grubhub.com and let us know the nature of your request and contact information. Please note that only those inquiries concerning a request for reasonable accommodation will be responded to from this e-mail address.

Vaco - San Diego
  • San Diego, CA

Senior Backend Engineer- Context Awareness Team




About Us:


We are a platform for today's busy families, bringing them closer together by helping them better sync, communicate with and protect the people they care about most.


Our mobile app provides millions of families in over 140 countries with services such as private location sharing, location history, drive details, crash detection, roadside assistance and help alerts through our free and paid membership subscription.


Founded in 2008, we are based in San Francisco with offices in San Diego, Las Vegas and Ft. Lauderdale.


We have raised +$100M from investors such as Bessemer Venture Partners, DCM, Fontinalis Partners, BMW iVentures, Allstate, Bullpen Capital, Founders Fund (FF Angel), Launch Capital, Kapor Capital, and 500 startups.




About the Context Awareness Backend role:


* Work closely within a cross-platform team which provides up-to-date and real-time location and driving information to the families which use our app


* Build and support an engine for collecting, processing, and storing tens of thousands of location saves per second


* Build and support the systems which collect, process, and store millions of drives daily


* Maintain and improve the systems which alert users in real-time when a vehicular collision occurs


* Research, prototype, and build new systems to provide location context to users, potentially using machine learning


* Ensure that our APIs are able to process millions of requests per minute, looking for ways to scale us up by 5x over the next few years


* Be a very active contributor to our diverse codebase; we have a lot of Java, are growing in Scala, and have legacy systems in PHP, Python, and Golang


* Engage with feature developers to ensure code is written with performance, scale, and maintainability in mind


* Use automation tools as often as possible, and develop and improve these tools


* Handle 4.5 billion daily API calls comfortably






You're that someone with these relevant skills:


* Proficient in JVM languages. This team uses primarily Java (Spring Boot) and Scala (Akka, Lagom); deep knowledge of either is required, of both is great


* Familiarity with PHP, Python, and/or Go (to maintain/convert our legacy code bases) are pros


* Excellent understanding of data stores, distributed systems, data modeling and their associated pitfalls.


* Several years' experience with microservices


* Experience with the AWS environment and its various tools


* Agile software development experience


* Ability to work in a cross-functional team


* A desire to bring innovative solutions to the challenges of scaling the API and platform




Some of the things you'll do:


* Build new services in Java and/or Scala


* Break up legacy monoliths in PHP and Python into Java and/or Scala microservices


* Design new systems


* Conduct technical and code reviews and approve pull requests


* Take specs and translate them into reality




Successful candidates will have:


* Minimum 5 years of relevant experience required


* Strong attention to detail


* A commitment to the importance of craftsmanship and incremental delivery


* Comfort with the uncertainty of working with new technologies


* Strong and clear communication skills


* Ability to work effectively with remote teammates


* A sense of humor and the ability not to take yourself too seriously






Perks:


* Competitive pay and benefits


* Medical, dental and vision insurance plans


* 401k plan


* $200/month Quality of Life perk


* Whatever makes you stronger makes us stronger. We buy you the things you need to improve yourself and get your job done.

Blue Chip Talent
  • Ann Arbor, MI

Summary:
This position is part of the business intelligence team and will be responsible for projects including BI and Analytics but with a lot of opportunity to innovate by incorporating advanced capabilities of descriptive and predictive insights into BI Deliverables. As a dynamic and effective BI team member, you will liaise with business across all domains to understand their growing analytic data needs and develop and operationalize solutions with business impact. The ideal candidate will leverage their knowledge of business intelligence tools, statistical and data mining techniques, data warehouse and SQL to find innovative solutions utilizing data from multiple sources. We are looking for a strong team member who can communicate with the business as well as IT.

GENERAL RESPONSIBILITIES

  • Drive the utilization of new data sources for impactful business insights
  • Condense large data sets into clear and concise observations and recommendations
  • Design and develop BI/Analytics solution to facilitate end user experience and impact the business
  • Generate new ideas and execute on solutions
  • Demonstrate expert knowledge of at least one analytics tool
  • Understand and apply advanced techniques for data extraction, cleaning and preparation tasks
  • Understand dimensional modeling and data warehouse concepts
  • Able to clearly articulate findings and answers through effective data visualization approaches
  • Work with stakeholders to determine analytics data requirements and implement solutions to provide actionable business insight
  • Serve as a mentor for other team members
  • Perform BI on call duties
  • Regularly share best practices to help develop others
  • Able to effectively communicate with multiple stakeholders across the organization
  • Able to work in a team-oriented, collaborative, fast-paced and diverse environment
  • 5+ years of Experience with implementing and supporting BI/Analytics solutions
  • 5+ years of Experience with databases and querying language, particularly SQL
  • 3+ years of experience with statistical tools such as R and SAS
  • 5+ years of experience delivering BI/DW solutions within an Agile construct
  • Experience working with large data sets; experience with distributed computing tools (Hadoop, Map/Reduce, Hive, Spark) and other emerging technologies is highly desired
  • Familiarity with statistical methods used in descriptive and inferential statistics
  •  Knowledge of statistical modeling and machine learning is preferable
  • Fluency with programming language such as Python
Elev8 Hire Solutions
  • Atlanta, GA

Sr. Python Developer/Data Scientist

We are an AI parts inventory optimization software that makes it easy for enterprises to manage their maintenance and repair operations (MRO). We've raised millions in venture funding from both Silicon Valley and Boston and are growing fast. Are you a driven software engineer interested in being a part of that growth?

Role Expectations/Responsibilities:

  • Write clean, maintainable, thoroughly tested code
  • Participate in product, design, and code reviews
  • QA and ship code daily
  • Identify, incorporate, and communicate best practices

Required Skills:

  • 10+ years of software engineering experience
  • Proficient in Python
  • Proficient in SciPy and scikit-learn
  • Experience in a python testing framework
  • Proficient in neural network theory
  • Proficient in general machine learning algorithms
  • Proficient in using TensorFlow
  • Proficient in using Keras
  • Proficient in architecting, testing, optimizing, and deploying deep learning models (a brief sketch follows this list)
  • Competency in Git
  • Experience with data structures, algorithm design, problem-solving, and complexity analysis
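
A minimal Keras sketch of the kind of model work listed in the required skills above; the synthetic data, layer sizes and training settings are illustrative only, not a prescribed architecture.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic stand-in for part-demand features and a binary reorder label.
rng = np.random.default_rng(42)
X = rng.normal(size=(512, 20)).astype("float32")
y = (X[:, :2].sum(axis=1) > 0).astype("float32")

# Small feed-forward network; the layer sizes here are arbitrary choices.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]
```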

Nice to have:

  • Experience in a startup environment
  • Experience working on a small team with high visibility
  • Ability to handle a high volume of projects

Benefits:

  • Health/dental/vision coverage
  • 15 Days PTO
  • Option for 1 day remote
Booster Fuels
  • Dallas, TX

About Booster

Do you want to impact the lives of nearly all Americans by eliminating the errand of going to the gas station? Booster has redesigned the infrastructure to deliver gas and replaced the gas station with an app. Our technology and operational expertise enable Booster to deliver fresh gas at the same price as traditional gas stations. This is a difficult problem to master and we are making it happen. Every day, we solve incredibly hard problems to create an experience for our customers that is absolutely magical.


Booster is powered by data. We are creating the best way for consumers and businesses to get gas by applying data, algorithms and machine learning to problems in logistics, retail, personalization, pricing and more.


Are you interested in working at the intersection of applied quantitative research, engineering development, and data science? If so, this is the job for you.


About the Role

The Booster Operations Team is looking for a passionate and solution-oriented Operations Data Analyst to develop and implement analytical solutions that deepen our collective understanding of customers, influence innovation, and deliver actionable insights to accelerate the growth and efficiency of Booster operations.


This position requires a deep quantitative domain expertise along with the ability to manage internal and external stakeholder teams. You will be responsible for delivering key insights to the operations team, the development of core KPIs and metrics, partnering to design optimized schedules and routes, balancing supply and demand, and analyzing and providing guidance on process stability and quality, with a keen focus on data quality and completeness.


You must be passionate about and undaunted by an operations engine with trucks, drivers, complex supply chains, and rapid growth. Successful candidates will be entrepreneurial, discontent with the status-quo and obsessed with improving anything they touch.


What You Will Be Doing

  • Delight customers

  • Work on a dynamic and constantly changing delivery platform

  • Ensure customer orders are planned efficiently and delivered on time

  • Come up with scalable solutions to continuously evolving logistics problems

  • Build predictive models

  • Forecast supply and demand

  • Optimize delivery network and dispatch operations

  • Forecast consumer demand

  • Estimated time of arrival

  • Partner with Operations and Technology leaders to build data-centric solutions to business impact

  • Help us realize a data-informed culture and teach people how to Think Like A Scientist

  • Find other exciting problems to solve
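As a rough illustration of the forecasting items above, the sketch below fits a simple day-of-week-plus-trend demand model on synthetic order counts; the data and column names are invented for the example and do not come from Booster's systems.

# Illustrative sketch only: synthetic order history, hypothetical column names.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
days = pd.date_range("2024-01-01", periods=120, freq="D")
orders = 200 + 30 * np.sin(2 * np.pi * days.dayofweek / 7) + rng.normal(0, 10, len(days))
history = pd.DataFrame({"date": days, "orders": orders})

# Simple feature set: day of week plus a linear trend term
X = np.column_stack([history["date"].dt.dayofweek, np.arange(len(history))])
y = history["orders"].to_numpy()
model = LinearRegression().fit(X, y)

# Forecast the next 7 days of demand
future = pd.date_range(days[-1] + pd.Timedelta(days=1), periods=7, freq="D")
X_future = np.column_stack([future.dayofweek, np.arange(len(history), len(history) + 7)])
print(pd.Series(model.predict(X_future), index=future).round(1))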


Requirements

  • Bachelor's degree or higher in Operations Research, Applied Mathematics or a related field

  • Comfortable with research methodologies to address abstract business and product problems with utmost precision, while staying grounded in common sense. Makes complex problems simple.

  • 4+ years of strong quantitative industry experience (2+ years with a PhD)

  • Expertise in mathematical optimization and implementing tailored solution approaches (see the sketch after this list)

  • Experience in data mining, predictive modeling and statistical analysis

  • Solid common sense and business acumen

  • Experience writing production applications using Python, R, and SQL

  • Superior analytical skills and a strong sense of ownership in your work

  • Self-motivated drive to build, launch and iterate on products

  • Comfortable giving definition to ambiguous problems, can do this independently with limited guidance
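As a hedged illustration of the optimization expertise called for above, the sketch below solves a toy van-to-zone assignment problem as a small linear program with SciPy; the drive times, vans, and zones are made up and are not taken from Booster's dispatch network.

# Illustrative sketch only: synthetic drive times, hypothetical vans and zones.
import numpy as np
from scipy.optimize import linprog

drive_minutes = np.array([   # rows: vans, cols: service zones (made-up data)
    [12, 25, 31],
    [20, 14, 28],
    [27, 22, 10],
])
n_vans, n_zones = drive_minutes.shape

# Decision variables x[i, j], flattened row-major: van i covers zone j.
c = drive_minutes.ravel()
A_eq, b_eq = [], []
for i in range(n_vans):                 # each van is assigned exactly one zone
    row = np.zeros(n_vans * n_zones)
    row[i * n_zones:(i + 1) * n_zones] = 1
    A_eq.append(row)
    b_eq.append(1)
for j in range(n_zones):                # each zone is covered by exactly one van
    row = np.zeros(n_vans * n_zones)
    row[j::n_zones] = 1
    A_eq.append(row)
    b_eq.append(1)

res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, 1), method="highs")
assignment = res.x.reshape(n_vans, n_zones).round()
print(assignment, "total minutes:", res.fun)

Because the assignment constraint matrix is totally unimodular, the linear relaxation already returns an integral (0/1) assignment, so no integer solver is needed for this toy case.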


Benefits

  • Stay healthy: 100% employer paid medical, dental, vision, disability and life insurance coverage

  • Refuel: open vacation policy, take the time you need when you need it

  • Monthly team building events (yoga, safari adventures, wine blending, paint nights and bocce tournaments to name a few)

  • Early stock options at a fast-growing startup with strong VC backing


Individuals seeking employment at Booster are considered without regard to race, religion, color, national origin, ancestry, sex, gender, gender identity, gender expression, sexual orientation, marital status, age, physical or mental disability or medical condition (except where physical fitness is a valid occupational qualification), genetic information, veteran status, or any other consideration made unlawful by federal, state or local laws.


Booster does not discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant.


Disclaimer: The above statements are intended to describe the general nature and level of work being performed by associates assigned to this classification. They are not to be construed as an exhaustive list of all responsibilities, duties, and skills required of personnel so classified. All personnel may be required to perform duties outside of their normal responsibilities from time to time, as needed.


Booster doesn't accept unsolicited agency resumes and won't pay fees to any third-party agency or firm that doesn't have a signed agreement with Booster.

Levatas
  • Palm Beach Gardens, FL

Levatas is an AI solutions company. We help our partners understand and deploy artificial intelligence solutions across their enterprise using Data Science, Machine Learning, Predictive Analytics, Automation, and Natural Language Processing.



Our ability to create the future of business is only as strong as the smart, creative people who make up our team. We believe that doing the best work of our lives shouldn't come at the expense of happiness and balance, which is why we're consistently recognized as a best place to work and top company culture.



Levatas is seeking a qualified Senior Software Engineer to join our Technology team at Levatas headquarters. We're looking for someone who's ready to do the best work of their career, architecting, designing, and implementing software solutions, with specific concentration in cross-disciplinary problem solving and collaborative development for our growing base of first-class business clients.



Core responsibilities:




  • Design, develop, and maintain software solutions to meet clients' business requirements

  • Develop and document project deliverables such as requirement specifications, proposed solutions, detailed designs, project plans, system impact analysis and proof of concepts

  • Program, test, build, integrate and implement web based multi-tier applications of varying complexities

  • Analyze and fully understand project requirements to formulate and implement programmatic solutions that efficiently and effectively address clients' requirements

  • Integrate applications by designing database architecture and server scripting; studying and establishing connectivity with network systems, search engines, and information servers

  • Use engineering principles to conduct technical investigations involved with the modification of material, component, or process specifications and requirements

  • Advise, mentor, train or assist engineers and developers at other skill levels, as needed, to ensure timely releases of high quality code

  • Provide technical consulting services for projects and production system issues.



The following are profile items that interest us:




  • 2-5 years of experience in Amazon Web Services (AWS) native serverless application development

  • Experience working with various AWS services such as EC2, ECS, EBS, S3, Glacier, SNS, SQS, IAM, Auto Scaling, OpsWorks, Route 53, VPC, CloudFront, Direct Connect, CloudTrail, and CloudWatch, and building CI/CD pipelines on AWS using CodeCommit, CodeBuild, CodeDeploy, and CodePipeline.

  • 3-5 years of Node.js development experience, preferably writing Lambda functions in AWS (a brief handler sketch follows this list).

  • 5 years of full-time work experience in software engineering, information technology, or related domains.

  • An unlimited passion for software development

  • Willing to work across the stack to tackle technical challenges anywhere in the system.

  • Interest in working in a cross-functional team that touches many of the core systems and user flows of our customers.

  • Demonstrable experience in compiled languages such as C# (.NET), Java, Swift, or Kotlin.

  • Demonstrable experience working with relational and NoSQL database technologies

  • Demonstrable experience building web applications with HTML5, JavaScript, and CSS, using JavaScript frameworks like AngularJS, Vue.js, or React

  • Knowledge and understanding of data science, machine learning, TensorFlow (or a similar platform like Keras), and Python a huge plus

  • Knowledge and understanding of data transformation processes, ETL, etc. a plus.

  • Experience with designing and building large-scale production systems or features.

  • Ability to leverage and integrate with third party APIs.

  • Experience with SOA (Service Oriented Architecture) designs a plus.

  • Advanced analytical thinking; experienced with making product decisions based on data and A/B testing.

  • Exposure to architectural patterns of a large, high-scale web application.

  • Strong interpersonal and communication skills

  • Experience working with Scrum or a similar Agile management process
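For a rough sense of the AWS serverless pattern the profile items above describe, here is a minimal Lambda handler sketch. The role itself emphasizes Node.js Lambdas; Python is used here purely for illustration, and the trigger, bucket, and key names are hypothetical.

# Minimal sketch only: assumes an S3 put-event trigger; bucket names are hypothetical.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Read the object that triggered the event and write a small summary record."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    summary = {"source_key": key, "length": len(body)}

    s3.put_object(
        Bucket="example-processed-bucket",        # hypothetical target bucket
        Key=f"summaries/{key}.json",
        Body=json.dumps(summary).encode("utf-8"),
    )
    return {"statusCode": 200, "body": json.dumps(summary)}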



This role is based in Palm Beach County, Florida.

DTTS
  • Nashville, TN
Deloitte Transactions and Business Analytics LLP advises clients on managing business controversy and conflict, executing deals, and maintaining regulatory compliance. We provide services to companies throughout their lifecycle from purchasing a company to investigating potential fraud.

 

In 2008, Deloitte Transactions and Business Analytics LLP (DTBA) opened a new facility in Nashville, TN for e-discovery and computer forensics work. DTBA's E-Discovery Solutions Center ("EDSC") continues to increase the capacity, quality, and efficiency of DTBA's e-file processing by utilizing state-of-the-art technology in a dedicated environment. We are looking for dedicated individuals to join DTBA's Analytic & Forensic Technology practice and help drive the success of the EDSC.

 

The EDSC is a 24x7, 365-days-a-year operation. The position is located in Nashville, Tennessee. Our operators perform the work as well as quality checkpoints. Typically, each team has a first-, second-, and weekend-shift representative.

Work you'll do:

* Utilize EDSC eDiscovery applications and internal systems following documented processes and procedures
* Process data for loading into hosted review databases
* Identify and escalate issues as encountered
* Validate data throughout processes and update processing group documents
* Assist the processing team with identifying areas for documentation and recommend improvements.
* Perform quality checks per documented processes
* Prepare client data deliverables
* Work flexible schedules to meet project deadlines
 

Qualifications:

* Bachelor's degree preferred, or commensurate work experience
* 1+ years of related E-Discovery experience
* Experience with one or more of the following preferred: SQL, SQL+, Visual Basic/VBScript, MS Access, C++, C#, XML, Java, ASP, Windows
* Excellent verbal and written communication skills
* Working knowledge of basic PC functions and Windows environment (e.g. MS Office)
* General knowledge of databases, networking, and systems concepts
* Experience with both traditional (Term and Boolean) as well as concept-based search tools and technologies
* Hands on experience with several ESI data processing platforms (e.g. Clearwell, eCapture, etc.)
* Intermediate knowledge of several ESI data hosting platforms (e.g. Concordance, Relativity, Clearwell, etc.)
 

The Team

In today's global marketplace, organizations can become vulnerable to critical incidents that include international corruption, financial crime, enterprise fraud, cybercrime and supply chain breakdowns. Utilizing market-leading technology to uncover latent possibilities, our team advises clients on ways to mitigate exposure to these threats and turn business issues into opportunities for growth, resilience and long-term advantage. Learn more about Deloitte Advisory's Forensics and Investigations practice.

 

How you'll grow

At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to help sharpen skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center.

 

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

 

Deloitte's culture

Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives.  Learn more about Life at Deloitte.

 

Corporate citizenship

Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte's impact on the world.

 

Recruiter tips

We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.