Deloitte Consulting LLP seeks a Consulting, Senior Consultant in Boston, Massachusetts and various unanticipated Deloitte office locations and client sites nationally.
Work You’ll Do
- Modernize business and core technology environments to leverage technology innovations across multiple platforms, and drive operational efficiency by maintaining client data ecosystems and sourcing analytics expertise for continuous insights and improvements.
- Drive various aspects of the full data delivery lifecycle for implementations, which may include requirements gathering, business process analysis, scripting, code deployments, enhancement validation, triage of defects and change requests, and handling of escalations and urgent issues.
- Create architecturally significant non-functional requirements to design reusable data ingestion and data management components.
- Establish data ingestion patterns and design data integration interfaces with internal and external data sources.
- Drive advisory and implementation services for large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights across platforms and help companies unlock the value of technology investments.
- Provide technical recommendations for optimized data access and retention across data warehouses.
- Create technical and functional specifications, including screen layouts, navigation flows, logic diagrams, data models, and process models.
- Design and implement solutions to extract and reconcile data scattered across multiple sources.
- 50% Travel required nationally.
- Telecommuting permitted.
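By way of illustration only (this sketch is not part of the posting), the last responsibility above, reconciling data scattered across multiple sources, might look like the following in pandas; the source names and columns are hypothetical:

```python
# Minimal sketch (hypothetical sources/columns): reconciling customer records
# held in two systems by outer-joining on a shared key and flagging rows that
# appear in only one source or disagree between sources.
import pandas as pd

crm = pd.DataFrame({"customer_id": [1, 2, 3],
                    "email": ["a@x.com", "b@x.com", None]})
billing = pd.DataFrame({"customer_id": [2, 3, 4],
                        "email": ["b@x.com", "c@x.com", "d@x.com"]})

# indicator=True adds a "_merge" column marking left_only / right_only / both.
merged = crm.merge(billing, on="customer_id", how="outer",
                   suffixes=("_crm", "_billing"), indicator=True)

# Rows missing from one source, or with conflicting emails, need reconciliation.
mismatches = merged[(merged["_merge"] != "both") |
                    (merged["email_crm"] != merged["email_billing"])]
```

Here ids 1 and 4 exist in only one system and id 3 has conflicting emails, so three of the four merged rows are flagged for review.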
Requirements
Bachelor’s degree or foreign equivalent in Business Administration, any STEM field, or a related field. Must have 2 years of related work experience in the job offered or in a related occupation. Must also have 2 years of related work experience involving each of the following:
- Using scalable cloud-based technologies, including Amazon Web Services (AWS) and Azure, to build reusable systems that generate insights;
- Providing industry insight and analytical support using data mining, pattern matching, data visualization, and predictive modeling tools, including TensorFlow, Keras, scikit-learn, SciPy, Matplotlib, statsmodels, spaCy, and Gensim, and consolidating the implications to present recommendations to senior management and executives;
- Designing and implementing reporting and visualization for unstructured and structured data sets, utilizing Python and Python libraries, including pandas, NumPy, the Natural Language Toolkit (NLTK), Gensim, Seaborn, and Plotly;
- Designing and developing data cleansing routines utilizing data quality functions, including standardization, transformation, rationalization, linking and matching;
- Delivering scalable analytic solutions using distributed computing in production systems;
- Performing data analysis, including inspecting, cleansing, transforming and modeling, for enabling data mining and resolving data mapping and data modeling issues;
- Providing insights into businesses using R, Python, data modeling in Microsoft SQL Server or PostgreSQL, and the statsmodels, sklearn, dplyr, ggplot, and Shiny libraries; and
- Building reports and data models in Tableau or front-end systems based on performance measurement metrics, to provide insights into the organization's data.
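As an illustration only (not part of the posting's requirements), a data cleansing routine of the kind described above, covering standardization and matching, might be sketched as follows; the field values and function name are hypothetical:

```python
# Minimal sketch of a data-cleansing routine: standardize free-text names so
# that variant spellings link/match to one canonical key. Inputs are made up.
import re

def standardize_name(raw: str) -> str:
    """Strip punctuation, trim, uppercase, and collapse internal whitespace."""
    cleaned = re.sub(r"[^\w\s]", "", raw).strip().upper()
    return re.sub(r"\s+", " ", cleaned)

records = ["  Acme, Inc. ", "ACME INC", "acme   inc."]
standardized = {standardize_name(r) for r in records}
# All three variants collapse to the single canonical key "ACME INC".
```

In practice the same idea extends to transformation and rationalization steps (mapping abbreviations to canonical forms, validating against reference data) before linking records across systems.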
EOE.