Duration 3 Days 18 CPD hours This course is intended for This is an intermediate and beyond-level course geared for experienced Python developers looking to delve into the exciting field of Natural Language Processing. It is ideally suited for roles such as data analysts, data scientists, machine learning engineers, or anyone working with text data and seeking to extract valuable insights from it. If you're in a role where you're tasked with analyzing customer sentiment, building chatbots, or dealing with large volumes of text data, this course will provide you with practical, hands-on skills that you can apply right away. Overview This course combines engaging instructor-led presentations and useful demonstrations with valuable hands-on labs and engaging group activities. Throughout the course you'll: Master the fundamentals of Natural Language Processing (NLP) and understand how it can help in making sense of text data for valuable insights. Develop the ability to transform raw text into a structured format that machines can understand and analyze. Discover how to collect data from the web and navigate through semi-structured data, opening up a wealth of data sources for your projects. Learn how to implement sentiment analysis and topic modeling to extract meaning from text data and identify trends. Gain proficiency in applying machine learning and deep learning techniques to text data for tasks such as classification and prediction. Learn to analyze text sentiment, train emotion detectors, and interpret the results, providing a way to gauge public opinion or understand customer feedback. The Hands-on Natural Language Processing (NLP) Boot Camp is an immersive, three-day course that serves as your guide to building machines that can read and interpret human language. NLP is a unique interdisciplinary field, blending computational linguistics with artificial intelligence to help machines understand, interpret, and generate human language. In an increasingly data-driven world, NLP skills provide a competitive edge, enabling the development of sophisticated projects such as voice assistants, text analyzers, chatbots, and so much more. Our comprehensive curriculum covers a broad spectrum of NLP topics. Beginning with an introduction to NLP and feature extraction, the course moves on to the hands-on development of text classifiers and exploration of web scraping and APIs, before delving into topic modeling, vector representations, text manipulation, and sentiment analysis. Half of your time is dedicated to hands-on labs, where you'll experience the practical application of your knowledge, from creating pipelines and text classifiers to web scraping and analyzing sentiment. These labs serve as a microcosm of real-world scenarios, equipping you with the skills to efficiently process and analyze text data. Time permitting, you'll also explore modern tools like Python libraries, the OpenAI GPT-3 API, and TensorFlow, using them in a series of engaging exercises. By the end of the course, you'll have a well-rounded understanding of NLP, and will leave equipped with the practical skills and insights that you can immediately put to use, helping your organization gain valuable insights from text data, streamline business processes, and improve user interactions with automated text-based systems. You'll be able to process and analyze text data effectively, implement advanced text representations, apply machine learning algorithms for text data, and build simple chatbots.
Launch into the Universe of Natural Language Processing The journey begins: Unravel the layers of NLP Navigating through the history of NLP Merging paths: Text Analytics and NLP Decoding language: Word Sense Disambiguation and Sentence Boundary Detection First steps towards an NLP Project Unleashing the Power of Feature Extraction Dive into the vast ocean of Data Types Purification process: Cleaning Text Data Excavating knowledge: Extracting features from Texts Drawing connections: Finding Text Similarity through Feature Extraction Engineer Your Text Classifier The new era of Machine Learning and Supervised Learning Architecting a Text Classifier Constructing efficient workflows: Building Pipelines for NLP Projects Ensuring continuity: Saving and Loading Models Master the Art of Web Scraping and API Usage Stepping into the digital world: Introduction to Web Scraping and APIs The great heist: Collecting Data by Scraping Web Pages Navigating through the maze of Semi-Structured Data Unearth Hidden Themes with Topic Modeling Embark on the path of Topic Discovery Decoding algorithms: Understanding Topic-Modeling Algorithms Dialing the right numbers: Key Input Parameters for LSA Topic Modeling Tackling complexity with Hierarchical Dirichlet Process (HDP) Delving Deep into Vector Representations The Geometry of Language: Introduction to Vectors in NLP Text Manipulation: Generation and Summarization Playing the creator: Generating Text with Markov Chains Distilling knowledge: Understanding Text Summarization and Key Input Parameters for TextRank Peering into the future: Recent Developments in Text Generation and Summarization Solving real-world problems: Addressing Challenges in Extractive Summarization Riding the Wave of Sentiment Analysis Unveiling emotions: Introduction to Sentiment Analysis Tools Demystifying the TextBlob library Preparing the canvas: Understanding Data for Sentiment Analysis Training your own emotion detectors: Building Sentiment Models Optional: Capstone Project Apply the skills learned throughout the course. Define the problem and gather the data. Conduct exploratory data analysis for text data. Carry out preprocessing and feature extraction. Select and train a model. Evaluate the model and interpret the results. Bonus Chapter: Generative AI and NLP Introduction to Generative AI and its role in NLP. Overview of Generative Pretrained Transformer (GPT) models. Using GPT models for text generation and completion. Applying GPT models for improving autocomplete features. Use cases of GPT in question answering systems and chatbots. Bonus Chapter: Advanced Applications of NLP with GPT Fine-tuning GPT models for specific NLP tasks. Using GPT for sentiment analysis and text classification. Role of GPT in Named Entity Recognition (NER). Application of GPT in developing advanced chatbots. Ethics and limitations of GPT and generative AI technologies.
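To give a feel for the lab work, here is a minimal sketch of two techniques named in the outline: sentiment scoring with the TextBlob library and a simple text-classification pipeline. It assumes TextBlob and scikit-learn are installed; the example texts, labels, and the choice of a TF-IDF plus logistic regression pipeline are illustrative placeholders, not course materials.

```python
# Sentiment analysis and a text-classification pipeline - toy data for illustration only.
from textblob import TextBlob
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Sentiment: TextBlob returns polarity in [-1, 1] and subjectivity in [0, 1].
review = TextBlob("The support team was quick and genuinely helpful.")
print(review.sentiment.polarity, review.sentiment.subjectivity)

# Classification: vectorize raw text with TF-IDF, then fit a linear classifier.
docs = ["refund not processed", "love the new dashboard",
        "app crashes on login", "great customer service"]
labels = ["complaint", "praise", "complaint", "praise"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("model", LogisticRegression(max_iter=1000)),
])
clf.fit(docs, labels)
print(clf.predict(["the app keeps crashing"]))  # -> ['complaint'] on this toy data
```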
Duration 3 Days 18 CPD hours This course is intended for This course is geared for attendees with Intermediate IT skills who wish to learn Computer Vision with TensorFlow 2 Overview This 'skills-centric' course is about 50% hands-on lab and 50% lecture, with extensive practical exercises designed to reinforce fundamental skills, concepts and best practices taught throughout the course. Working in a hands-on learning environment, led by our Computer Vision expert instructor, students will learn about and explore how to Build, train, and serve your own deep neural networks with TensorFlow 2 and Keras Apply modern solutions to a wide range of applications such as object detection and video analysis Run your models on mobile devices and web pages and improve their performance. Create your own neural networks from scratch Classify images with modern architectures including Inception and ResNet Detect and segment objects in images with YOLO, Mask R-CNN, and U-Net Tackle problems faced when developing self-driving cars and facial emotion recognition systems Boost your application's performance with transfer learning, GANs, and domain adaptation Use recurrent neural networks (RNNs) for video analysis Optimize and deploy your networks on mobile devices and in the browser Computer vision solutions are becoming increasingly common, making their way into fields such as health, automobile, social media, and robotics. Hands-On Computer Vision with TensorFlow 2 is a hands-on course that thoroughly explores TensorFlow 2, the brand-new version of Google's open source framework for machine learning. You will understand how to benefit from using convolutional neural networks (CNNs) for visual tasks. This course begins with the fundamentals of computer vision and deep learning, teaching you how to build a neural network from scratch. You will discover the features that have made TensorFlow the most widely used AI library, along with its intuitive Keras interface. You'll then move on to building, training, and deploying CNNs efficiently. Complete with concrete code examples, the course demonstrates how to classify images with modern solutions, such as Inception and ResNet, and extract specific content using You Only Look Once (YOLO), Mask R-CNN, and U-Net. You will also build generative adversarial networks (GANs) and variational autoencoders (VAEs) to create and edit images, and long short-term memory networks (LSTMs) to analyze videos. In the process, you will acquire advanced insights into transfer learning, data augmentation, domain adaptation, and mobile and web deployment, among other key concepts. Computer Vision and Neural Networks Computer Vision and Neural Networks Technical requirements Computer vision in the wild A brief history of computer vision Getting started with neural networks TensorFlow Basics and Training a Model TensorFlow Basics and Training a Model Technical requirements Getting started with TensorFlow 2 and Keras TensorFlow 2 and Keras in detail The TensorFlow ecosystem Modern Neural Networks Modern Neural Networks Technical requirements Discovering convolutional neural networks Refining the training process Influential Classification Tools Influential Classification Tools Technical requirements Understanding advanced CNN architectures Leveraging transfer learning Object Detection Models Object Detection Models Technical requirements Introducing object detection A fast object detection algorithm - YOLO Faster R-CNN -
a powerful object detection model Enhancing and Segmenting Images Enhancing and Segmenting Images Technical requirements Transforming images with encoders-decoders Understanding semantic segmentation Training on Complex and Scarce Datasets Training on Complex and Scarce Datasets Technical requirements Efficient data serving How to deal with data scarcity Video and Recurrent Neural Networks Video and Recurrent Neural Networks Technical requirements Introducing RNNs Classifying videos Optimizing Models and Deploying on Mobile Devices Optimizing Models and Deploying on Mobile Devices Technical requirements Optimizing computational and disk footprints On-device machine learning Example app - recognizing facial expressions
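As a taste of the "build, train, and serve" objective, here is a minimal sketch of a small CNN image classifier in TensorFlow 2 / Keras. The dataset (MNIST), layer sizes, and single training epoch are illustrative assumptions rather than the course's own exercises.

```python
# A small CNN classifier in TensorFlow 2 / Keras - illustrative sketch, not course code.
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # add a channel dimension, scale to [0, 1]
x_test = x_test[..., None] / 255.0

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))

model.save("mnist_cnn.keras")   # save the trained model for later serving or conversion
```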
Duration 4 Days 24 CPD hours This course is intended for This course is intended for: Developers Solutions Architects Data Engineers Anyone with little to no experience with ML and wants to learn about the ML pipeline using Amazon SageMaker Overview In this course, you will learn to: Select and justify the appropriate ML approach for a given business problem Use the ML pipeline to solve a specific business problem Train, evaluate, deploy, and tune an ML model using Amazon SageMaker Describe some of the best practices for designing scalable, cost-optimized, and secure ML pipelines in AWS Apply machine learning to a real-life business problem after the course is complete This course explores how to use the machine learning (ML) pipeline to solve a real business problem in a project-based learning environment. Students will learn about each phase of the pipeline from instructor presentations and demonstrations and then apply that knowledge to complete a project solving one of three business problems: fraud detection, recommendation engines, or flight delays. By the end of the course, students will have successfully built, trained, evaluated, tuned, and deployed an ML model using Amazon SageMaker that solves their selected business problem. Module 0: Introduction Pre-assessment Module 1: Introduction to Machine Learning and the ML Pipeline Overview of machine learning, including use cases, types of machine learning, and key concepts Overview of the ML pipeline Introduction to course projects and approach Module 2: Introduction to Amazon SageMaker Introduction to Amazon SageMaker Demo: Amazon SageMaker and Jupyter notebooks Hands-on: Amazon SageMaker and Jupyter notebooks Module 3: Problem Formulation Overview of problem formulation and deciding if ML is the right solution Converting a business problem into an ML problem Demo: Amazon SageMaker Ground Truth Hands-on: Amazon SageMaker Ground Truth Practice problem formulation Formulate problems for projects Module 4: Preprocessing Overview of data collection and integration, and techniques for data preprocessing and visualization Practice preprocessing Preprocess project data Class discussion about projects Module 5: Model Training Choosing the right algorithm Formatting and splitting your data for training Loss functions and gradient descent for improving your model Demo: Create a training job in Amazon SageMaker Module 6: Model Evaluation How to evaluate classification models How to evaluate regression models Practice model training and evaluation Train and evaluate project models Initial project presentations Module 7: Feature Engineering and Model Tuning Feature extraction, selection, creation, and transformation Hyperparameter tuning Demo: SageMaker hyperparameter optimization Practice feature engineering and model tuning Apply feature engineering and model tuning to projects Final project presentations Module 8: Deployment How to deploy, inference, and monitor your model on Amazon SageMaker Deploying ML at the edge Demo: Creating an Amazon SageMaker endpoint Post-assessment Course wrap-up Additional course details: Nexus Humans The Machine Learning Pipeline on AWS training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. 
Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for The Machine Learning Pipeline on AWS course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
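For readers who want a preview of the train-tune-deploy flow the modules describe, here is a minimal sketch using the SageMaker Python SDK (v2 assumed). The IAM role, S3 paths, container version, and hyperparameters are placeholders you would replace with values from your own AWS account; this is not the course's project code.

```python
# Train and deploy a built-in XGBoost model on SageMaker - placeholder names throughout.
import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"   # placeholder role ARN

xgb_image = image_uris.retrieve("xgboost", session.boto_region_name, "1.5-1")

estimator = Estimator(
    image_uri=xgb_image,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/model-artifacts/",   # placeholder bucket
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="binary:logistic", num_round=100)

# Training data is read from S3; channel names map to /opt/ml/input/data/<channel>.
estimator.fit({"train": "s3://my-bucket/train/",
               "validation": "s3://my-bucket/validation/"})

# Deploy the trained model behind a real-time endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```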
Duration 4 Days 24 CPD hours This course is intended for This course is geared for attendees with Intermediate IT skills who wish to learn Computer Vision with TensorFlow 2 Overview This 'skills-centric' course is about 50% hands-on lab and 50% lecture, with extensive practical exercises designed to reinforce fundamental skills, concepts and best practices taught throughout the course. Working in a hands-on learning environment, led by our Computer Vision expert instructor, students will learn about and explore how to Build, train, and serve your own deep neural networks with TensorFlow 2 and Keras Apply modern solutions to a wide range of applications such as object detection and video analysis Run your models on mobile devices and web pages and improve their performance. Create your own neural networks from scratch Classify images with modern architectures including Inception and ResNet Detect and segment objects in images with YOLO, Mask R-CNN, and U-Net Tackle problems faced when developing self-driving cars and facial emotion recognition systems Boost your application's performance with transfer learning, GANs, and domain adaptation Use recurrent neural networks (RNNs) for video analysis Optimize and deploy your networks on mobile devices and in the browser Computer vision solutions are becoming increasingly common, making their way into fields such as health, automobile, social media, and robotics. Hands-On Computer Vision with TensorFlow 2 is a hands-on course that thoroughly explores TensorFlow 2, the brand-new version of Google's open source framework for machine learning. You will understand how to benefit from using convolutional neural networks (CNNs) for visual tasks. This course begins with the fundamentals of computer vision and deep learning, teaching you how to build a neural network from scratch. You will discover the features that have made TensorFlow the most widely used AI library, along with its intuitive Keras interface. You'll then move on to building, training, and deploying CNNs efficiently. Complete with concrete code examples, the course demonstrates how to classify images with modern solutions, such as Inception and ResNet, and extract specific content using You Only Look Once (YOLO), Mask R-CNN, and U-Net. You will also build generative adversarial networks (GANs) and variational autoencoders (VAEs) to create and edit images, and long short-term memory networks (LSTMs) to analyze videos. In the process, you will acquire advanced insights into transfer learning, data augmentation, domain adaptation, and mobile and web deployment, among other key concepts. Computer Vision and Neural Networks Computer Vision and Neural Networks Technical requirements Computer vision in the wild A brief history of computer vision Getting started with neural networks TensorFlow Basics and Training a Model TensorFlow Basics and Training a Model Technical requirements Getting started with TensorFlow 2 and Keras TensorFlow 2 and Keras in detail The TensorFlow ecosystem Modern Neural Networks Modern Neural Networks Technical requirements Discovering convolutional neural networks Refining the training process Influential Classification Tools Influential Classification Tools Technical requirements Understanding advanced CNN architectures Leveraging transfer learning Object Detection Models Object Detection Models Technical requirements Introducing object detection A fast object detection algorithm - YOLO Faster R-CNN -
a powerful object detection model Enhancing and Segmenting Images Enhancing and Segmenting Images Technical requirements Transforming images with encoders-decoders Understanding semantic segmentation Training on Complex and Scarce Datasets Training on Complex and Scarce Datasets Technical requirements Efficient data serving How to deal with data scarcity Video and Recurrent Neural Networks Video and Recurrent Neural Networks Technical requirements Introducing RNNs Classifying videos Optimizing Models and Deploying on Mobile Devices Optimizing Models and Deploying on Mobile Devices Technical requirements Optimizing computational and disk footprints On-device machine learning Example app - recognizing facial expressions
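Since this edition of the course also covers transfer learning, here is a minimal sketch of reusing an ImageNet-pretrained ResNet50 backbone in Keras and training only a new classification head. The input size, number of target classes, and the commented-out training call are illustrative assumptions.

```python
# Transfer learning with a frozen ResNet50 backbone - illustrative sketch only.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

base = ResNet50(weights="imagenet", include_top=False, pooling="avg",
                input_shape=(224, 224, 3))
base.trainable = False   # freeze the pretrained convolutional backbone

model = models.Sequential([
    base,
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(5, activation="softmax"),   # 5 target classes (placeholder)
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(train_ds, validation_data=val_ds, epochs=5)  # supply your own tf.data datasets
```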
Duration 2 Days 12 CPD hours This course is intended for The course is designed for individuals who want to gain in-depth knowledge and practice in the discipline of managing requirements (Business Analysts, Requirements Engineers, Product Manager, Product Owner, Chief Product Owner, Service Manager, Service Owner, Project Manager, Consultants) Overview Students should be able to demonstrate knowledge, understanding, and application of Requirements Engineering principles and techniques. Key areas are: Requirements Engineering framework The hierarchy of requirements Key stakeholders in the framework Requirements elicitation Requirements modelling Requirements documentation Requirements analysis Requirements validation Requirements management The Business Analyst role analyzes, understands and manages the requirements in a customer-supplier relationship and ensures that the right products are delivered. The practical course provides in-depth knowledge and practice in Requirements Engineering. Course Introduction Let's Get to Know Each Other Course Overview Course Learning Objectives Course Structure Course Agenda Introduction to Business Analysis Structure and Benefits of Business Analysis Foundation Exam Details Business Analysis Certification Scheme What is Business Analysis? Intent and Context Origins of business analysis The development of business analysis The scope of business analysis work Taking a holistic approach The role and responsibilities of the business analyst The competencies of a Business Analyst Personal qualities Business knowledge Professional techniques The development of competencies Strategy Analysis The context for strategy The definition of strategy Strategy development External environmental analysis Internal environmental analysis SWOT analysis Executing strategy Business Analysis Process Model An approach to problem solving Stages of the business analysis process model Objectives of the process model stages Procedures for each process model stage Techniques used within each process model stage Investigation Techniques Interviews Observation Workshops Scenarios Prototyping Quantitative approaches Documenting the current situation Stakeholder Analysis and Management Stakeholder categories and identification Analysing stakeholders Stakeholder management strategies Managing stakeholders Understanding stakeholder perspectives Business activity models Modelling Business Processes Organizational context An alternative view of an organization The organizational view of business processes Value propositions Process models Analysing the as-is process model Improving business processes (to-be business process) Defining the Solution Gap analysis Introduction to Business Architecture Definition of Business Architecture Business Architecture techniques Business and Financial Case The business case in the project lifecycle Identifying options Assessing project feasibility Structure of a business case Investment appraisal Establishing the Requirements A framework for requirements engineering Actors in requirements engineering Requirements elicitation Requirements analysis Requirements validation Documenting and Managing the Requirements The requirements document The requirements catalogue Managing requirements Modelling the Requirements Modelling system functions Modelling system data Delivering the Requirements Delivering the solution Context Lifecycles Delivering the Business Solution BA role in the business change lifecycle Design stage Implementation stage Realization stage
Additional course details: Nexus Humans Business Analysis - Requirements Engineering training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Business Analysis - Requirements Engineering course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
Duration 4 Days 24 CPD hours This course is intended for The workshop is designed for data scientists who currently use Python or R to work with smaller datasets on a single machine and who need to scale up their analyses and machine learning models to large datasets on distributed clusters. Data engineers and developers with some knowledge of data science and machine learning may also find this workshop useful. Overview Overview of data science and machine learning at scale Overview of the Hadoop ecosystem Working with HDFS data and Hive tables using Hue Introduction to Cloudera Data Science Workbench Overview of Apache Spark 2 Reading and writing data Inspecting data quality Cleansing and transforming data Summarizing and grouping data Combining, splitting, and reshaping data Exploring data Configuring, monitoring, and troubleshooting Spark applications Overview of machine learning in Spark MLlib Extracting, transforming, and selecting features Building and evaluating regression models Building and evaluating classification models Building and evaluating clustering models Cross-validating models and tuning hyperparameters Building machine learning pipelines Deploying machine learning models Spark, Spark SQL, and Spark MLlib PySpark and sparklyr Cloudera Data Science Workbench (CDSW) Hue This workshop covers data science and machine learning workflows at scale using Apache Spark 2 and other key components of the Hadoop ecosystem. The workshop emphasizes the use of data science and machine learning methods to address real-world business challenges. Using scenarios and datasets from a fictional technology company, students discover insights to support critical business decisions and develop data products to transform the business. The material is presented through a sequence of brief lectures, interactive demonstrations, extensive hands-on exercises, and discussions. The Apache Spark demonstrations and exercises are conducted in Python (with PySpark) and R (with sparklyr) using the Cloudera Data Science Workbench (CDSW) environment. The workshop is designed for data scientists who currently use Python or R to work with smaller datasets on a single machine and who need to scale up their analyses and machine learning models to large datasets on distributed clusters. Data engineers and developers with some knowledge of data science and machine learning may also find this workshop useful. Overview of data science and machine learning at scale Overview of the Hadoop ecosystem Working with HDFS data and Hive tables using Hue Introduction to Cloudera Data Science Workbench Overview of Apache Spark 2 Reading and writing data Inspecting data quality Cleansing and transforming data Summarizing and grouping data Combining, splitting, and reshaping data Exploring data Configuring, monitoring, and troubleshooting Spark applications Overview of machine learning in Spark MLlib Extracting, transforming, and selecting features Building and evaluating regression models Building and evaluating classification models Building and evaluating clustering models Cross-validating models and tuning hyperparameters Building machine learning pipelines Deploying machine learning models Additional course details: Nexus Humans Cloudera Data Scientist Training training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. 
This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Cloudera Data Scientist Training course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
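To preview the kind of PySpark work the exercises involve, here is a minimal sketch of a Spark MLlib pipeline covering feature extraction, training, and evaluation. The HDFS path, column names, and choice of logistic regression are illustrative placeholders rather than the workshop's actual scenario data.

```python
# A Spark MLlib pipeline run from PySpark - placeholder dataset and columns.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer, VectorAssembler
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("mllib-sketch").getOrCreate()

df = spark.read.csv("hdfs:///data/customers.csv", header=True, inferSchema=True)  # placeholder path
train, test = df.randomSplit([0.8, 0.2], seed=42)

pipeline = Pipeline(stages=[
    StringIndexer(inputCol="churned", outputCol="label"),                    # placeholder label column
    VectorAssembler(inputCols=["tenure", "monthly_spend"], outputCol="features"),
    LogisticRegression(featuresCol="features", labelCol="label"),
])
model = pipeline.fit(train)

predictions = model.transform(test)
auc = BinaryClassificationEvaluator(labelCol="label").evaluate(predictions)
print("Test AUC:", auc)

spark.stop()
```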
Duration 4 Days 24 CPD hours This course is intended for This class is intended for experienced developers who are responsible for managing big data transformations including: Extracting, loading, transforming, cleaning, and validating data. Designing pipelines and architectures for data processing. Creating and maintaining machine learning and statistical models. Querying datasets, visualizing query results and creating reports Overview Design and build data processing systems on Google Cloud Platform. Leverage unstructured data using Spark and ML APIs on Cloud Dataproc. Process batch and streaming data by implementing autoscaling data pipelines on Cloud Dataflow. Derive business insights from extremely large datasets using Google BigQuery. Train, evaluate, and predict with machine learning models using TensorFlow and Cloud ML. Enable instant insights from streaming data Get hands-on experience with designing and building data processing systems on Google Cloud. This course uses lectures, demos, and hands-on labs to show you how to design data processing systems, build end-to-end data pipelines, analyze data, and implement machine learning. This course covers structured, unstructured, and streaming data. Introduction to Data Engineering Explore the role of a data engineer. Analyze data engineering challenges. Intro to BigQuery. Data Lakes and Data Warehouses. Demo: Federated Queries with BigQuery. Transactional Databases vs Data Warehouses. Website Demo: Finding PII in your dataset with DLP API. Partner effectively with other data teams. Manage data access and governance. Build production-ready pipelines. Review GCP customer case study. Lab: Analyzing Data with BigQuery. Building a Data Lake Introduction to Data Lakes. Data Storage and ETL options on GCP. Building a Data Lake using Cloud Storage. Optional Demo: Optimizing cost with Google Cloud Storage classes and Cloud Functions. Securing Cloud Storage. Storing All Sorts of Data Types. Video Demo: Running federated queries on Parquet and ORC files in BigQuery. Cloud SQL as a relational Data Lake. Lab: Loading Taxi Data into Cloud SQL. Building a Data Warehouse The modern data warehouse. Intro to BigQuery. Demo: Query TB+ of data in seconds. Getting Started. Loading Data. Video Demo: Querying Cloud SQL from BigQuery. Lab: Loading Data into BigQuery. Exploring Schemas. Demo: Exploring BigQuery Public Datasets with SQL using INFORMATION_SCHEMA. Schema Design. Nested and Repeated Fields. Demo: Nested and repeated fields in BigQuery. Lab: Working with JSON and Array data in BigQuery. Optimizing with Partitioning and Clustering. Demo: Partitioned and Clustered Tables in BigQuery. Preview: Transforming Batch and Streaming Data. Introduction to Building Batch Data Pipelines EL, ELT, ETL. Quality considerations. How to carry out operations in BigQuery. Demo: ELT to improve data quality in BigQuery. Shortcomings. ETL to solve data quality issues. Executing Spark on Cloud Dataproc The Hadoop ecosystem. Running Hadoop on Cloud Dataproc. GCS instead of HDFS. Optimizing Dataproc. Lab: Running Apache Spark jobs on Cloud Dataproc. Serverless Data Processing with Cloud Dataflow Cloud Dataflow. Why customers value Dataflow. Dataflow Pipelines. Lab: A Simple Dataflow Pipeline (Python/Java). Lab: MapReduce in Dataflow (Python/Java). Lab: Side Inputs (Python/Java). Dataflow Templates. Dataflow SQL. Manage Data Pipelines with Cloud Data Fusion and Cloud Composer Building Batch Data Pipelines visually with Cloud Data Fusion. Components. UI Overview. 
Building a Pipeline. Exploring Data using Wrangler. Lab: Building and executing a pipeline graph in Cloud Data Fusion. Orchestrating work between GCP services with Cloud Composer. Apache Airflow Environment. DAGs and Operators. Workflow Scheduling. Optional Long Demo: Event-triggered Loading of data with Cloud Composer, Cloud Functions, Cloud Storage, and BigQuery. Monitoring and Logging. Lab: An Introduction to Cloud Composer. Introduction to Processing Streaming Data Processing Streaming Data. Serverless Messaging with Cloud Pub/Sub Cloud Pub/Sub. Lab: Publish Streaming Data into Pub/Sub. Cloud Dataflow Streaming Features Cloud Dataflow Streaming Features. Lab: Streaming Data Pipelines. High-Throughput BigQuery and Bigtable Streaming Features BigQuery Streaming Features. Lab: Streaming Analytics and Dashboards. Cloud Bigtable. Lab: Streaming Data Pipelines into Bigtable. Advanced BigQuery Functionality and Performance Analytic Window Functions. Using WITH Clauses. GIS Functions. Demo: Mapping Fastest Growing Zip Codes with BigQuery GeoViz. Performance Considerations. Lab: Optimizing your BigQuery Queries for Performance. Optional Lab: Creating Date-Partitioned Tables in BigQuery. Introduction to Analytics and AI What is AI? From Ad-hoc Data Analysis to Data-Driven Decisions. Options for ML models on GCP. Prebuilt ML model APIs for Unstructured Data Unstructured Data is Hard. ML APIs for Enriching Data. Lab: Using the Natural Language API to Classify Unstructured Text. Big Data Analytics with Cloud AI Platform Notebooks What's a Notebook. BigQuery Magic and Ties to Pandas. Lab: BigQuery in Jupyter Labs on AI Platform. Production ML Pipelines with Kubeflow Ways to do ML on GCP. Kubeflow. AI Hub. Lab: Running AI models on Kubeflow. Custom Model building with SQL in BigQuery ML BigQuery ML for Quick Model Building. Demo: Train a model with BigQuery ML to predict NYC taxi fares. Supported Models. Lab Option 1: Predict Bike Trip Duration with a Regression Model in BQML. Lab Option 2: Movie Recommendations in BigQuery ML. Custom Model building with Cloud AutoML Why AutoML? AutoML Vision. AutoML NLP. AutoML Tables.
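Here is a minimal sketch of the two BigQuery workflows the outline highlights: running standard SQL from Python with the google-cloud-bigquery client, and training a model with BigQuery ML. The project ID, dataset name, and the public taxi table and column names are assumptions for illustration only.

```python
# Query BigQuery and train a BigQuery ML model from Python - placeholder identifiers.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # placeholder project ID

# Standard SQL against a public dataset (table/column names assumed for illustration).
rows = client.query("""
    SELECT payment_type, COUNT(*) AS trips
    FROM `bigquery-public-data.new_york_taxi_trips.tlc_yellow_trips_2018`
    GROUP BY payment_type
    ORDER BY trips DESC
""").result()
for row in rows:
    print(row.payment_type, row.trips)

# BigQuery ML: train a linear regression model entirely in SQL.
client.query("""
    CREATE OR REPLACE MODEL `my_dataset.taxi_fare_model`
    OPTIONS (model_type = 'linear_reg', input_label_cols = ['fare_amount']) AS
    SELECT trip_distance, passenger_count, fare_amount
    FROM `bigquery-public-data.new_york_taxi_trips.tlc_yellow_trips_2018`
    WHERE fare_amount BETWEEN 3 AND 100
""").result()
```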
Duration 2 Days 12 CPD hours This course is intended for Data Modelers Participants will learn the full scope of the metadata modeling process, from initial project creation, to publishing a dynamic cube, and enabling end users to easily author reports and analyze data. Introduction to IBM Cognos Dynamic Cubes Define and differentiate Dynamic Cubes Dynamic Cubes characteristics Examine Dynamic Cube requirements Examine Dynamic Cube components Examine high level architecture IBM Cognos Dynamic Query Review Dimensional Data Structures Dynamic Cubes caching Create & Design a Dynamic Cube Explore the IBM Cognos Cube Designer Review the cube development process Examine the Automatic Cube Generation Manual development overview Create dimensions Model the cube Best practice for effective modeling Deploy & Configure a Dynamic Cube Deploy a cube Explore the Estimate Hardware Requirements Identify cube management tasks Examine Query Service administration Explore Dynamic Cube properties Schedule cube actions Use the DCAdmin command line tool Advanced Dynamic Cube Modelling Examine advanced modeling concepts Explore modeling caveats Calculated measures and members Model Relative Time Explore the Current Period property Define period aggregation rules for measures Advanced Features of Cube Designer Examine multilingual support Examine ragged hierarchies and padding members Define Parent-Child Dimensions Refresh Metadata Import Framework Manager packages Filter measures and dimensions Optimize Performance with Aggregates Identify aggregates and aggregate tables In-memory aggregates Use Aggregate Advisor to identify aggregates User defined in-memory aggregates Optimize In-Memory Aggregates automatically Aggregate Advisor recommendations Monitor Dynamic Cube performance Model aggregates (automatically vs manually) Use Slicers to define aggregation partitions Define Security Overview of Dynamic Cube security Identify security filters The Security process - Three steps Examine security scope Identify scope rules Identify roles Capabilities and access permissions Cube security deep dive Model a Virtual Cube Explore virtual cubes Create the virtual cube Explore virtual cube objects Examine virtual measures and calculated members Currency conversion using virtual cubes Security on virtual cubes Introduction to IBM Cognos Analytics Define IBM Cognos Analytics Redefined Business Intelligence Self-service Navigate to content in IBM Cognos Analytics Interact with the user interface Model data with IBM Cognos Analytics IBM Cognos Analytics components Create reports Perform self-service with analysis and Dashboards IBM Cognos Analytics architecture (high level) IBM Cognos Analytics security Package / data source relationship Create Data modules Upload files Additional course details: Nexus Humans B6063 IBM Cognos Cube Designer - Design Dynamic Cubes (v11.0) training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. 
While we feel this is the best course for the B6063 IBM Cognos Cube Designer - Design Dynamic Cubes (v11.0) course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
Duration 3 Days 18 CPD hours This course is intended for This course is intended for intermediate to advanced Business Analysts who are looking to improve their skills for eliciting, analyzing, documenting, validating, and communicating requirements. Overview Obtain a thorough understanding of the core responsibilities of the business analyst Understand the main professional associations and standards supporting business analysts in the industry Discuss and explore the components of each of the domains/knowledge areas that comprise the work of business analysis Recognize the importance of properly defining the business need prior to engaging in requirements activities Formulate a strong understanding of the concepts that comprise strategy analysis Obtain experience with identifying and analyzing stakeholders Distinguish between project and product scope and successfully use models to communicate scope Thoroughly understand and identify the various requirements categories and be able to recognize requirements of various types Explore business rules analysis Understand the benefits of process modeling and the common modeling language of BPMN Discuss process models and how the techniques can capture details about the as-is/to-be environment Learn how to properly prepare and conduct interviews Explore the components of use cases Learn what it means to package requirements Obtain hands-on experience with a number of business analysis techniques, and practice eliciting, defining, and writing requirements. This course provides students a clear understanding of all the facets of the business analysis role, including a thorough walkthrough of the various domain/knowledge areas that comprise the business analysis profession. Students are provided an opportunity to try their hand at several business analysis techniques to assist with improving their skills in stakeholder identification, scope definition, and analyzing, documenting, and modeling requirements. Introduction to Business Analysis What is business analysis Benefits and challenges of business analysis Project success factors A Closer Look at the Business Analyst Role Definition of a business analyst Responsibilities of a business analyst Importance of communication/collaboration BA role vs. PM role Project roles involved in requirements IIBA/PMI and the goals of a professional association Purpose for having a BA standard IIBA's BABOK© Guide and PMI's Practice Guide in Business Analysis Business analysis beyond project work Business analysis core concepts Business analysis perspectives IIBA and PMI certifications for business analysts Workshop: Choose Your Project Supporting the Project Portfolio (Enterprise Analysis) Define Strategy Analysis When to perform Strategy Analysis Components of Strategy Analysis Defining the business need Envisioning the Product and Project Defining business requirements The importance of stakeholders Stakeholder identification Tips for analyzing stakeholders Techniques for managing stakeholder lists Discussion: Who is involved in strategy analysis? Workshops: Define the Business Need, Write Business Requirements, and Identify Stakeholders Understanding and Defining Solution Scope Defining solution scope Techniques for defining solution scope Applying the brainstorming technique Project scope vs. Product scope Finding solution boundaries The Context Diagram Actors and key information Workshop: Draw a Context Diagram Understanding Requirements What is a requirement? 
Requirement types Business, Stakeholder, Solution, and Transition requirements Assumptions and constraints Business rules Taxonomy of business rules Decision tables How to write simple calculations Requirements vs. business rules Document requirements Workshop - Document Requirements Elicitation and Process Modeling Why do we model processes? What is Business Process Management? Using a modeling notation 'As Is' vs. 'To Be' modeling Why use BPMN? Basic BPM notation Business Process Modeling - A case study Business Process Realignment 'As Is' vs. 'To Be' activity diagrams Workshop: Create a Business Process Model Planning & Eliciting Requirements Interviewing - what and why? Preparing for an effective interview Types of questions to ask Sequencing questions Active listening techniques Planning for elicitation Conducting the interview Establishing rapport Active Listening Feedback techniques Types of elicitation techniques Workshops: Planning for Elicitation and Conducting an Elicitation Session Use Case & User Story Analysis What is an Actor? Types of Actors How to 'find' Use Cases? Diagramming Use Cases Tips on naming Use Cases Explaining scenarios The use case template Components of a use case Scenario examples Best practices for writing Use Cases Scenarios and flows Alternate and exception flows Exercises: Drawing a Use Case Diagram, Writing the Main Success Scenario, and Writing Alternate and Exception Scenarios Analyzing & Documenting Requirements Requirements and Use Cases Non-Functional requirements User Interface Requirements UI Data Table Reporting requirements Data requirements Data accessibility requirements Characteristics of good requirements The business requirements document (BRD) BRD vs. Functional Requirements Specification Preparing the requirements package Requirements traceability Workshops: Develop a User Interface, Analyzing Requirements, and Tracing requirements Additional Resources Useful books and links on writing effective requirements
Duration 3 Days 18 CPD hours Overview The goal of this course is to enable technical students new to Cassandra to begin working with Cassandra in an optimal manner. Throughout the course students will learn to: Understand the Big Data needs that C* addresses Be familiar with the operation and structure of C* Be able to install and set up a C* database Use the C* tools, including cqlsh, nodetool, and ccm (Cassandra Cluster Manager) Be familiar with the C* architecture, and how a C* cluster is structured Understand how data is distributed and replicated in a C* cluster Understand core C* data modeling concepts, and use them to create well-structured data models Be familiar with the C* eventual consistency model and use it intelligently Be familiar with consistency mechanisms such as read repair and hinted handoff Understand and use CQL to create tables and query for data Know and use the CQL data types (numerical, textual, uuid, etc.) Be familiar with the various kinds of primary keys available (simple, compound, and composite primary keys) Be familiar with the C* write and read paths Understand C* deletion and compaction The Cassandra (C*) database is a massively scalable NoSQL database that provides high availability and fault tolerance, as well as linear scalability when adding new nodes to a cluster. It has many powerful capabilities, such as tunable and eventual consistency, that allow it to meet the needs of modern applications, but also introduce a new paradigm for data modeling that many organizations do not have the expertise to use in the best way. Introduction to Cassandra is a hands-on course designed to teach attendees the basics of how to create good data models with Cassandra. This technical course has a focus on the practical aspects of working with C*, and introduces essential concepts needed to understand Cassandra, including enough coverage of internal architecture to make good decisions. It is hands-on, with labs that provide experience in core functionality. Students will also explore CQL (Cassandra Query Language), as well as some of the 'anti-patterns' that lead to non-optimal C* data models, and be ready to work on production systems involving Cassandra. 
Session 1: Cassandra Overview Why We Need Cassandra - Big Data Challenges vs RDBMS High level Cassandra Overview Cassandra Features Optional: Basic Cassandra Installation and Configuration Session 2: Cassandra Architecture and CQL Overview Cassandra Architecture Overview Cassandra Clusters and Rings Nodes and Virtual Nodes Data Replication in Cassandra Introduction to CQL Defining Tables with a Single Primary Key Using cqlsh for Interactive Querying Selecting and Inserting/Upserting Data with CQL Data Replication and Distribution Basic Data Types (including uuid, timeuuid) Session 3: Data Modeling and CQL Core Concepts Defining a Compound Primary Key CQL for Compound Primary Keys Partition Keys and Data Distribution Clustering Columns Overview of Internal Data Organization Overview of Other Querying Capabilities ORDER BY, CLUSTERING ORDER BY, UPDATE , DELETE, ALLOW FILTERING Batch Queries Data Modeling Guidelines Denormalization Data Modeling Workflow Data Modeling Principles Primary Key Considerations Composite Partition Keys Defining with CQL Data Distribution with Composite Partition Key Overview of Internal Data Organization Session 4: Additional CQL Capabilities Indexing Primary/Partition Keys and Pagination with token() Secondary Indexes and Usage Guidelines Cassandra collections Collection Structure and Uses Defining and Querying Collections (set, list, and map) Materialized View Overview Usage Guidelines Session 5: Data Consistency In Cassandra Overview of Consistency in Cassandra CAP Theorem Eventual (Tunable) Consistency in C* - ONE, QUORUM, ALL Choosing CL ONE Choosing CL QUORUM Achieving Immediate Consistency Overview of Other Consistency Levels Supportive Consistency Mechanisms Writing / Hinted Handoff Read Repair Nodetool repair Session 6: Internal Mechanisms Ring Details Partitioners Gossip Protocol Snitches Write Path Overview / Commit Log Memtables and SSTables Write Failure Unavailable Nodes and Node Failure Requirements for Write Operations Read Path Overview Read Mechanism Replication and Caching Deletion/Compaction Overview Delete Mechanism Tombstones and Compaction Session 7: Working with IntelliJ Configuring JDBC Data Source for Cassandra Reading Schema Information Querying and Editing Tables. Additional course details: Nexus Humans Introduction to Cassandra (TTDS6776) training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Introduction to Cassandra (TTDS6776) course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
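To preview the hands-on CQL work, here is a minimal sketch that exercises several concepts from the sessions above (keyspaces, a compound primary key with a clustering column, upserts, and tunable consistency) from Python using the DataStax cassandra-driver. The contact point, keyspace, table, and column names are illustrative placeholders, not the course's lab environment.

```python
# Core CQL operations driven from Python via the cassandra-driver - placeholder names.
from cassandra.cluster import Cluster
from cassandra import ConsistencyLevel
from cassandra.query import SimpleStatement

cluster = Cluster(["127.0.0.1"])          # placeholder contact point
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("demo")

# Compound primary key: sensor_id is the partition key, reading_time a clustering column.
session.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        sensor_id text,
        reading_time timestamp,
        value double,
        PRIMARY KEY (sensor_id, reading_time)
    ) WITH CLUSTERING ORDER BY (reading_time DESC)
""")

# Inserts in Cassandra are upserts; the consistency level is tunable per statement.
insert = SimpleStatement(
    "INSERT INTO readings (sensor_id, reading_time, value) VALUES (%s, toTimestamp(now()), %s)",
    consistency_level=ConsistencyLevel.QUORUM,
)
session.execute(insert, ("sensor-42", 21.5))

for row in session.execute("SELECT * FROM readings WHERE sensor_id = %s LIMIT 10", ("sensor-42",)):
    print(row.sensor_id, row.reading_time, row.value)

cluster.shutdown()
```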