Duration 4 Days 24 CPD hours This course is intended for This course is best suited to developers, engineers, and architects who want to use Hadoop and related tools to solve real-world problems. Overview Skills learned in this course include: Creating a data set with the Kite SDK; Developing custom Flume components for data ingestion; Managing a multi-stage workflow with Oozie; Analyzing data with Crunch; Writing user-defined functions for Hive and Impala; Indexing data with Cloudera Search. Cloudera University's four-day course for designing and building Big Data applications prepares you to analyze and solve real-world problems using Apache Hadoop and associated tools in the enterprise data hub (EDH). Introduction Application Architecture Scenario Explanation Understanding the Development Environment Identifying and Collecting Input Data Selecting Tools for Data Processing and Analysis Presenting Results to the User Defining & Using Datasets Metadata Management What is Apache Avro? Avro Schemas Avro Schema Evolution Selecting a File Format Performance Considerations Using the Kite SDK Data Module What is the Kite SDK? Fundamental Data Module Concepts Creating New Data Sets Using the Kite SDK Loading, Accessing, and Deleting a Data Set Importing Relational Data with Apache Sqoop What is Apache Sqoop? Basic Imports Limiting Results Improving Sqoop's Performance Sqoop 2 Capturing Data with Apache Flume What is Apache Flume? Basic Flume Architecture Flume Sources Flume Sinks Flume Configuration Logging Application Events to Hadoop Developing Custom Flume Components Flume Data Flow and Common Extension Points Custom Flume Sources Developing a Flume Pollable Source Developing a Flume Event-Driven Source Custom Flume Interceptors Developing a Header-Modifying Flume Interceptor Developing a Filtering Flume Interceptor Writing Avro Objects with a Custom Flume Interceptor Managing Workflows with Apache Oozie The Need for Workflow Management What is Apache Oozie? Defining an Oozie Workflow Validation, Packaging, and Deployment Running and Tracking Workflows Using the CLI Hue UI for Oozie Processing Data Pipelines with Apache Crunch What is Apache Crunch? Understanding the Crunch Pipeline Comparing Crunch to Java MapReduce Working with Crunch Projects Reading and Writing Data in Crunch Data Collection API Functions Utility Classes in the Crunch API Working with Tables in Apache Hive What is Apache Hive? Accessing Hive Basic Query Syntax Creating and Populating Hive Tables How Hive Reads Data Using the RegexSerDe in Hive Developing User-Defined Functions What are User-Defined Functions? Implementing a User-Defined Function Deploying Custom Libraries in Hive Registering a User-Defined Function in Hive Executing Interactive Queries with Impala What is Impala? Comparing Hive to Impala Running Queries in Impala Support for User-Defined Functions Data and Metadata Management Understanding Cloudera Search What is Cloudera Search? Search Architecture Supported Document Formats Indexing Data with Cloudera Search Collection and Schema Management Morphlines Indexing Data in Batch Mode Indexing Data in Near Real Time Presenting Results to Users Solr Query Syntax Building a Search UI with Hue Accessing Impala through JDBC Powering a Custom Web Application with Impala and Search
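The outline above leans heavily on Avro schemas and file-format selection. As a minimal, hedged illustration (not course material), the sketch below defines a small Avro schema and writes a couple of records to an Avro container file using the Python fastavro package; the record name, fields, and file path are made-up assumptions.

# Minimal sketch: defining an Avro schema and writing/reading records with fastavro.
# The schema and records below are hypothetical examples, not taken from the course.
from fastavro import parse_schema, writer, reader

schema = {
    "type": "record",
    "name": "WebEvent",            # hypothetical record name
    "namespace": "example.events",
    "fields": [
        {"name": "user_id", "type": "string"},
        {"name": "url", "type": "string"},
        {"name": "timestamp", "type": "long"},
    ],
}
parsed = parse_schema(schema)

records = [
    {"user_id": "u-001", "url": "/home", "timestamp": 1700000000},
    {"user_id": "u-002", "url": "/search", "timestamp": 1700000042},
]

# Write the records to an Avro container file, then read them back.
with open("web_events.avro", "wb") as out:
    writer(out, parsed, records)

with open("web_events.avro", "rb") as fo:
    for rec in reader(fo):
        print(rec)

Because the schema travels with the file, a downstream reader can evolve its own schema independently, which is the schema-evolution idea the course outline refers to.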
Duration 1 Day 6 CPD hours This course is intended for This class is intended for the following: Data analysts, Data scientists, Business analysts getting started with Google Cloud Platform. Individuals responsible for designing pipelines and architectures for data processing, creating and maintaining machine learning and statistical models, querying datasets, visualizing query results and creating reports. Executives and IT decision makers evaluating Google Cloud Platform for use by data scientists. Overview This course teaches students the following skills: Identify the purpose and value of the key Big Data and Machine Learning products in the Google Cloud Platform. Use Cloud SQL and Cloud Dataproc to migrate existing MySQL and Hadoop/Pig/Spark/Hive workloads to Google Cloud Platform. Employ BigQuery and Cloud Datalab to carry out interactive data analysis. Train and use a neural network using TensorFlow. Employ ML APIs. Choose between different data processing products on the Google Cloud Platform. This course introduces participants to the Big Data and Machine Learning capabilities of Google Cloud Platform (GCP). It provides a quick overview of the Google Cloud Platform and a deeper dive into the data processing capabilities. Introducing Google Cloud Platform Google Platform Fundamentals Overview. Google Cloud Platform Big Data Products. Compute and Storage Fundamentals CPUs on demand (Compute Engine). A global filesystem (Cloud Storage). CloudShell. Lab: Set up an Ingest-Transform-Publish data processing pipeline. Data Analytics on the Cloud Stepping-stones to the cloud. Cloud SQL: your SQL database on the cloud. Lab: Importing data into Cloud SQL and running queries. Spark on Dataproc. Lab: Machine Learning Recommendations with Spark on Dataproc. Scaling Data Analysis Fast random access. Datalab. BigQuery. Lab: Build a machine learning dataset. Machine Learning Machine Learning with TensorFlow. Lab: Carry out ML with TensorFlow. Pre-built models for common needs. Lab: Employ ML APIs. Data Processing Architectures Message-oriented architectures with Pub/Sub. Creating pipelines with Dataflow. Reference architecture for real-time and batch data processing. Summary Why GCP? Where to go from here Additional Resources Additional course details: Nexus Humans Google Cloud Platform Big Data and Machine Learning Fundamentals training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Google Cloud Platform Big Data and Machine Learning Fundamentals course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
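To give a flavour of the interactive BigQuery analysis this course covers, here is a minimal sketch using the google-cloud-bigquery Python client. It assumes Google Cloud credentials are already configured in your environment and queries a well-known public dataset; adapt the project and the SQL to your own data.

# Minimal sketch: running an interactive BigQuery query from Python.
# Assumes Google Cloud credentials are configured (e.g. via gcloud auth) and that
# the public dataset referenced below is reachable from your project.
from google.cloud import bigquery

client = bigquery.Client()  # picks up the default project from the environment

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

for row in client.query(query).result():
    print(row.name, row.total)

Running the script prints the five most common names in that public table; swapping in your own SQL is the only change needed to query other tables.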
Duration 1 Day 6 CPD hours This course is intended for This course does not have any technical knowledge prerequisites for the learners, besides being proficient in using a computer and the Internet. IT and/or AI knowledge is a benefit but not a hard requirement. Given the rapid development of AI and the broad range of its applications in everyday life, it is crucial for anyone to attend this course to update their digital skills in an ever-changing world. It is expected that all learners have registered for a free account of OpenAI ChatGPT at https://chat.openai.com. Overview Discover how AI relates to other 4th industrial revolution technologies Learn about AI, ML, and associated cognitive services Overview of AI development frameworks, tools and services Evaluate the OpenAI ChatGPT4 / ChatGPT3.5 model features in more detail The core aim of this 'AI for beginners' course is to introduce its audience to Artificial Intelligence (AI) and Machine Learning (ML) technologies and allow them to understand the practical applications of AI in their everyday personal and professional life. Moreover, the course aims to provide a handful of demos and hands-on exercises to allow the learners to familiarize themselves with usage scenarios of OpenAI ChatGPT and other Generative AI (GenAI) models. The content of this course has been created primarily by using the OpenAI ChatGPT model. AI theoretical concepts. Introduction to AI, ML, and associated cognitive services (Computer vision, Natural language processing, Speech analysis, Decision making). How AI relates to other 4th industrial revolution technologies (cloud computing, edge computing, internet of things, blockchain, metaverse, robotics, quantum computing). AI model classification by utilizing mind maps and the distinctive role of GenAI models. Introduction to the OpenAI ChatGPT model and alternative generative AI models. Familiarization with the basics of the ChatGPT interface (https://chat.openai.com). Talking about Responsible AI: Security, privacy, compliance, copyright, legal challenges, and ethical implications. AI practical applications Overview of AI development frameworks, tools and services. AI aggregators review. Hand-picked AI tool demos: a. Workplace productivity and the case of Microsoft 365 Copilot. b. The content creation industry: create text, code, images, audio and video with GenAI. c. Redefining the education sector with AI-powered learning. Evaluate the OpenAI ChatGPT4 / ChatGPT3.5 model features in more detail: a. Prompting and plugin demos. b. Code interpreter demos. Closing words. Discussion with an AI model on the future of AI. Additional course details: Nexus Humans AI for beginners training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the AI for beginners course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you.
Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
Duration 5 Days 30 CPD hours This course is intended for This intermediate and beyond level course is geared for experienced technical professionals in various roles, such as developers, data analysts, data engineers, software engineers, and machine learning engineers who want to leverage Scala and Spark to tackle complex data challenges and develop scalable, high-performance applications across diverse domains. Practical programming experience is required to participate in the hands-on labs. Overview Working in a hands-on learning environment led by our expert instructor you'll: Develop a basic understanding of Scala and Apache Spark fundamentals, enabling you to confidently create scalable and high-performance applications. Learn how to process large datasets efficiently, helping you handle complex data challenges and make data-driven decisions. Gain hands-on experience with real-time data streaming, allowing you to manage and analyze data as it flows into your applications. Acquire practical knowledge of machine learning algorithms using Spark MLlib, empowering you to create intelligent applications and uncover hidden insights. Master graph processing with GraphX, enabling you to analyze and visualize complex relationships in your data. Discover generative AI technologies using GPT with Spark and Scala, opening up new possibilities for automating content generation and enhancing data analysis. Embark on a journey to master the world of big data with our immersive course on Scala and Spark! Mastering Scala with Apache Spark for the Modern Data Enterprise is a five-day hands-on course designed to provide you with the essential skills and tools to tackle complex data projects using the Scala programming language and Apache Spark, a high-performance data processing engine. Mastering these technologies will enable you to perform a wide range of tasks, from data wrangling and analytics to machine learning and artificial intelligence, across various industries and applications. Guided by our expert instructor, you'll explore the fundamentals of Scala programming and Apache Spark while gaining valuable hands-on experience with Spark programming, RDDs, DataFrames, Spark SQL, and data sources. You'll also explore Spark Streaming, performance optimization techniques, and the integration of popular external libraries, tools, and cloud platforms like AWS, Azure, and GCP. Machine learning enthusiasts will delve into Spark MLlib, covering the basics of machine learning algorithms, data preparation, feature extraction, and various techniques such as regression, classification, clustering, and recommendation systems. Introduction to Scala Brief history and motivation Differences between Scala and Java Basic Scala syntax and constructs Scala's functional programming features Introduction to Apache Spark Overview and history Spark components and architecture Spark ecosystem Comparing Spark with other big data frameworks Basics of Spark Programming SparkContext and SparkSession Resilient Distributed Datasets (RDDs) Transformations and Actions Working with DataFrames Spark SQL and Data Sources Spark SQL library and its advantages Structured and semi-structured data sources Reading and writing data in various formats (CSV, JSON, Parquet, Avro, etc.) 
Data manipulation using SQL queries Basic RDD Operations Creating and manipulating RDDs Common transformations and actions on RDDs Working with key-value data Basic DataFrame and Dataset Operations Creating and manipulating DataFrames and Datasets Column operations and functions Filtering, sorting, and aggregating data Introduction to Spark Streaming Overview of Spark Streaming Discretized Stream (DStream) operations Windowed operations and stateful processing Performance Optimization Basics Best practices for efficient Spark code Broadcast variables and accumulators Monitoring Spark applications Integrating External Libraries and Tools, Spark Streaming Using popular external libraries, such as Hadoop and HBase Integrating with cloud platforms: AWS, Azure, GCP Connecting to data storage systems: HDFS, S3, Cassandra, etc. Introduction to Machine Learning Basics Overview of machine learning Supervised and unsupervised learning Common algorithms and use cases Introduction to Spark MLlib Overview of Spark MLlib MLlib's algorithms and utilities Data preparation and feature extraction Linear Regression and Classification Linear regression algorithm Logistic regression for classification Model evaluation and performance metrics Clustering Algorithms Overview of clustering algorithms K-means clustering Model evaluation and performance metrics Collaborative Filtering and Recommendation Systems Overview of recommendation systems Collaborative filtering techniques Implementing recommendations with Spark MLlib Introduction to Graph Processing Overview of graph processing Use cases and applications of graph processing Graph representations and operations Introduction to Spark GraphX Overview of GraphX Creating and transforming graphs Graph algorithms in GraphX Big Data Innovation! Using GPT and Generative AI Technologies with Spark and Scala Overview of generative AI technologies Integrating GPT with Spark and Scala Practical applications and use cases Bonus Topics / Time Permitting Introduction to Spark NLP Overview of Spark NLP Preprocessing text data Text classification and sentiment analysis Putting It All Together Work on a capstone project that integrates multiple aspects of the course, including data processing, machine learning, graph processing, and generative AI technologies.
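As a rough taste of the DataFrame and Spark SQL basics listed in the outline above, here is a minimal sketch. The course itself works in Scala; the equivalent API is shown here in PySpark for consistency with the other examples, and the sample data and column names are illustrative assumptions rather than course material.

# Minimal sketch: creating a DataFrame, filtering/aggregating it, and querying it
# with Spark SQL. Sample data and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("catalogue-sketch").getOrCreate()

sales = spark.createDataFrame(
    [("books", 12.50), ("books", 7.99), ("games", 59.99)],
    ["category", "amount"],
)

# DataFrame API: aggregate by category, then filter on the aggregate.
totals = sales.groupBy("category").agg(F.sum("amount").alias("total"))
totals.filter(F.col("total") > 10).show()

# Spark SQL: register the DataFrame as a temporary view and query it with SQL.
sales.createOrReplaceTempView("sales")
spark.sql("SELECT category, COUNT(*) AS n FROM sales GROUP BY category").show()

spark.stop()

The same pattern scales from this toy in-memory data to files read from HDFS, S3, or cloud storage by replacing createDataFrame with spark.read.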
Duration 3 Days 18 CPD hours This course is intended for This course is geared for attendees with Intermediate IT skills who wish to learn Computer Vision with TensorFlow 2 Overview This 'skills-centric' course is about 50% hands-on lab and 50% lecture, with extensive practical exercises designed to reinforce fundamental skills, concepts and best practices taught throughout the course. Working in a hands-on learning environment, led by our Computer Vision expert instructor, students will learn about and explore how to Build, train, and serve your own deep neural networks with TensorFlow 2 and Keras Apply modern solutions to a wide range of applications such as object detection and video analysis Run your models on mobile devices and web pages and improve their performance. Create your own neural networks from scratch Classify images with modern architectures including Inception and ResNet Detect and segment objects in images with YOLO, Mask R-CNN, and U-Net Tackle problems faced when developing self-driving cars and facial emotion recognition systems Boost your application's performance with transfer learning, GANs, and domain adaptation Use recurrent neural networks (RNNs) for video analysis Optimize and deploy your networks on mobile devices and in the browser Computer vision solutions are becoming increasingly common, making their way into fields such as health, automobile, social media, and robotics. Hands-On Computer Vision with TensorFlow 2 is a hands-on course that thoroughly explores TensorFlow 2, the brand-new version of Google's open source framework for machine learning. You will understand how to benefit from using convolutional neural networks (CNNs) for visual tasks. This course begins with the fundamentals of computer vision and deep learning, teaching you how to build a neural network from scratch. You will discover the features that have made TensorFlow the most widely used AI library, along with its intuitive Keras interface. You'll then move on to building, training, and deploying CNNs efficiently. Complete with concrete code examples, the course demonstrates how to classify images with modern solutions, such as Inception and ResNet, and extract specific content using You Only Look Once (YOLO), Mask R-CNN, and U-Net. You will also build generative adversarial networks (GANs) and variational autoencoders (VAEs) to create and edit images, and long short-term memory networks (LSTMs) to analyze videos. In the process, you will acquire advanced insights into transfer learning, data augmentation, domain adaptation, and mobile and web deployment, among other key concepts. Computer Vision and Neural Networks Computer Vision and Neural Networks Technical requirements Computer vision in the wild A brief history of computer vision Getting started with neural networks TensorFlow Basics and Training a Model TensorFlow Basics and Training a Model Technical requirements Getting started with TensorFlow 2 and Keras TensorFlow 2 and Keras in detail The TensorFlow ecosystem Modern Neural Networks Modern Neural Networks Technical requirements Discovering convolutional neural networks Refining the training process Influential Classification Tools Influential Classification Tools Technical requirements Understanding advanced CNN architectures Leveraging transfer learning Object Detection Models Object Detection Models Technical requirements Introducing object detection A fast object detection algorithm: YOLO Faster R-CNN, a powerful object detection model Enhancing and Segmenting Images Enhancing and Segmenting Images Technical requirements Transforming images with encoders-decoders Understanding semantic segmentation Training on Complex and Scarce Datasets Training on Complex and Scarce Datasets Technical requirements Efficient data serving How to deal with data scarcity Video and Recurrent Neural Networks Video and Recurrent Neural Networks Technical requirements Introducing RNNs Classifying videos Optimizing Models and Deploying on Mobile Devices Optimizing Models and Deploying on Mobile Devices Technical requirements Optimizing computational and disk footprints On-device machine learning Example app: recognizing facial expressions
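To make the build-train-evaluate thread above concrete, here is a minimal, hedged sketch of a small CNN image classifier built with TensorFlow 2 and Keras on the Fashion-MNIST dataset that ships with Keras; the architecture and hyperparameters are illustrative choices, not the course's own models.

# Minimal sketch: a small CNN image classifier with TensorFlow 2 / Keras.
# Architecture and hyperparameters are illustrative, not course material.
import tensorflow as tf
from tensorflow.keras import layers, models

# Fashion-MNIST: 28x28 grayscale images, 10 classes; bundled with Keras.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None] / 255.0   # add a channel dimension, scale to [0, 1]
x_test = x_test[..., None] / 255.0

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=2, validation_split=0.1)
print(model.evaluate(x_test, y_test))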
Duration 3 Days 18 CPD hours This course is intended for Data Science for Marketing Analytics is designed for developers and marketing analysts looking to use new, more sophisticated tools in their marketing analytics efforts. It'll help if you have prior experience of coding in Python and knowledge of high school level mathematics. Some experience with databases, Excel, statistics, or Tableau is useful but not necessary. Overview By the end of this course, you will be able to build your own marketing reporting and interactive dashboard solutions. The course starts by teaching you how to use Python libraries, such as pandas and Matplotlib, to read data into Python, manipulate it, and create plots, using both categorical and continuous variables. Then, you'll learn how to segment a population into groups and use different clustering techniques to evaluate customer segmentation. As you make your way through the course, you'll explore ways to evaluate and select the best segmentation approach, and go on to create a linear regression model on customer value data to predict lifetime value. In the concluding sections, you'll gain an understanding of regression techniques and tools for evaluating regression models, and explore ways to predict customer choice using classification algorithms. Finally, you'll apply these techniques to create a churn model for modeling customer product choices. Data Preparation and Cleaning Data Models and Structured Data pandas Data Manipulation Data Exploration and Visualization Identifying the Right Attributes Generating Targeted Insights Visualizing Data Unsupervised Learning: Customer Segmentation Customer Segmentation Methods Similarity and Data Standardization k-means Clustering Choosing the Best Segmentation Approach Choosing the Number of Clusters Different Methods of Clustering Evaluating Clustering Predicting Customer Revenue Using Linear Regression Understanding Regression Feature Engineering for Regression Performing and Interpreting Linear Regression Other Regression Techniques and Tools for Evaluation Evaluating the Accuracy of a Regression Model Using Regularization for Feature Selection Tree-Based Regression Models Supervised Learning: Predicting Customer Churn Classification Problems Understanding Logistic Regression Creating a Data Science Pipeline Fine-Tuning Classification Algorithms Support Vector Machine Decision Trees Random Forest Preprocessing Data for Machine Learning Models Model Evaluation Performance Metrics Modeling Customer Choice Understanding Multiclass Classification Class Imbalanced Data Additional course details: Nexus Humans Data Science for Marketing Analytics training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Data Science for Marketing Analytics course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. 
Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
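The segmentation portion of the course above revolves around standardizing features and clustering customers, for example with k-means. The sketch below illustrates that workflow with pandas and scikit-learn on made-up data; the column names and the choice of three clusters are assumptions for illustration only.

# Minimal sketch: standardize customer features and segment them with k-means.
# The data, column names, and k=3 are made-up illustrations, not course data.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

customers = pd.DataFrame({
    "annual_spend": [120, 950, 870, 60, 1500, 300],
    "visits_per_month": [1, 8, 6, 1, 12, 3],
})

# Put both features on a comparable scale before clustering.
scaled = StandardScaler().fit_transform(customers)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
customers["segment"] = kmeans.fit_predict(scaled)

# Average feature values per segment give a quick profile of each cluster.
print(customers.groupby("segment").mean())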
ChatGPT, along with other AI tools, aims not to replace the human touch in management, but to enhance it. By addressing repetitive, daily tasks, these tools free up managers to concentrate on core responsibilities like strategic decision-making, team development, and innovation. As we move further into the digital age, integrating tools such as ChatGPT isn't a luxury; it's the future of proactive leadership. In this guide, we'll delve into 10 practical ways through which AI can elevate your efficiency and refine the quality of your work.

Gain familiarity with prominent AI tools in the market
Efficiently compose and respond to emails
Generate concise summaries of complex reports and data
Obtain quick insights, data, and research across varied topics
Streamline the writing of articles, training notes, and posts
Craft interview tests, form relevant questions, and design checklists for the hiring process

1 Streamlining emails
An inbox can be a goldmine of information but also a significant time drain for managers. Here's how to optimise it:
Drafting responses: Give the AI a brief, and watch it craft a well-structured response.
Sorting and prioritising: By employing user-defined rules and keywords, ChatGPT can flag important emails, ensuring no vital communication slips through the cracks.

2 Efficient report writing
Reports, especially routine ones, can be time-intensive. Here's a smarter approach:
Automate content: Supply key data points to the AI, and let it weave them into an insightful report.
Proofreading: Lean on ChatGPT for grammar checks and consistency, ensuring each report remains crisp and error-free.

3 Rapid research
From competitor insights to market trends, research is a pivotal part of management.
Data synthesis: Feed raw data to the AI and receive succinct summaries in return.
Question-answering: Pose specific questions about a dataset to ChatGPT and extract swift insights without diving deep into the entire content.

4 Reinventing recruitment
Hiring can be a lengthy process. Here's how to make it more efficient:
Resume screening: Equip the AI to spot keywords and qualifications, ensuring that only the most fitting candidates are shortlisted.
Preliminary interviews: Leverage ChatGPT for the initial rounds of interviews by framing critical questions and evaluating the responses.

5 Enhancing training
Especially for extensive teams, training can be a monumental task. Here's how ChatGPT can assist:
Customised content: Inform the AI of your training goals, and it will draft tailored content suitable for various roles.
PowerPoint design: Create visually appealing slide presentations on any topic in minimal time.
Artificial Intelligence (AI) is the most disruptive technology since the internet came onto the scene. AI is transforming every aspect of how we manage projects, from developing a business case, to planning the work, managing risk, and tracking performance. Because the technology and market are moving so fast, it can be difficult to know how to start using AI on projects. Generative AI for Project Management will engage you with diverse Generative AI tools to start, plan, and manage either your own project or a generic case study. We will embrace a tool-agnostic approach to adopting, integrating, and scaling Generative AI without compromising data or trust. You will have hands-on practice utilizing AI tools to optimize your time and your outcomes. You will be accessing a variety of AI tools requiring you to register for a free account. A computer is required for all traditional classroom deliveries. Prerequisites: None. At the end of this program, you will be able to: Define essential terms and concepts related to artificial intelligence (AI) Illustrate how prompts facilitate interaction with Generative AI Recognize the capabilities of Large Language Models Craft prompts to develop project origination documents Create prompts to assist in planning a project Develop user stories with Generative AI Analyze project performance using Generative AI Identify the limitations of Generative AI Identify the risks associated with using Generative AI Articulate the need for governance and ethics when establishing an AI program in an organization Course Overview Getting Started Foundation Concepts Understanding essential terms and concepts related to AI Exploring various Generative AI Models Understanding Prompts Creating Prompts for Project Startup Prompts for starting a project Prompts for planning a project Best Practices for prompt engineering Creating Prompts for Managing Projects Creating agile user stories Measuring project performance Analyzing a schedule Using Generative AI Responsibly Limitations of AI Models Establishing an AI governance framework Future trends and next steps Summary and Next Steps
Duration 5 Days 30 CPD hours This course is intended for This course is designed for students who want to learn the R programming language, particularly students who want to leverage R for data analysis and data science tasks in their organization. The course is also designed for students with an interest in applying statistics to real-world problems. A typical student in this course should have several years of experience with computing technology, along with a proficiency in at least one other programming language. Overview In this course, you will use R to perform common data science tasks. You will: Set up an R development environment and execute simple code. Perform operations on atomic data types in R, including characters, numbers, and logicals. Perform operations on data structures in R, including vectors, lists, and data frames. Write conditional statements and loops. Structure code for reuse with functions and packages. Manage data by loading and saving datasets, manipulating data frames, and more. Analyze data through exploratory analysis, statistical analysis, and more. Create and format data visualizations using base R and ggplot2. Create simple statistical models from data. In our data-driven world, organizations need the right tools to extract valuable insights from their data. The R programming language is one of the tools at the forefront of data science. Its robust set of packages and statistical functions makes it a powerful choice for analyzing data, manipulating data, performing statistical tests on data, and creating predictive models from data. Likewise, R is notable for its strong data visualization tools, enabling you to create high-quality graphs and plots that are incredibly customizable. This course will teach you the fundamentals of programming in R to get you started. It will also teach you how to use R to perform common data science tasks and achieve data-driven results for the business. Lesson 1: Setting Up R and Executing Simple Code Topic A: Set Up the R Development Environment Topic B: Write R Statements Lesson 2: Processing Atomic Data Types Topic A: Process Characters Topic B: Process Numbers Topic C: Process Logicals Lesson 3: Processing Data Structures Topic A: Process Vectors Topic B: Process Factors Topic C: Process Data Frames Topic D: Subset Data Structures Lesson 4: Writing Conditional Statements and Loops Topic A: Write Conditional Statements Topic B: Write Loops Lesson 5: Structuring Code for Reuse Topic A: Define and Call Functions Topic B: Apply Loop Functions Topic C: Manage R Packages Lesson 6: Managing Data in R Topic A: Load Data Topic B: Save Data Topic C: Manipulate Data Frames Using Base R Topic D: Manipulate Data Frames Using dplyr Topic E: Handle Dates and Times Lesson 7: Analyzing Data in R Topic A: Examine Data Topic B: Explore the Underlying Distribution of Data Topic C: Identify Missing Values Lesson 8: Visualizing Data in R Topic A: Plot Data Using Base R Functions Topic B: Plot Data Using ggplot2 Topic C: Format Plots in ggplot2 Topic D: Create Combination Plots Lesson 9: Modeling Data in R Topic A: Create Statistical Models in R Topic B: Create Machine Learning Models in R
Duration 3 Days 18 CPD hours This course is intended for This course is geared for attendees with solid Python skills who wish to learn and use basic machine learning algorithms and concepts Overview This 'skills-centric' course is about 50% hands-on lab and 50% lecture, with extensive practical exercises designed to reinforce fundamental skills, concepts and best practices taught throughout the course. Topics Covered: This is a high-level list of topics covered in this course. Please see the detailed Agenda below Getting Started & Optional Python Quick Refresher Statistics and Probability Refresher and Python Practice Probability Density Function; Probability Mass Function; Naive Bayes Predictive Models Machine Learning with Python Recommender Systems KNN and PCA Reinforcement Learning Dealing with Real-World Data Experimental Design / ML in the Real World Time Permitting: Deep Learning and Neural Networks Machine Learning Essentials with Python is a foundation-level, three-day hands-on course that teaches students core skills and concepts in modern machine learning practices. This course is geared for attendees experienced with Python, but new to machine learning, who need introductory level coverage of these topics, rather than a deep dive into the math and statistics behind Machine Learning. Students will learn basic algorithms from scratch. For each machine learning concept, students will first learn about and discuss the foundations, its applicability and limitations, and then explore the implementation and use, reviewing and working with specific use cases. Working in a hands-on learning environment, led by our Machine Learning expert instructor, students will learn about and explore: Popular machine learning algorithms, their applicability and limitations; Practical application of these methods in a machine learning environment; Practical use cases and limitations of algorithms. Getting Started Installation: Getting Started and Overview LINUX jump start: Installing and Using Anaconda & Course Materials (or reference the default container) Python Refresher Introducing the Pandas, NumPy and Scikit-Learn Libraries Statistics and Probability Refresher and Python Practice Types of Data Mean, Median, Mode Using mean, median, and mode in Python Variation and Standard Deviation Probability Density Function; Probability Mass Function; Naive Bayes Common Data Distributions Percentiles and Moments A Crash Course in matplotlib Advanced Visualization with Seaborn Covariance and Correlation Conditional Probability Naive Bayes: Concepts Bayes' Theorem Naive Bayes Spam Classifier with Naive Bayes Predictive Models Linear Regression Polynomial Regression Multiple Regression, and Predicting Car Prices Logistic Regression Logistic Regression Machine Learning with Python Supervised vs. 
Unsupervised Learning, and Train/Test Using Train/Test to Prevent Overfitting Understanding a Confusion Matrix Measuring Classifiers (Precision, Recall, F1, AUC, ROC) K-Means Clustering K-Means: Clustering People Based on Age and Income Measuring Entropy LINUX: Installing GraphViz Decision Trees: Concepts Decision Trees: Predicting Hiring Decisions Ensemble Learning Support Vector Machines (SVM) Overview Using SVM to Cluster People using scikit-learn Recommender Systems User-Based Collaborative Filtering Item-Based Collaborative Filtering Finding Similar Movies Better Accuracy for Similar Movies Recommending Movies to People Improving Your Recommendations KNN and PCA K-Nearest-Neighbors: Concepts Using KNN to Predict a Rating for a Movie Dimensionality Reduction; Principal Component Analysis (PCA) PCA with the Iris Data Set Reinforcement Learning Reinforcement Learning with Q-Learning and Gym Dealing with Real-World Data Bias / Variance Tradeoff K-Fold Cross-Validation Data Cleaning and Normalization Cleaning Web Log Data Normalizing Numerical Data Detecting Outliers Feature Engineering and the Curse of Dimensionality Imputation Techniques for Missing Data Handling Unbalanced Data: Oversampling, Undersampling, and SMOTE Binning, Transforming, Encoding, Scaling, and Shuffling Experimental Design / ML in the Real World Deploying Models to Real-Time Systems A/B Testing Concepts T-Tests and P-Values Hands-on With T-Tests Determining How Long to Run an Experiment A/B Test Gotchas Capstone Project Group Project & Presentation or Review Deep Learning and Neural Networks Deep Learning Prerequisites The History of Artificial Neural Networks Deep Learning in the TensorFlow Playground Deep Learning Details Introducing TensorFlow Using TensorFlow Introducing Keras Using Keras to Predict Political Affiliations Convolutional Neural Networks (CNNs) Using CNNs for Handwriting Recognition Recurrent Neural Networks (RNNs) Using an RNN for Sentiment Analysis Transfer Learning Tuning Neural Networks: Learning Rate and Batch Size Hyperparameters Deep Learning Regularization with Dropout and Early Stopping The Ethics of Deep Learning Learning More about Deep Learning Additional course details: Nexus Humans Machine Learning Essentials with Python (TTML5506-P) training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Machine Learning Essentials with Python (TTML5506-P) course and one of our Top 10 we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
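As a small, hedged illustration of the train/test and classifier-evaluation topics listed above, the sketch below splits a bundled scikit-learn dataset, fits a logistic regression model, and reports precision, recall, and F1; the dataset and model settings are illustrative choices, not course materials.

# Minimal sketch: train/test split, a simple classifier, and the evaluation
# metrics named above (precision, recall, F1). The dataset choice is illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=5000)  # higher max_iter helps convergence here
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))

Holding out the test set and scoring only on it is the overfitting guard the course outline refers to; the same pattern applies to the Naive Bayes, decision tree, and SVM models it covers.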