
568 Machine Learning (ML) courses

Google Cloud Fundamentals - Core Infrastructure

By Nexus Human

Duration: 1 day (6 CPD hours)

Audience: Individuals planning to deploy applications and create application environments on Google Cloud; developers, systems operations professionals, and solution architects getting started with Google Cloud; and executives and business decision makers evaluating the potential of Google Cloud to address their business needs.

Overview: Identify the purpose and value of Google Cloud products and services. Interact with Google Cloud services. Describe ways in which customers have used Google Cloud. Choose among and use application deployment environments on Google Cloud: App Engine, Google Kubernetes Engine, and Compute Engine. Choose among and use Google Cloud storage options: Cloud Storage, Cloud SQL, Cloud Bigtable, and Firestore. Make basic use of BigQuery, Google's managed data warehouse for analytics.

This course uses lectures, demos, and hands-on labs to give you an overview of Google Cloud products and services so that you can learn the value of Google Cloud and how to incorporate cloud-based solutions into your business strategies.

Course outline:
• Introducing Google Cloud Platform: Explain the advantages of Google Cloud Platform. Define the components of Google's network infrastructure, including points of presence, data centers, regions, and zones. Understand the difference between Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS).
• Getting Started with Google Cloud Platform: Identify the purpose of projects on Google Cloud Platform. Understand the purpose of and use cases for Identity and Access Management. List the methods of interacting with Google Cloud Platform. Lab: Getting Started with Google Cloud Platform.
• Google Compute Engine and Networking: Identify the purpose of and use cases for Google Compute Engine. Understand the basics of networking in Google Cloud Platform. Lab: Deploying Applications Using Google Compute Engine.
• Google Cloud Platform Storage Options: Understand the purpose of and use cases for Google Cloud Storage, Google Cloud SQL, and Google Cloud Bigtable. Learn how to choose between the various storage options on Google Cloud Platform. Lab: Integrating Applications with Google Cloud Storage.
• Google Container Engine: Define the concept of a container and identify uses for containers. Identify the purpose of and use cases for Google Container Engine and Kubernetes. Introduction to hybrid and multi-cloud computing (Anthos). Lab: Deploying Applications Using Google Container Engine.
• Google App Engine and Google Cloud Datastore: Understand the purpose of and use cases for Google App Engine and Google Cloud Datastore. Contrast the App Engine Standard environment with the App Engine Flexible environment. Understand the purpose of and use cases for Google Cloud Endpoints. Lab: Deploying Applications Using App Engine and Cloud Datastore.
• Deployment and Monitoring: Understand the purpose of template-based creation and management of resources. Understand the purpose of integrated monitoring, alerting, and debugging. Lab: Getting Started with Stackdriver and Deployment Manager.
• Big Data and Machine Learning: Understand the purpose of and use cases for the products and services in the Google Cloud big data and machine learning platforms. Lab: Getting Started with BigQuery.
• Summary and Review: What's next?
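To give a flavour of the kind of task the final lab above (Getting Started with BigQuery) works up to, here is a minimal sketch of running a query against a BigQuery public dataset from Python. It is not part of the course materials; it assumes the google-cloud-bigquery client library is installed and that application default credentials are already configured.

```python
# Minimal sketch, not course lab code: query a BigQuery public dataset.
# Assumes `pip install google-cloud-bigquery` and configured application
# default credentials (e.g. `gcloud auth application-default login`).
from google.cloud import bigquery

client = bigquery.Client()  # project is taken from the credentials/environment

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

# client.query() submits the job; result() waits for completion and returns rows.
for row in client.query(query).result():
    print(f"{row.name}: {row.total}")
```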

Google Cloud Fundamentals - Core Infrastructure
Delivered Online | Flexible Dates
Price on Enquiry

Hands-On Computer Vision with TensorFlow 2 (TTML6900)

By Nexus Human

Duration: 4 days (24 CPD hours)

Audience: This course is geared for attendees with intermediate IT skills who wish to learn computer vision with TensorFlow 2.

Overview: This 'skills-centric' course is about 50% hands-on lab and 50% lecture, with extensive practical exercises designed to reinforce the fundamental skills, concepts, and best practices taught throughout the course. Working in a hands-on learning environment, led by our computer vision expert instructor, students will learn about and explore how to: build, train, and serve your own deep neural networks with TensorFlow 2 and Keras; apply modern solutions to a wide range of applications such as object detection and video analysis; run your models on mobile devices and web pages and improve their performance; create your own neural networks from scratch; classify images with modern architectures including Inception and ResNet; detect and segment objects in images with YOLO, Mask R-CNN, and U-Net; tackle problems faced when developing self-driving cars and facial emotion recognition systems; boost your application's performance with transfer learning, GANs, and domain adaptation; use recurrent neural networks (RNNs) for video analysis; and optimize and deploy your networks on mobile devices and in the browser.

Computer vision solutions are becoming increasingly common, making their way into fields such as health, automobile, social media, and robotics. Hands-On Computer Vision with TensorFlow 2 is a hands-on course that thoroughly explores TensorFlow 2, the brand-new version of Google's open source framework for machine learning. You will understand how to benefit from using convolutional neural networks (CNNs) for visual tasks. The course begins with the fundamentals of computer vision and deep learning, teaching you how to build a neural network from scratch. You will discover the features that have made TensorFlow the most widely used AI library, along with its intuitive Keras interface. You'll then move on to building, training, and deploying CNNs efficiently. Complete with concrete code examples, the course demonstrates how to classify images with modern solutions, such as Inception and ResNet, and extract specific content using You Only Look Once (YOLO), Mask R-CNN, and U-Net. You will also build generative adversarial networks (GANs) and variational autoencoders (VAEs) to create and edit images, and long short-term memory networks (LSTMs) to analyze videos. In the process, you will acquire advanced insights into transfer learning, data augmentation, domain adaptation, and mobile and web deployment, among other key concepts.

Course outline:
• Computer Vision and Neural Networks: Technical requirements. Computer vision in the wild. A brief history of computer vision. Getting started with neural networks.
• TensorFlow Basics and Training a Model: Technical requirements. Getting started with TensorFlow 2 and Keras. TensorFlow 2 and Keras in detail. The TensorFlow ecosystem.
• Modern Neural Networks: Technical requirements. Discovering convolutional neural networks. Refining the training process.
• Influential Classification Tools: Technical requirements. Understanding advanced CNN architectures. Leveraging transfer learning.
• Object Detection Models: Technical requirements. Introducing object detection. A fast object detection algorithm: YOLO. Faster R-CNN: a powerful object detection model.
• Enhancing and Segmenting Images: Technical requirements. Transforming images with encoders-decoders. Understanding semantic segmentation.
• Training on Complex and Scarce Datasets: Technical requirements. Efficient data serving. How to deal with data scarcity.
• Video and Recurrent Neural Networks: Technical requirements. Introducing RNNs. Classifying videos.
• Optimizing Models and Deploying on Mobile Devices: Technical requirements. Optimizing computational and disk footprints. On-device machine learning. Example app: recognizing facial expressions.
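As a rough illustration of the "build, train, and serve your own deep neural networks with TensorFlow 2 and Keras" objective above, the sketch below trains a small convolutional classifier on the built-in MNIST digits dataset. It is not taken from the course labs; the architecture and hyperparameters are arbitrary choices for demonstration.

```python
# Minimal sketch, not course lab code: a small Keras CNN classifier in TensorFlow 2,
# trained on the built-in MNIST digits dataset.
import tensorflow as tf
from tensorflow.keras import layers

# Load and normalize the data; add a channel dimension for the Conv2D layers.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# Two conv/pool blocks followed by a small dense classifier head.
model = tf.keras.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# One epoch is enough to see the training loop working end to end.
model.fit(x_train, y_train, epochs=1, batch_size=128,
          validation_data=(x_test, y_test))
```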

Hands-On Computer Vision with TensorFlow 2 (TTML6900)
Delivered Online | Flexible Dates
Price on Enquiry

Developing Applications with Google Cloud

By Nexus Human

Duration: 3 days (18 CPD hours)

Audience: Application developers who want to build cloud-native applications or redesign existing applications that will run on Google Cloud Platform.

Overview: This course teaches participants the following skills: use best practices for application development; choose the appropriate data storage option for application data; implement federated identity management; develop loosely coupled application components or microservices; integrate application components and data sources; debug, trace, and monitor applications; perform repeatable deployments with containers and deployment services; and choose the appropriate application runtime environment, using Google Container Engine as a runtime environment and later switching to a no-ops solution with Google App Engine flexible environment.

Learn how to design, develop, and deploy applications that seamlessly integrate components from the Google Cloud ecosystem. This course uses lectures, demos, and hands-on labs to show you how to use Google Cloud services and pre-trained machine learning APIs to build secure, scalable, and intelligent cloud-native applications.

Course outline:
• Best Practices for Application Development: Code and environment management. Design and development of secure, scalable, reliable, loosely coupled application components and microservices. Continuous integration and delivery. Re-architecting applications for the cloud.
• Google Cloud Client Libraries, Google Cloud SDK, and Google Firebase SDK: How to set up and use Google Cloud Client Libraries, Google Cloud SDK, and Google Firebase SDK. Lab: Set up Google Client Libraries, Cloud SDK, and Firebase SDK on a Linux instance and set up application credentials.
• Overview of Data Storage Options: Overview of options to store application data. Use cases for Google Cloud Storage, Cloud Firestore, Cloud Bigtable, Google Cloud SQL, and Cloud Spanner.
• Best Practices for Using Cloud Firestore: Best practices related to using Cloud Firestore in Datastore mode for queries, built-in and composite indexes, inserting and deleting data (batch operations), transactions, and error handling. Bulk-loading data into Cloud Firestore by using Google Cloud Dataflow. Lab: Store application data in Cloud Datastore.
• Performing Operations on Cloud Storage: Operations that can be performed on buckets and objects. Consistency model. Error handling.
• Best Practices for Using Cloud Storage: Naming buckets for static websites and other uses. Naming objects (from an access distribution perspective). Performance considerations. Setting up and debugging a CORS configuration on a bucket. Lab: Store files in Cloud Storage.
• Handling Authentication and Authorization: Cloud Identity and Access Management (IAM) roles and service accounts. User authentication by using Firebase Authentication. User authentication and authorization by using Cloud Identity-Aware Proxy. Lab: Authenticate users by using Firebase Authentication.
• Using Pub/Sub to Integrate Components of Your Application: Topics, publishers, and subscribers. Pull and push subscriptions. Use cases for Cloud Pub/Sub. Lab: Develop a backend service to process messages in a message queue.
• Adding Intelligence to Your Application: Overview of pre-trained machine learning APIs such as Cloud Vision API and Cloud Natural Language Processing API.
• Using Cloud Functions for Event-Driven Processing: Key concepts such as triggers, background functions, and HTTP functions. Use cases. Developing and deploying functions. Logging, error reporting, and monitoring.
• Managing APIs with Cloud Endpoints: Open API deployment configuration. Lab: Deploy an API for your application.
• Deploying Applications: Creating and storing container images. Repeatable deployments with deployment configuration and templates. Lab: Use Deployment Manager to deploy a web application into Google App Engine flexible environment test and production environments.
• Execution Environments for Your Application: Considerations for choosing an execution environment for your application or service: Google Compute Engine (GCE), Google Kubernetes Engine (GKE), App Engine flexible environment, Cloud Functions, Cloud Dataflow, Cloud Run. Lab: Deploying your application on App Engine flexible environment.
• Debugging, Monitoring, and Tuning Performance: Application Performance Management tools. Stackdriver Debugger. Stackdriver Error Reporting. Lab: Debugging an application error by using Stackdriver Debugger and Error Reporting. Stackdriver Logging. Key concepts related to Stackdriver Trace and Stackdriver Monitoring. Lab: Use Stackdriver Monitoring and Stackdriver Trace to trace a request across services, observe, and optimize performance.
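To make the Pub/Sub module above concrete, here is a minimal sketch of publishing a message to a Cloud Pub/Sub topic from Python. It is not the course's lab code; the project and topic names are hypothetical placeholders, and it assumes the google-cloud-pubsub client library and valid credentials.

```python
# Minimal sketch, not course lab code: publish a message to a Cloud Pub/Sub topic.
# PROJECT_ID and TOPIC_ID are hypothetical placeholders.
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"
TOPIC_ID = "orders"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# publish() returns a future; result() blocks until the server acknowledges
# the message and returns the server-assigned message ID.
future = publisher.publish(topic_path, data=b"order created", order_id="1234")
print(f"Published message ID: {future.result()}")
```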

Developing Applications with Google Cloud
Delivered Online | Flexible Dates
Price on Enquiry

Symantec Data Loss Prevention 14.0 - Administration

By Nexus Human

Duration: 5 days (30 CPD hours)

Audience: This course is intended for anyone responsible for configuring, maintaining, and troubleshooting Symantec Data Loss Prevention. Additionally, this course is intended for technical users responsible for creating and maintaining Symantec Data Loss Prevention policies and the incident response structure.

Overview: At the completion of the course, you will be able to configure and administer the Enforce server, detection servers, and DLP Agents, as well as reporting, workflow, incident response management, policy management and detection, response management, user and role administration, directory integration, and filtering.

This course is designed to provide you with the fundamental knowledge to configure and administer the Symantec Data Loss Prevention Enforce platform.

Course outline:
• Introduction to Symantec Data Loss Prevention: Symantec Data Loss Prevention overview. Symantec Data Loss Prevention architecture.
• Navigation and Reporting: Navigating the user interface. Reporting and analysis. Report navigation, preferences, and features. Report filters. Report commands. Incident snapshot. Incident data access. Hands-on labs: Become familiar with navigation and tools in the user interface. Create, filter, summarize, customize, and distribute reports. Create users, roles, and attributes.
• Incident Remediation and Workflow: Incident remediation and workflow. Managing users and attributes. Custom attribute lookup. User Risk Summary. Hands-on labs: Remediate incidents and configure a user's reporting preferences.
• Policy Management: Policy overview. Creating policy groups. Using policy templates. Building policies. Policy development best practices. Hands-on labs: Use policy templates and policy builder to configure and apply new policies.
• Response Rule Management: Response rule overview. Configuring Automated Response rules. Configuring Smart Response rules. Response rule best practices. Hands-on labs: Create and use Automated and Smart Response rules.
• Described Content Matching: DCM detection methods. Hands-on labs: Create policies that include DCM and then use those policies to capture incidents.
• Exact Data Matching and Directory Group Matching: Exact data matching (EDM). Advanced EDM. Directory group matching (DGM). Hands-on labs: Create policies that include EDM and DGM, and then use those policies to capture incidents.
• Indexed Document Matching: Indexed document matching (IDM). Hands-on labs: Create policies that include IDM rules and then use those policies to capture incidents.
• Vector Machine Learning: Vector Machine Learning (VML). Hands-on labs: Create a VML profile, import document sets, and create a VML policy.
• Network Monitor: Review of Network Monitor. Protocols. Traffic filtering. Network Monitor best practices. Hands-on labs: Apply IP and L7 filters.
• Network Prevent: Network Prevent overview. Introduction to Network Prevent (Email). Introduction to Network Prevent (Web). Hands-on labs: Configure Network Prevent (Email) response rules, incorporate them into policies, and use the policies to capture incidents.
• Mobile Email Monitor and Mobile Prevent: Introduction to Mobile Email Monitor. Mobile Prevent overview. Configuration. VPN configuration. Policy and response rule creation. Reporting and remediation. Troubleshooting.
• Network Discover and Network Protect: Network Discover and Network Protect overview. Configuring Discover targets. Configuring Box cloud targets. Protecting data. Auto-discovery of servers and shares. Running and managing scans. Reports and remediation. Network Discover and Network Protect best practices. Hands-on labs: Create and run a filesystem target using various response rules, including quarantining.
• Endpoint Prevent: Endpoint Prevent overview. Detection capabilities at the endpoint. Configuring Endpoint Prevent. Creating Endpoint response rules. Viewing Endpoint Prevent incidents. Endpoint Prevent best practices. Managing DLP Agents. Hands-on labs: Create Agent Groups and Endpoint response rules, monitor and block endpoint actions, view Endpoint incidents, and use the Enforce console to manage DLP Agents.
• Endpoint Discover: Endpoint Discover overview. Creating and running Endpoint Discover targets. Using Endpoint Discover reports and reporting features. Hands-on labs: Create Endpoint Discover targets, run Endpoint Discover targets, and view Endpoint Discover incidents.
• Enterprise Enablement: Preparing for risk reduction. Risk reduction. DLP Maturity model.
• System Administration: Server administration. Language support. Incident delete. Credential management. Troubleshooting. Diagnostic tools. Troubleshooting scenario. Getting support. Hands-on labs: Interpret event reports and traffic reports, configure alerts, and use the Log Collection and Configuration tool.

Additional course details: Nexus Humans Symantec Data Loss Prevention 14.0 - Administration training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the Symantec Data Loss Prevention 14.0 - Administration course and one of our Top 10, we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.

Symantec Data Loss Prevention 14.0 - Administration
Delivered Online | Flexible Dates
Price on Enquiry

MB-260T00: Microsoft Customer Data Platform Specialty

By Nexus Human

Duration: 4 days (24 CPD hours)

Audience: Candidates should be familiar with Dynamics 365 Customer Insights and have firsthand experience with one or more additional Dynamics 365 apps, Power Query, Microsoft Dataverse, Common Data Model, and Microsoft Power Platform. They should also have working knowledge of practices related to privacy, compliance, consent, security, responsible AI, and data retention policy.

Overview: After completing this course, you will be able to clean, transform, and ingest data into Dynamics 365 Customer Insights; create a unified customer profile; work with Dynamics 365 Audience insights; enrich data and predictions; set up and manage external connections; and administer and monitor Customer Insights.

Customer Data Platform specialists implement solutions that provide insight into customer profiles and that track engagement activities to help improve customer experiences and increase customer retention. In this course, students will learn about the Dynamics 365 Customer Insights solution, including how to unify customer data with prebuilt connectors, predict customer intent with rich segmentation, and maintain control of customer data. This specialty course starts with creating a unified profile and then working with customer data.

Course outline:
• Module 1: Get started with Dynamics 365 Customer Insights: Introduction to the customer data platform. Administer Dynamics 365 Customer Insights. Explore user permissions in Dynamics 365 Customer Insights.
• Module 2: Ingest data into Dynamics 365 Customer Insights: Import and transform data. Connect to data sources. Work with data.
• Module 3: Create a unified customer profile in Dynamics 365 Customer Insights: Map data. Match data. Merge data. Find customers.
• Module 4: Work with Dynamics 365 Customer Insights: Explore Audience insights. Define relationships and activities. Work with measures. Work with segments.
• Module 5: Enrich data and predictions with Audience insights: Enrich data. Use predictions. Use machine learning models.
• Module 6: Manage external connections with Customer Data Platform: Export Customer Insights data. Use Customer Insights with Microsoft Power Platform. Display Customer Insights data in Dynamics 365 apps. More ways to extend Customer Insights.

MB-260T00: Microsoft Customer Data Platform Specialty
Delivered Online | Flexible Dates
Price on Enquiry

Advanced Programming Techniques with Python v1.2

By Nexus Human

Duration: 3 days (18 CPD hours)

Audience: This course is designed for existing Python programmers who have at least one year of Python experience and who want to expand their programming proficiency in Python 3.

Overview: In this course, you will expand your Python proficiencies. You will: select an object-oriented programming approach for Python applications; create object-oriented Python applications; create a desktop application; create data-driven applications; create and secure web service-connected applications; program Python for data science; implement unit testing and exception handling; and package an application for distribution.

Python continues to be a popular programming language, perhaps owing to its easy learning curve, small code footprint, and versatility for business, web, and scientific uses. Python is useful for developing custom software tools, applications, web services, and cloud applications. In this course, you'll build upon your basic Python skills, learning more advanced topics such as object-oriented programming patterns, development of graphical user interfaces, data management, creating web service-connected apps, performing data science tasks, unit testing, and creating and installing packages and executable applications.

Course outline:
• Lesson 1: Selecting an Object-Oriented Programming Approach for Python Applications. Topic A: Implement Object-Oriented Design. Topic B: Leverage the Benefits of Object-Oriented Programming.
• Lesson 2: Creating Object-Oriented Python Applications. Topic A: Create a Class. Topic B: Use Built-in Methods. Topic C: Implement the Factory Design Pattern.
• Lesson 3: Creating a Desktop Application. Topic A: Design a Graphical User Interface (GUI). Topic B: Create Interactive Applications.
• Lesson 4: Creating Data-Driven Applications. Topic A: Connect to Data. Topic B: Store, Update, and Delete Data in a Database.
• Lesson 5: Creating and Securing a Web Service-Connected App. Topic A: Select a Network Application Protocol. Topic B: Create a RESTful Web Service. Topic C: Create a Web Service Client. Topic D: Secure Connected Applications.
• Lesson 6: Programming Python for Data Science. Topic A: Clean Data with Python. Topic B: Visualize Data with Python. Topic C: Perform Linear Regression with Machine Learning.
• Lesson 7: Implementing Unit Testing and Exception Handling. Topic A: Handle Exceptions. Topic B: Write a Unit Test. Topic C: Execute a Unit Test.
• Lesson 8: Packaging an Application for Distribution. Topic A: Create and Install a Package. Topic B: Generate Alternative Distribution Files.
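As a rough illustration of Lesson 2, Topic C (Implement the Factory Design Pattern), the sketch below shows one common shape such a factory can take in Python. It is not the course's lab code; the serializer classes and format names are made up for the example.

```python
# Minimal sketch, not course lab code: a factory function that hides concrete
# classes behind a single creation point.
import csv
import io
import json


class JsonSerializer:
    def serialize(self, record: dict) -> str:
        return json.dumps(record)


class CsvSerializer:
    def serialize(self, record: dict) -> str:
        buffer = io.StringIO()
        writer = csv.DictWriter(buffer, fieldnames=record.keys())
        writer.writeheader()
        writer.writerow(record)
        return buffer.getvalue()


def make_serializer(fmt: str):
    """Factory: callers ask for a format name and never touch the concrete classes."""
    serializers = {"json": JsonSerializer, "csv": CsvSerializer}
    try:
        return serializers[fmt]()
    except KeyError:
        raise ValueError(f"Unknown format: {fmt}") from None


if __name__ == "__main__":
    record = {"name": "Ada", "role": "engineer"}
    for fmt in ("json", "csv"):
        print(make_serializer(fmt).serialize(record))
```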

Advanced Programming Techniques with Python v1.2
Delivered Online | Flexible Dates
Price on Enquiry

Advanced Programming Techniques with Python (v1.1)

By Nexus Human

Duration: 3 days (18 CPD hours)

Audience: This course is designed for existing Python programmers who have at least one year of Python experience and who want to expand their programming proficiency in Python 3.

Overview: In this course, you will expand your Python proficiencies. You will: select an object-oriented programming approach for Python applications; create object-oriented Python applications; create a desktop application; create data-driven applications; create and secure web service-connected applications; program Python for data science; implement unit testing and exception handling; and package an application for distribution.

Python continues to be a popular programming language, perhaps owing to its easy learning curve, small code footprint, and versatility for business, web, and scientific uses. Python is useful for developing custom software tools, applications, web services, and cloud applications. In this course, you'll build upon your basic Python skills, learning more advanced topics such as object-oriented programming patterns, development of graphical user interfaces, data management, creating web service-connected apps, performing data science tasks, unit testing, and creating and installing packages and executable applications.

Course outline:
• Lesson 1: Selecting an Object-Oriented Programming Approach for Python Applications. Topic A: Implement Object-Oriented Design. Topic B: Leverage the Benefits of Object-Oriented Programming.
• Lesson 2: Creating Object-Oriented Python Applications. Topic A: Create a Class. Topic B: Use Built-in Methods. Topic C: Implement the Factory Design Pattern.
• Lesson 3: Creating a Desktop Application. Topic A: Design a Graphical User Interface (GUI). Topic B: Create Interactive Applications.
• Lesson 4: Creating Data-Driven Applications. Topic A: Connect to Data. Topic B: Store, Update, and Delete Data in a Database.
• Lesson 5: Creating and Securing a Web Service-Connected App. Topic A: Select a Network Application Protocol. Topic B: Create a RESTful Web Service. Topic C: Create a Web Service Client. Topic D: Secure Connected Applications.
• Lesson 6: Programming Python for Data Science. Topic A: Clean Data with Python. Topic B: Visualize Data with Python. Topic C: Perform Linear Regression with Machine Learning.
• Lesson 7: Implementing Unit Testing and Exception Handling. Topic A: Handle Exceptions. Topic B: Write a Unit Test. Topic C: Execute a Unit Test.
• Lesson 8: Packaging an Application for Distribution. Topic A: Create and Install a Package. Topic B: Generate Alternative Distribution Files.
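As a rough illustration of Lesson 7 (Implementing Unit Testing and Exception Handling), the sketch below pairs a small function that validates its input with a unittest test case covering both the normal path and the error path. It is not the course's lab code; the function and test names are made up for the example.

```python
# Minimal sketch, not course lab code: exception handling plus a unittest test case.
import unittest


def average(values):
    """Return the arithmetic mean, raising ValueError on an empty sequence."""
    if not values:
        raise ValueError("average() requires at least one value")
    return sum(values) / len(values)


class AverageTests(unittest.TestCase):
    def test_average_of_numbers(self):
        self.assertAlmostEqual(average([2, 4, 6]), 4.0)

    def test_empty_sequence_raises(self):
        # assertRaises verifies that the error path is taken for bad input.
        with self.assertRaises(ValueError):
            average([])


if __name__ == "__main__":
    unittest.main()
```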

Advanced Programming Techniques with Python (v1.1)
Delivered Online | Flexible Dates
Price on Enquiry

SAS Programming 1 - Essentials

By Nexus Human

Duration: 3 days (18 CPD hours)

Audience: Anyone starting to write SAS programs.

Overview: Use SAS Studio and SAS Enterprise Guide to write and submit SAS programs. Access SAS, Microsoft Excel, and text data. Explore and validate data. Prepare data by subsetting rows and computing new columns. Analyze and report on data. Export data and results to Excel, PDF, and other formats. Use SQL in SAS to query and join tables.

This course is for users who want to learn how to write SAS programs to access, explore, prepare, and analyze data. It is the entry point to learning SAS programming for data science, machine learning, and artificial intelligence.

Course outline:
• Essentials: The SAS programming process. Using SAS programming tools. Understanding SAS syntax.
• Accessing Data: Understanding SAS data. Accessing data through libraries. Importing data into SAS.
• Exploring and Validating Data: Exploring data. Filtering rows. Formatting columns. Sorting data and removing duplicates.
• Preparing Data: Reading and filtering data. Computing new columns. Conditional processing.
• Analyzing and Reporting on Data: Enhancing reports with titles, footnotes, and labels. Creating frequency reports. Creating summary statistics reports.
• Exporting Results: Exporting data. Exporting reports.
• Using SQL in SAS: Using Structured Query Language in SAS. Joining tables using SQL in SAS.

Additional course details: Nexus Humans SAS Programming 1 - Essentials training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the SAS Programming 1 - Essentials course and one of our Top 10, we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.

SAS Programming 1 - Essentials
Delivered Online | Flexible Dates
Price on Enquiry

Building Data Analytics Solutions Using Amazon Redshift

By Nexus Human

Duration: 1 day (6 CPD hours)

Audience: This course is intended for data warehouse engineers, data platform engineers, and architects and operators who build and manage data analytics pipelines.

Prerequisites: Completed either AWS Technical Essentials or Architecting on AWS. Completed Building Data Lakes on AWS.

Overview: In this course, you will learn to compare the features and benefits of data warehouses, data lakes, and modern data architectures; design and implement a data warehouse analytics solution; identify and apply appropriate techniques, including compression, to optimize data storage; select and deploy appropriate options to ingest, transform, and store data; choose the appropriate instance and node types, clusters, auto scaling, and network topology for a particular business use case; understand how data storage and processing affect the analysis and visualization mechanisms needed to gain actionable business insights; secure data at rest and in transit; monitor analytics workloads to identify and remediate problems; and apply cost management best practices.

In this course, you will build a data analytics solution using Amazon Redshift, a cloud data warehouse service. The course focuses on the data collection, ingestion, cataloging, storage, and processing components of the analytics pipeline. You will learn to integrate Amazon Redshift with a data lake to support both analytics and machine learning workloads. You will also learn to apply security, performance, and cost management best practices to the operation of Amazon Redshift.

Course outline:
• Module A: Overview of Data Analytics and the Data Pipeline: Data analytics use cases. Using the data pipeline for analytics.
• Module 1: Using Amazon Redshift in the Data Analytics Pipeline: Why Amazon Redshift for data warehousing? Overview of Amazon Redshift.
• Module 2: Introduction to Amazon Redshift: Amazon Redshift architecture. Interactive Demo 1: Touring the Amazon Redshift console. Amazon Redshift features. Practice Lab 1: Load and query data in an Amazon Redshift cluster.
• Module 3: Ingestion and Storage: Ingestion. Interactive Demo 2: Connecting your Amazon Redshift cluster using a Jupyter notebook with Data API. Data distribution and storage. Interactive Demo 3: Analyzing semi-structured data using the SUPER data type. Querying data in Amazon Redshift. Practice Lab 2: Data analytics using Amazon Redshift Spectrum.
• Module 4: Processing and Optimizing Data: Data transformation. Advanced querying. Practice Lab 3: Data transformation and querying in Amazon Redshift. Resource management. Interactive Demo 4: Applying mixed workload management on Amazon Redshift. Automation and optimization. Interactive Demo 5: Amazon Redshift cluster resizing from the dc2.large to ra3.xlplus cluster.
• Module 5: Security and Monitoring of Amazon Redshift Clusters: Securing the Amazon Redshift cluster. Monitoring and troubleshooting Amazon Redshift clusters.
• Module 6: Designing Data Warehouse Analytics Solutions: Data warehouse use case review. Activity: Designing a data warehouse analytics workflow.
• Module B: Developing Modern Data Architectures on AWS: Modern data architectures.
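As a rough illustration of the kind of interaction shown in Interactive Demo 2 above (connecting to a cluster with the Data API), here is a minimal sketch that runs a SQL statement through the Redshift Data API with boto3. It is not the course's lab code; the cluster identifier, database, user, and table names are hypothetical placeholders, and it assumes boto3 and AWS credentials are already configured.

```python
# Minimal sketch, not course lab code: run a query via the Redshift Data API.
# Cluster, database, user, and table names below are placeholders.
import time

import boto3

client = boto3.client("redshift-data")

response = client.execute_statement(
    ClusterIdentifier="analytics-cluster",
    Database="dev",
    DbUser="awsuser",
    Sql="SELECT venuecity, COUNT(*) FROM venue GROUP BY venuecity LIMIT 10;",
)
statement_id = response["Id"]

# The Data API is asynchronous: poll until the statement reaches a terminal state.
while True:
    status = client.describe_statement(Id=statement_id)["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(1)

if status == "FINISHED":
    for row in client.get_statement_result(Id=statement_id)["Records"]:
        # Each column is a dict like {"stringValue": ...} or {"longValue": ...}.
        print([list(col.values())[0] for col in row])
```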

Building Data Analytics Solutions Using Amazon Redshift
Delivered Online | Flexible Dates
Price on Enquiry

CertNexus Data Ethics for Business Professionals (DEBIZ)

By Nexus Human

Duration: 1 day (6 CPD hours)

Audience: This course is designed for business leaders and decision makers, including C-level executives, project and product managers, HR leaders, Marketing and Sales leaders, and technical sales consultants, who have a vested interest in the representation of ethical values in technology solutions. Other individuals who want to know more about data ethics are also candidates for this course. This course is also designed to assist learners in preparing for the CertNexus DEBIZ™ (Exam DEB-110) credential.

The power of extracting value from data utilizing Artificial Intelligence, Data Science, and Machine Learning exposes the learning differences between humans and machines. Humans can apply ethical principles throughout the decision-making process to avoid discrimination, societal harm, and marginalization, and to maintain and even enhance acceptable norms. Machines make decisions autonomously. So how do we train them to apply ethical principles as they learn from the decisions they make? This course provides business professionals and consumers of technology with core concepts of ethical principles, how they can be applied to emerging data-driven technologies, and the impact to an organization which ignores ethical use of technology.

Course outline:
• Introduction to Data Ethics: Defining Data Ethics. The Case for Data Ethics. Identifying Ethical Issues. Improving Ethical Data Practices.
• Ethical Principles: Ethical Frameworks. Data Privacy. Accountability. Transparency and Explainability. Human-Centered Values and Fairness. Inclusive Growth, Sustainable Development, and Well-Being. Applying Ethical Principles to Emerging Technology.
• Improving Ethical Data Practices: Sources of Ethical Risk. Mitigating Bias. Mitigating Discrimination. Safety and Security. Mitigating Negative Outputs. Data Surveillance. Assessing Risk. Ethical Risks in Sharing Data. Applying Professional Critical Judgement.
• Business Considerations: Data Legislation. Impact of Social and Behavioral Effects. Trustworthiness. Impact on Business Reputation. Organizational Values and the Data Value Chain. Building a Data Ethics Culture / Code of Ethics. Balancing Organizational Goals with Ethical Practice.

Additional course details: Nexus Humans CertNexus Data Ethics for Business Professionals (DEBIZ) training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're stepping into the realm of professional skills or a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for the CertNexus Data Ethics for Business Professionals (DEBIZ) course and one of our Top 10, we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.

CertNexus Data Ethics for Business Professionals (DEBIZ)
Delivered Online | Flexible Dates
Price on Enquiry