Course Overview:

In this course, the students will implement various data platform technologies into solutions that are in line with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.

The students will also explore how to implement data security, including authentication, authorization, data policies, and standards. They will also define and implement data solution monitoring for both the data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to deploy cloud technologies and to effectively manage security in the cloud environment.

Attendees of DP-200: Implementing an Azure Data Solution will receive TechNow-approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 4 days

Course Outline:

  • Azure for the Data Engineer
  • Working with Data Storage
  • Enabling Team Based Data Science with Azure Databricks
  • Building Globally Distributed Databases with Cosmos DB
  • Working with Relational Data Stores in the Cloud
  • Performing Real-Time Analytics with Stream Analytics
  • Orchestrating Data Movement with Azure Data Factory
  • Securing Azure Data Platforms
  • Monitoring and Troubleshooting Data Storage and Processing

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following course:

      • AZ-900: Microsoft Azure Fundamentals


Course Overview:

Through an introduction to Docker, Kubernetes, and Red Hat OpenShift Container Platform, this training course helps you understand one of the key tenets of the DevOps and DevSecOps movement: continuous integration and continuous deployment (CI/CD). You will learn how the CI/CD pipeline is understood and implemented in an open architecture. Containers have become a key technology for the configuration and deployment of applications and microservices, and Kubernetes is the container orchestration platform that provides foundational services in Red Hat OpenShift Container Platform, allowing enterprises to manage container deployments and scale their applications.

This training course provides an overview of the DoD Enterprise DevSecOps Platform (DSOP) Reference Design, its current state, and its ties to DoD Cloud Platform One (P1). Workflows of the DoD Iron Bank container repository are introduced, along with an overview of the DoD Pipeline as represented in Big Bang. Continuous authorization (cATO) via Party Bus within the NIST Risk Management Framework (RMF) is also presented. You will become aware of the Platform One (P1) integrations and their relationship to Docker, Kubernetes, Istio (Red Hat OpenShift Service Mesh), and Red Hat OpenShift Container Platform.

In addition to gaining an understanding of these tools, you will build core administration skills through the installation, configuration, and management of an OpenShift cluster and containerized applications.

Course Objectives:

  • Learn about Containers, Docker, Kubernetes, and OpenShift architecture
  • Review the DoD Enterprise DevSecOps Platform (DSOP) Reference Design and DoD Cloud Platform One (P1)
  • Understand the various DoD cloud offerings and how they relate to one another
  • Create containerized services
  • Manage containers and container images
  • Deploy multi-container applications
  • Install an OpenShift cluster
  • Configure and manage masters and nodes
  • Secure OpenShift
  • Control access to resources on OpenShift
  • Monitor and collect metrics on OpenShift
  • Deploy applications on OpenShift using source-to-image (S2I)
  • Manage storage on OpenShift

Course Outline:

  • Getting started with container technology
  • Creating containerized services
  • Managing containers
  • Managing container images
  • Creating custom container images
  • Deploying containerized applications on OpenShift
  • Deploying multi-container applications
  • Troubleshooting containerized applications
  • Comprehensive review of containers, Kubernetes, and Red Hat OpenShift
  • Introducing Red Hat OpenShift Container Platform
  • Installing OpenShift Container Platform
  • Describing and exploring OpenShift networking concepts
  • Executing commands
  • Controlling access to OpenShift resources
  • Allocating persistent storage
  • Managing application deployments
  • Installing and configuring the metrics subsystem
  • Managing and monitoring OpenShift Container Platform

Dates/Locations:

No Events

Duration: 5 Days

Prerequisites:

  • Ability to use a Linux® terminal session and issue operating system commands
  • Good foundation in Linux
  • Experience with web application architectures and their corresponding technologies

Target Audience:

  • Developers who wish to containerize software applications
  • Administrators who are new to container technology and container orchestration
  • Architects who are considering using container technologies in software architectures
  • System administrators
  • System architects
  • Architects and developers who want to install and configure OpenShift Container Platform
  • Those working in the field of DevSecOps supporting DoD Platform One (P1) and other implementations


Course Overview:

In this course, the students will design various data platform technologies into solutions that are in line with business and technical requirements. These can include on-premises, cloud, and hybrid data scenarios that incorporate relational, NoSQL, or data warehouse data. They will also learn how to design process architectures using a range of technologies for both streaming and batch data.

The students will also explore how to design data security, including data access, data policies, and standards. Finally, they will design Azure data solutions, including the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to deploy cloud technologies and to effectively manage security in the cloud environment.

Attendees of DP-201: Designing an Azure Data Solution will receive TechNow-approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 3 days

Course Outline:

  • Data Platform Architecture Considerations
  • Azure Batch Processing Reference Architectures
  • Azure Real-Time Reference Architectures
  • Data Platform Security Design Considerations
  • Designing for Resiliency and Scale
  • Designing for Efficiency and Operations

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:

      • AZ-900: Microsoft Azure Fundamentals
      • DP-200: Implementing an Azure Data Solution



After you press "Request Registration" near the bottom of this form, status will be provided at the bottom of the form within 30 seconds, and you will also be contacted by phone for credit card information.

TechNow is pleased to have the opportunity to provide you with training for "Windows Security Automation and Threat Hunting with PowerShell" at CheddarCon 2018!

Scroll down to see the course description.



Windows Security Automation and Threat Hunting with PowerShell Seminar

Location: 400 W Wisconsin Ave, Milwaukee, WI 53203, USA

Date: October 10, 2018 8:00am – 4:00pm

Duration: 8 hours

Audience: Cyber Security professionals and Windows administrators

Attendee Environment: Laptops are not required but are suggested for better hands-on absorption of the subject matter.

Description:
PowerShell is both a command-line shell and a scripting language. Fight fires quickly using existing or custom PowerShell commands or scripts at the shell. PowerShell is made for Security Operations (SecOps) automation on Windows. This seminar does not require prior programming skills. It focuses on PowerShell programming, giving a beginner the skills to be productive in Windows scripting, both to automate tasks and to remediate problems.

Cyber security is the objective of this seminar, and the PowerShell examples will demonstrate capabilities that help lock down a Windows system and report its security status.
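
As a flavor of the kind of commands covered, here is a minimal, illustrative example of reporting security status from the shell. It assumes a Windows host where the Defender and NetSecurity modules are available and is not taken from the seminar materials:

    # Report Microsoft Defender status: engine, real-time protection, signature age
    Get-MpComputerStatus |
        Select-Object AMServiceEnabled, RealTimeProtectionEnabled, AntivirusSignatureLastUpdated

    # Report whether each Windows Firewall profile is enabled
    Get-NetFirewallProfile | Select-Object Name, Enabled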

Objectives:

PowerShell Overview

  • Getting started running commands
  • Security cmdlets
  • Using and updating the built-in help
  • Execution policies
  • Fun tricks with the ISE graphical editor
  • Piping .NET and COM objects, not text (see the short sketch after this list)
  • Using properties and methods of objects
  • Helping Linux admins feel more at home
  • Aliases, cmdlets, functions, modules, etc.
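
A minimal sketch of the object pipeline idea (illustrative only, not the seminar's lab material, and assuming Windows PowerShell 5.0 or later for the StartType property): cmdlets emit .NET objects, so you filter on properties and call methods instead of parsing text.

    # List automatic-start services that are currently running, using object properties
    Get-Service |
        Where-Object { $_.Status -eq 'Running' -and $_.StartType -eq 'Automatic' } |
        Select-Object Name, DisplayName

    # Objects also expose methods; for example, a ServiceController object can be stopped directly:
    #   (Get-Service -Name 'Spooler').Stop()

    # The built-in help system documents every cmdlet, including examples
    Get-Help Get-Service -Examples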

PowerShell Utilities and Tips

  • Customizing your profile script
  • PowerShell remote command execution
  • Security settings across the network
  • File copy via PowerShell remoting
  • Capturing the output of commands
  • Parsing text files and logs with regex patterns
  • Parsing Security Logs
  • Searching remote event logs (see the sketch after this list)
  • Mounting the registry as a drive
  • Security settings in the Registry
  • Exporting data to CSV, HTML and JSON files
  • Running scripts as scheduled jobs
  • Continued Security Compliance
  • Pushing out scripts through Group Policy
  • Importing modules and dot-sourcing functions
  • http://www.PowerShellGallery.com
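
As a rough sketch of how several of these topics fit together (querying an event log, selecting object properties, and exporting the results), the following assumes an elevated session on a Windows host; the event ID and file name are chosen only for illustration:

    # Pull the 50 most recent failed-logon events (ID 4625) from the Security log
    $failedLogons = Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4625 } -MaxEvents 50

    # Keep a few useful properties and export them for reporting
    $failedLogons |
        Select-Object TimeCreated, Id, MachineName |
        Export-Csv -Path .\FailedLogons.csv -NoTypeInformation

    # The same query can target another machine with -ComputerName,
    # or be wrapped in Invoke-Command to run over PowerShell remoting.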

PowerShell Scripting

  • PowerShell Scripting to implement Security Practices
  • Writing your own functions to automate security status and settings
  • Passing arguments into your scripts
  • Function parameters and returning output
  • Flow control: if/then and foreach constructs that make security decisions
  • How to pipe data in and out of your scripts for security compliance and reporting (see the function sketch after this list)
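
To give a feel for the scripting style, here is a short, hypothetical function (the function name and the services queried are chosen for illustration and are not part of the official courseware) showing parameters, flow control, and pipeline input and output:

    # Hypothetical example: report whether named Windows services are running
    function Test-ServiceCompliance {
        param(
            [Parameter(Mandatory = $true, ValueFromPipeline = $true)]
            [string]$Name
        )
        process {
            $svc = Get-Service -Name $Name -ErrorAction SilentlyContinue
            if ($null -eq $svc) {
                # Service not installed on this host
                [pscustomobject]@{ Name = $Name; Status = 'NotInstalled'; Compliant = $false }
            }
            elseif ($svc.Status -eq 'Running') {
                [pscustomobject]@{ Name = $Name; Status = 'Running'; Compliant = $true }
            }
            else {
                [pscustomobject]@{ Name = $Name; Status = $svc.Status; Compliant = $false }
            }
        }
    }

    # Usage: pipe service names in, pipe compliance objects out to a CSV report
    'WinDefend', 'EventLog' | Test-ServiceCompliance |
        Export-Csv -Path .\ServiceCompliance.csv -NoTypeInformation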

Attendees of this seminar, Windows Security Automation and Threat Hunting with PowerShell, will receive TechNow-approved course materials and expert instruction.