TechNow has heard many students talk about the virtualized/remote training that TechNow does not do.  While teaching our most recent offering of PA-215: Palo Alto Networks Firewall Essentials FastTrack, a student told the story of how he ended up in our course.  We have heard the same story for other technologies, such as Cisco, VMware, and BlueCoat products.

A large percentage of training is moving to virtualized/remote lab environments, where students are asked to use some variant of remote access software to connect to the training company's lab.  The student in our Palo Alto Networks Firewall course told us that he had attended a very costly offering of that course from the vendor and was not able to perform any labs.  There were network connectivity issues, problems with the remote access software, and other failures.  The whole training experience was frustrating and unproductive.

We keep our labs open to students who would like after-hours or before-hours access.  Repeatedly working through a lab ingrains that knowledge for later recall.  Touching hardware is critical to understanding the problems that arise when a cable comes loose or gets plugged into the wrong port.  There are other scenarios as well, such as pulling a power cable, turning off a power strip, or accidentally overwriting a configuration.  These disaster scenarios require hands-on physical access to hardware.  Preventing and recovering from disasters is what it is all about, and that requires hands-on, instructor-led training on real hardware.

 

Course Overview:

The Introduction to SQL Databases training course is designed to teach learners the fundamentals of database concepts. You will learn about the different types of databases, database languages, and database designs, and describe important database concepts using SQL Server 2016. Anyone who is moving into a database role will benefit from taking this course.
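
To give a feel for the fundamentals this course covers (tables, rows, and basic queries), here is a minimal illustrative sketch in Python that talks to a SQL Server database through the pyodbc library. The server, database, driver string, and sample data are placeholder assumptions for the example, not part of the official course materials.

  import pyodbc  # third-party library: pip install pyodbc

  # Placeholder connection details -- substitute your own server, database, and credentials.
  conn = pyodbc.connect(
      "DRIVER={ODBC Driver 17 for SQL Server};"
      "SERVER=localhost;DATABASE=TrainingDB;Trusted_Connection=yes;"
  )
  cursor = conn.cursor()

  # A table is a set of rows that all share the same columns.
  cursor.execute(
      "CREATE TABLE Students (StudentID INT PRIMARY KEY, FullName NVARCHAR(100))"
  )
  cursor.execute(
      "INSERT INTO Students (StudentID, FullName) VALUES (?, ?)", 1, "Ada Lopez"
  )
  conn.commit()

  # A query filters and returns rows from one or more tables.
  for row in cursor.execute("SELECT StudentID, FullName FROM Students"):
      print(row.StudentID, row.FullName)

  conn.close()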

 

Attendees to MS-5002: Introduction to SQL Databases will receive TechNow approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 2 days

 


Course Overview:

Through an introduction to Docker, Kubernetes, and Red Hat OpenShift Container Platform, this training course helps you understand one of the key tenets of the DevOps and DevSecOps Platform (DSOP) movement: continuous integration and continuous deployment (CI/CD). The CI/CD pipeline becomes well understood and is implemented in an open architecture.  Containers have become a key technology for the configuration and deployment of applications and microservices. Kubernetes is a container orchestration platform that provides foundational services in Red Hat OpenShift Container Platform, allowing enterprises to manage container deployments and scale their applications using Kubernetes.
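
To make the orchestration idea concrete, the short sketch below uses the official Kubernetes Python client to define a small Deployment and then scale it up, the kind of operation OpenShift builds on. It is a minimal illustration under stated assumptions: the image name, namespace, and replica counts are placeholders, the kubernetes package must be installed, and a valid kubeconfig must already be present.

  from kubernetes import client, config  # pip install kubernetes

  # Load credentials from the local kubeconfig (for example, after an "oc login").
  config.load_kube_config()
  apps = client.AppsV1Api()

  # Describe a Deployment: three replicas of a single-container pod.
  deployment = client.V1Deployment(
      api_version="apps/v1",
      kind="Deployment",
      metadata=client.V1ObjectMeta(name="hello-web"),
      spec=client.V1DeploymentSpec(
          replicas=3,
          selector=client.V1LabelSelector(match_labels={"app": "hello-web"}),
          template=client.V1PodTemplateSpec(
              metadata=client.V1ObjectMeta(labels={"app": "hello-web"}),
              spec=client.V1PodSpec(
                  containers=[
                      client.V1Container(
                          name="web",
                          image="registry.example.com/hello-web:1.0",  # placeholder image
                          ports=[client.V1ContainerPort(container_port=8080)],
                      )
                  ]
              ),
          ),
      ),
  )

  # Create the Deployment, then scale it from 3 to 5 replicas.
  apps.create_namespaced_deployment(namespace="default", body=deployment)
  apps.patch_namespaced_deployment_scale(
      name="hello-web", namespace="default", body={"spec": {"replicas": 5}}
  )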

This training course provides an overview of the DoD Enterprise DevSecOps Platform (DSOP) Reference Design, its current state, and its ties to DoD Cloud Platform One (P1). Workflows of the DoD Iron Bank container repository are introduced, along with an overview of the DoD Pipeline as represented in Big Bang.  Continuous authorization (cATO) via Party Bus within the NIST Risk Management Framework (RMF) is presented. You will become aware of the Platform One (P1) integrations and their relationship to Docker, Kubernetes, Istio (Red Hat OpenShift Service Mesh), and Red Hat OpenShift Container Platform.

In addition to gaining an understanding of these tools, you will build core administration skills through the installation, configuration, and management of an OpenShift cluster and containerized applications.

Course Objectives:

  • Learn about Containers, Docker, Kubernetes, and OpenShift architecture
  • Overview DoD Enterprise DevSecOps Platform (DSOP) Reference Design and DoD Cloud Platform One (P1)
  • Tie together awareness of various DoD Cloud offerings and their relationships
  • Create containerized services
  • Manage containers and container images
  • Deploy multi-container applications
  • Install an OpenShift cluster
  • Configure and manage masters and nodes
  • Secure OpenShift
  • Control access to resources on OpenShift
  • Monitor and collect metrics on OpenShift
  • Deploy applications on OpenShift using source-to-image (S2I)
  • Manage storage on OpenShift

Course Outline:

  • Getting started with container technology
  • Creating containerized services
  • Managing containers
  • Managing container images
  • Creating custom container images
  • Deploying containerized applications on OpenShift
  • Deploying multi-container applications
  • Troubleshooting containerized applications
  • Comprehensive review of containers, Kubernetes, and Red Hat OpenShift
  • Introducing Red Hat OpenShift Container Platform
  • Installing OpenShift Container Platform
  • Describing and exploring OpenShift networking concepts
  • Executing commands
  • Controlling access to OpenShift resources
  • Allocating persistent storage
  • Managing application deployments
  • Installing and configuring the metrics subsystem
  • Managing and monitoring OpenShift Container Platform

Dates/Locations:

No Events

Duration: 5 Days

Prerequisites:

  • Ability to use a Linux® terminal session and issue operating system commands
  • Good foundation in Linux
  • Experience with web application architectures and their corresponding technologies

Target Audience:

  • Developers who wish to containerize software applications
  • Administrators who are new to container technology and container orchestration
  • Architects who are considering using container technologies in software architectures
  • System administrators
  • System architects
  • Architects and developers who want to install and configure OpenShift Container Platform
  • Those working in the field of DevSecOps supporting DoD Platform One (P1) and other implementations


Course Overview:

CompTIA Cloud Essentials+ is for both IT and non-technical professionals who require the essential business acumen needed to make informed cloud service decisions.  Cloud Essentials+ is a vendor-neutral credential designed to validate that the candidate understands basic cloud computing terms and definitions, along with the different processes involved in the successful adoption of cloud computing and its implications for an organization.

TechNow is a CompTIA partner and uses official CompTIA Cloud Essentials+ curriculum.

Attendees to CT-213: Cloud Essentials+ will receive TechNow approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 3 days

Course Objectives:

  • Domain 1: Cloud Concepts
    • Understand cloud principles
    • Identify cloud networking concepts & storage techniques
    • Understand cloud design aspects
  • Domain 2: Business Principles of Cloud Environments
    • Identify and employ appropriate cloud assessments like feasibility studies, benchmarking, or gap analysis
    • Highlight key business aspects of vendor relations in cloud adoption, and understand cloud migration approaches
  • Domain 3: Management and Technical Operations
    • Explain aspects of operating within the cloud, such as data management or optimization
    • Understand the role of DevOps in cloud environments, like API integration or provisioning
  • Domain 4: Governance, Risk, Compliance and Security for the Cloud
    • Understand risk management and response concepts related to cloud services
    • Identify the importance and impacts of compliance in the cloud, such as regulatory concerns or international standards

 

Course Prerequisites:

  • CompTIA recommends that a candidate have at least 6 months of experience in an IT environment, with direct involvement in IT-related tasks, responsibilities, and/or decision making.


Course Overview:

In this course, students will design various data platform technologies into solutions that are in line with business and technical requirements. These can include on-premises, cloud, and hybrid data scenarios that incorporate relational, NoSQL, or data warehouse data. Students will also learn how to design process architectures using a range of technologies for both streaming and batch data. They will explore how to design data security, including data access, data policies, and standards, and will design Azure data solutions covering the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.
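
As a small illustration of the batch-versus-streaming distinction the course designs around, the Python sketch below lands a file in Azure Blob Storage for later batch processing and pushes a single event to Azure Event Hubs for streaming ingestion. The connection strings, container name, hub name, and sample payloads are placeholder assumptions, not course lab material.

  from azure.eventhub import EventData, EventHubProducerClient  # pip install azure-eventhub
  from azure.storage.blob import BlobServiceClient              # pip install azure-storage-blob

  # Placeholder connection strings -- real values come from your own Azure resources.
  BLOB_CONN = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
  HUB_CONN = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>"

  # Batch path: land a file in Blob Storage for a scheduled job to process later.
  blob_service = BlobServiceClient.from_connection_string(BLOB_CONN)
  container = blob_service.get_container_client("raw-data")
  with open("daily_sales.csv", "rb") as data:
      container.upload_blob(name="daily_sales.csv", data=data, overwrite=True)

  # Streaming path: push individual events to Event Hubs as they occur.
  producer = EventHubProducerClient.from_connection_string(HUB_CONN, eventhub_name="telemetry")
  batch = producer.create_batch()
  batch.add(EventData('{"sensor": "line-1", "temp_c": 21.4}'))
  producer.send_batch(batch)
  producer.close()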

TechNow has worked in worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to effectively manage security in the cloud environment.

Attendees to DP-201: Designing an Azure Data Solution will receive TechNow approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 3 days

Course Outline:

  • Data Platform Architecture Considerations
  • Azure Batch Processing Reference Architectures
  • Azure Real-Time Reference Architectures
  • Data Platform Security Design Considerations
  • Designing for Resiliency and Scale
  • Design for Efficiency and Operations

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:

  • AZ-900: Microsoft Azure Fundamentals
  • DP-200: Implementing an Azure Data Solution
