Course Overview:

The HCISPP is the only certification that combines cybersecurity skills with privacy best practices and techniques. It demonstrates that you have the knowledge and ability to implement, manage, and assess the security and privacy controls that protect healthcare organizations, using policies and procedures established by the cybersecurity experts at (ISC)2. TechNow's HCISPP Certification Boot Camp is a comprehensive review of healthcare cybersecurity and privacy best practices.

Attendees to TN-8155: HCISPP Certification Preparation Seminar will receive TechNow approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 5 days

Course Objectives:

  • Strategically focus your preparation for HCISPP Certification
  • Cover a broad spectrum of topics in the 7 domains of the HCISPP Common Body of Knowledge (CBK)
  • Gain knowledge on the Healthcare industry including third party relationships and health data management concepts
  • Identify applicable regulations, compliance frameworks, privacy principles and policies to protect information security
  • Develop risk management methodology and identify control assessment procedures

Audience:

  • The HCISPP certification is ideal for security professionals responsible for safeguarding protected health information (PHI). This training course prepares students to manage and implement security controls for healthcare information. HCISPPs are instrumental in a variety of job functions: Compliance Officer, Information Security Manager, Privacy Officer, Compliance Auditor, Risk Analyst, Medical Records Supervisor, IT Manager, Privacy & Security Consultant, and Health Information Manager.

Comments

Latest comments from students


 

Liked the class?  Then let everyone know!

 

Course Overview: 

 

This course shows how business analysts can elicit Agile requirements by writing user stories derived from use cases and customer personas, and then confirm the validity and usability of those requirements to ensure product quality. The Agile Business Analyst has emerged as a newly recognized role alongside the other Agile Framework roles.

Attendees to PM-242: Defining Agile Requirements with User Stories will receive TechNow approved course materials and expert instruction.

Dates/Locations:

No Events

Duration: 2 Days

Course Objectives: At the conclusion of this course, students will be able to:

  • Understand Scrum flow and the core components of the Scrum framework
  • Understand the scope of the Product Owner role in detail
  • Understand the scope of the Agile Business Analysis role in coordination with the Product Owner, Scrum Master and Development Team
  • Understand the scope of the Scrum Master role at a high level
  • Understand the scope of the Scrum Development Team roles
  • Document the interactions between the user of a system and the system itself
  • Understand the Agile principles for requirements, using user stories in the card, conversation, and confirmation format
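The card, conversation, confirmation ("3C") format above can be sketched as a simple data structure. This is an illustrative sketch only; the class and field names below are not part of any Scrum standard.

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """Hypothetical sketch of a user story in the 3C format."""
    # Card: the short "As a ..., I want ..., so that ..." statement
    role: str
    want: str
    benefit: str
    # Conversation: notes captured while discussing the story with stakeholders
    notes: list = field(default_factory=list)
    # Confirmation: acceptance criteria that validate the story is done
    acceptance_criteria: list = field(default_factory=list)

    def card(self) -> str:
        return f"As a {self.role}, I want {self.want}, so that {self.benefit}."

story = UserStory(
    role="registered customer",
    want="to save items to a wishlist",
    benefit="I can purchase them later",
    acceptance_criteria=["Saved items persist between sessions"],
)
print(story.card())
```

Keeping the card, the conversation notes, and the confirmation criteria together in one record mirrors how the three Cs travel together on a physical story card.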

 

Target Student:

  • Designed specifically for Agile project team members, product owners, project leaders, and business analysts, as well as anyone wanting to understand the Agile Framework.


 

Course Overview:

Through an introduction to Docker, Kubernetes, and the Red Hat OpenShift Container Platform, this training course helps you understand a key tenet of the DevOps and DevSecOps movement: continuous integration and continuous deployment (CI/CD). You will learn how a CI/CD pipeline is understood and implemented in an open architecture. Containers have become a key technology for configuring and deploying applications and microservices. Kubernetes is a container orchestration platform that provides foundational services in Red Hat OpenShift Container Platform, allowing enterprises to manage container deployments and scale their applications using Kubernetes.

This training course provides an overview of the DoD Enterprise DevSecOps Platform (DSOP) Reference Design, its current state, and its ties to DoD Cloud Platform One (P1). Workflows of the DoD Iron Bank container repository are introduced, along with an overview of the DoD pipeline as represented in Big Bang. Continuous authorization (cATO) via Party Bus within the NIST Risk Management Framework (RMF) is also presented. You will become aware of the Platform One (P1) integrations and their relationship to Docker, Kubernetes, Istio (Red Hat OpenShift Service Mesh), and the Red Hat OpenShift Platform.

In addition to gaining an understanding of these tools, you will build core administration skills through the installation, configuration, and management of an OpenShift cluster and containerized applications.
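To give a flavor of the orchestration concepts covered, a Kubernetes Deployment is described declaratively and scaled by changing a single field. The sketch below builds a minimal Deployment manifest as a plain Python dictionary of the kind you would apply with `kubectl` or `oc`; the application name, image, and replica count are illustrative only.

```python
# Build a minimal Kubernetes Deployment manifest (apps/v1) as a plain dict.
# The app name, image, and replica count here are illustrative assumptions.
def make_deployment(name: str, image: str, replicas: int = 2) -> dict:
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,                      # desired pod count
            "selector": {"matchLabels": labels},       # which pods this manages
            "template": {                              # pod template
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [
                        {"name": name,
                         "image": image,
                         "ports": [{"containerPort": 8080}]}
                    ]
                },
            },
        },
    }

manifest = make_deployment("hello-web", "quay.io/example/hello:latest", replicas=3)
print(manifest["spec"]["replicas"])  # scaling up or down is a one-field change
```

Serialized to YAML, this is exactly the kind of resource the cluster-administration labs create, inspect, and scale.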

Course Objectives:

  • Learn about Containers, Docker, Kubernetes, and OpenShift architecture
  • Overview DoD Enterprise DevSecOps Platform (DSOP) Reference Design and DoD Cloud Platform One (P1)
  • Tie together awareness of various DoD Cloud offerings and their relationships
  • Create containerized services
  • Manage containers and container images
  • Deploy multi-container applications
  • Install an OpenShift cluster
  • Configure and manage masters and nodes
  • Secure OpenShift
  • Control access to resources on OpenShift
  • Monitor and collect metrics on OpenShift
  • Deploy applications on OpenShift using source-to-image (S2I)
  • Manage storage on OpenShift

Course Outline:

  • Getting started with container technology
  • Creating containerized services
  • Managing containers
  • Managing container images
  • Creating custom container images
  • Deploying containerized applications on OpenShift
  • Deploying multi-container applications
  • Troubleshooting containerized applications
  • Comprehensive review of containers, Kubernetes, and Red Hat OpenShift
  • Introducing Red Hat OpenShift Container Platform
  • Installing OpenShift Container Platform
  • Describing and exploring OpenShift networking concepts
  • Executing commands
  • Controlling access to OpenShift resources
  • Allocating persistent storage
  • Managing application deployments
  • Installing and configuring the metrics subsystem
  • Managing and monitoring OpenShift Container Platform

Dates/Locations:

No Events

Duration: 5 Days

Prerequisites:

  • Ability to use a Linux® terminal session and issue operating system commands
  • Good foundation in Linux
  • Experience with web application architectures and their corresponding technologies

Target Audience:

  • Developers who wish to containerize software applications
  • Administrators who are new to container technology and container orchestration
  • Architects who are considering using container technologies in software architectures
  • System administrators
  • System architects
  • Architects and developers who want to install and configure OpenShift Container Platform
  • Those working in the field of DevSecOps supporting DoD Platform One (P1) and other implementations


 

Course Overview:

In this course, students will design various data platform technologies into solutions that are in line with business and technical requirements, covering on-premises, cloud, and hybrid data scenarios that incorporate relational, NoSQL, or data warehouse data. They will learn how to design process architectures using a range of technologies for both streaming and batch data, and will explore how to design data security, including data access, data policies, and standards. They will also design Azure data solutions, including the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to deploy cloud technologies and to effectively manage security in cloud environments.

Attendees to DP-201: Designing an Azure Data Solution will receive TechNow approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 3 days

Course Outline:

  • Data Platform Architecture Considerations
  • Azure Batch Processing Reference Architectures
  • Azure Real-Time Reference Architectures
  • Data Platform Security Design Considerations
  • Designing for Resiliency and Scale
  • Design for Efficiency and Operations

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:

  • AZ-900: Microsoft Azure Fundamentals
  • DP-200: Implementing an Azure Data Solution


 

Course Overview:

In this course, students will implement various data platform technologies into solutions that are in line with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.

Students will also explore how to implement data security, including authentication, authorization, data policies, and standards; define and implement data solution monitoring for both data storage and data processing activities; and manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.
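The batch versus streaming distinction mentioned above can be illustrated in a few lines of plain Python, independent of any Azure service (the sample data and function name are illustrative): a batch job processes the complete dataset in one pass, while a streaming job maintains running state and emits an updated result per event.

```python
# Illustrative contrast between batch and streaming aggregation.
readings = [3, 1, 4, 1, 5, 9]  # stand-in for sensor events or log records

# Batch: the whole dataset is available up front, processed in one pass.
batch_total = sum(readings)

# Streaming: events arrive one at a time; state is updated incrementally.
def streaming_totals(events):
    total = 0
    for event in events:
        total += event
        yield total  # emit an updated aggregate after each event

running = list(streaming_totals(readings))
assert running[-1] == batch_total  # both approaches converge on the same answer
```

Services such as Stream Analytics apply this incremental-state model at scale, which is why the course treats streaming and batch pipelines as distinct design problems.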

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to deploy cloud technologies and to effectively manage security in cloud environments.

Attendees to DP-200: Implementing an Azure Data Solution will receive TechNow approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 4 days

Course Outline:

  • Azure for the Data Engineer
  • Working with Data Storage
  • Enabling Team Based Data Science with Azure Databricks
  • Building Globally Distributed Databases with Cosmos DB
  • Working with Relational Data Stores in the Cloud
  • Performing Real-Time Analytics with Stream Analytics
  • Orchestrating Data Movement with Azure Data Factory
  • Securing Azure Data Platforms
  • Monitoring and Troubleshooting Data Storage and Processing

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following course:

  • AZ-900: Microsoft Azure Fundamentals
