Course Overview:

This course provides the knowledge and skills to design and implement DevOps processes and practices. Students will learn how to plan for DevOps, use source control, scale Git for an enterprise, consolidate artifacts, design a dependency management strategy, manage secrets, implement continuous integration, implement a container build strategy, design a release strategy, set up a release management workflow, implement a deployment pattern, and optimize feedback mechanisms.
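
For instance, the continuous integration portion of the course centers on Azure Pipelines. The sketch below is only a hedged illustration of that idea, not course material: it queues a CI build through the Azure DevOps REST API, assuming an existing organization, project, personal access token, and build definition (the names and the definition id 12 are hypothetical placeholders).

    # Minimal sketch: queue an Azure Pipelines (CI) build via the Azure DevOps REST API.
    # "myorg", "myproject", definition id 12, and the token are hypothetical placeholders.
    import requests

    ORG, PROJECT, DEFINITION_ID = "myorg", "myproject", 12
    PAT = "…personal access token…"

    resp = requests.post(
        f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds?api-version=7.0",
        json={"definition": {"id": DEFINITION_ID}},
        auth=("", PAT),  # Azure DevOps accepts a PAT as the basic-auth password
    )
    resp.raise_for_status()
    print("Queued build", resp.json()["id"])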

TechNow has worked on worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to effectively manage security in the cloud environment.

Attendees of AZ-400: Microsoft Azure DevOps Solutions will receive TechNow-approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 5 days

Course Outline:

  • Planning for DevOps
  • Getting started with Source Control
  • Scaling Git for enterprise DevOps
  • Consolidating Artifacts & Designing a Dependency Management Strategy
  • Implementing Continuous Integration with Azure Pipelines
  • Managing Application Config and Secrets
  • Managing Code Quality and Security Policies
  • Implementing a Container Build Strategy
  • Manage Artifact versioning, security & compliance
  • Design a Release Strategy
  • Set up a Release Management Workflow
  • Implement an appropriate deployment pattern
  • Implement process for routing system feedback to development teams
  • Implement a mobile DevOps strategy
  • Infrastructure and Configuration Azure Tools
  • Azure Deployment Models and Services
  • Create and Manage Kubernetes Service Infrastructure
  • Third Party Infrastructure as Code Tools available with Azure
  • Implement Compliance and Security in your Infrastructure
  • Recommend and design system feedback mechanisms
  • Optimize feedback mechanisms

Prerequisites:

      • AZ-900: Microsoft Azure Fundamentals
      • Fundamental knowledge about Azure, version control, Agile software development, and core software development principles. It would be helpful to have experience in an organization that delivers software.

Course Overview:

In this course, the students will implement various data platform technologies into solutions that are in line with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.
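
As a small illustration of the NoSQL side of those scenarios, the sketch below uses the azure-cosmos Python SDK to create a database and a container and to write and query one item. It is a hedged example rather than course material; the account endpoint, key, and the "retail"/"orders" names are hypothetical.

    # Minimal Cosmos DB sketch, assuming an existing Cosmos DB account;
    # the endpoint, key, and database/container names are hypothetical.
    from azure.cosmos import CosmosClient, PartitionKey

    ENDPOINT = "https://my-account.documents.azure.com:443/"   # hypothetical
    KEY = "…account key…"                                      # hypothetical

    client = CosmosClient(ENDPOINT, credential=KEY)
    db = client.create_database_if_not_exists(id="retail")
    orders = db.create_container_if_not_exists(
        id="orders",
        partition_key=PartitionKey(path="/customerId"),
        offer_throughput=400,
    )

    # Write one document, then read it back with a parameterized SQL query
    orders.upsert_item({"id": "1", "customerId": "c-42", "total": 19.99})
    for item in orders.query_items(
        query="SELECT * FROM c WHERE c.customerId = @cid",
        parameters=[{"name": "@cid", "value": "c-42"}],
        enable_cross_partition_query=True,
    ):
        print(item["id"], item["total"])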

The students will also explore how to implement data security, including authentication, authorization, data policies, and standards. They will also define and implement data solution monitoring for both the data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked on worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to effectively manage security in the cloud environment.

Attendees of DP-200: Implementing an Azure Data Solution will receive TechNow-approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 4 days

Course Outline:

  • Azure for the Data Engineer
  • Working with Data Storage
  • Enabling Team Based Data Science with Azure Databricks
  • Building Globally Distributed Databases with Cosmos DB
  • Working with Relational Data Stores in the Cloud
  • Performing Real-Time Analytics with Stream Analytics
  • Orchestrating Data Movement with Azure Data Factory
  • Securing Azure Data Platforms
  • Monitoring and Troubleshooting Data Storage and Processing

Prerequisites:

      In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following course:
      • AZ-900: Microsoft Azure Fundamentals

Course Overview:

In this course, the students will design solutions built on various data platform technologies that are in line with business and technical requirements. This can include on-premises, cloud, and hybrid data scenarios which incorporate relational, NoSQL, or data warehouse data. They will also learn how to design process architectures using a range of technologies for both streaming and batch data. The students will also explore how to design data security, including data access, data policies, and standards. Finally, they will design Azure data solutions, including the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked on worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to effectively manage security in the cloud environment.

Attendees of DP-201: Designing an Azure Data Solution will receive TechNow-approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 3 days

Course Outline:

  • Data Platform Architecture Considerations
  • Azure Batch Processing Reference Architectures
  • Azure Real-Time Reference Architectures
  • Data Platform Security Design Considerations
  • Designing for Resiliency and Scale
  • Design for Efficiency and Operations

Prerequisites:

      In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
      • AZ-900: Microsoft Azure Fundamentals
      • DP-200: Implementing an Azure Data Solution

Course Overview:

Through an introduction to Docker, Kubernetes, and Red Hat OpenShift Platform, this training course helps you understand one of the key tenets of the DevOps and DevSecOps Platform (DSOP) movement: continuous integration and continuous deployment (CI/CD). Students learn how a CI/CD pipeline is implemented in an open architecture. Containers have become a key technology for the configuration and deployment of applications and microservices. Kubernetes is a container orchestration platform that provides foundational services in Red Hat OpenShift Container Platform, allowing enterprises to manage container deployments and scale their applications.
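
To make the orchestration point concrete, the sketch below uses the official Kubernetes Python client to list the deployments in a namespace and scale one of them. It is a minimal, hedged example that assumes a reachable cluster and a local kubeconfig; the "demo" namespace and "webapp" deployment name are hypothetical.

    # Minimal sketch: list and scale deployments with the Kubernetes Python client.
    # Assumes ~/.kube/config points at a reachable cluster; names are hypothetical.
    from kubernetes import client, config

    config.load_kube_config()            # read credentials from ~/.kube/config
    apps = client.AppsV1Api()

    # List current deployments in the "demo" namespace
    for dep in apps.list_namespaced_deployment(namespace="demo").items:
        print(dep.metadata.name, dep.spec.replicas)

    # Scale the hypothetical "webapp" deployment to 3 replicas
    apps.patch_namespaced_deployment_scale(
        name="webapp",
        namespace="demo",
        body={"spec": {"replicas": 3}},
    )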

This training course provides an overview of the DoD Enterprise DevSecOps Platform (DSOP) Reference Design, its current state, and its ties to DoD Cloud Platform One (P1). Workflows of the DoD Iron Bank container repository are introduced, along with an overview of the DoD Pipeline as represented in Big Bang. Continuous authorization (cATO) via Party Bus within the NIST RMF is presented. You will become aware of the Platform One (P1) integrations and their relationship to Docker, Kubernetes, Istio (Red Hat OpenShift Service Mesh), and Red Hat OpenShift Platform.

In addition to gaining an understanding of these tools, you will build core administration skills through the installation, configuration, and management of an OpenShift cluster and containerized applications.

Course Objectives:

  • Learn about Containers, Docker, Kubernetes, and OpenShift architecture
  • Get an overview of the DoD Enterprise DevSecOps Platform (DSOP) Reference Design and DoD Cloud Platform One (P1)
  • Tie together awareness of various DoD Cloud offerings and their relationships
  • Create containerized services
  • Manage containers and container images
  • Deploy multi-container applications
  • Install an OpenShift cluster
  • Configure and manage masters and nodes
  • Secure OpenShift
  • Control access to resources on OpenShift
  • Monitor and collect metrics on OpenShift
  • Deploy applications on OpenShift using source-to-image (S2I)
  • Manage storage on OpenShift

Course Outline:

  • Getting started with container technology
  • Creating containerized services
  • Managing containers
  • Managing container images
  • Creating custom container images
  • Deploying containerized applications on OpenShift
  • Deploying multi-container applications
  • Troubleshooting containerized applications
  • Comprehensive review of Introduction to Containers, Kubernetes, and Red Hat OpenShift
  • Introducing Red Hat OpenShift Container Platform
  • Installing OpenShift Container Platform
  • Describing and exploring OpenShift networking concepts
  • Executing commands
  • Controlling access to OpenShift resources
  • Allocating persistent storage
  • Managing application deployments
  • Installing and configuring the metrics subsystem
  • Managing and monitoring OpenShift Container Platform

Dates/Locations:

No Events

Duration: 5 Days

Prerequisites:

  • Ability to use a Linux® terminal session and issue operating system commands
  • Good foundation in Linux
  • Experience with web application architectures and their corresponding technologies

Target Audience:

  • Developers who wish to containerize software applications
  • Administrators who are new to container technology and container orchestration
  • Architects who are considering using container technologies in software architectures
  • System administrators
  • System architects
  • Architects and developers who want to install and configure OpenShift Container Platform
  • Those working in the field of DevSecOps supporting DoD Platform One (P1) and other implementations

CCFE Core Competencies

  • Procedures and Legal Issues
  • Computer Fundamentals
  • Partitioning Schemes
  • Data Recovery
  • Windows File Systems
  • Windows Artifacts
  • Report Writing (Presentation of Findings)
  • Procedures and Legal Issues
  1. Knowledge of search and seizure and the rules of evidence as applicable to computer forensics.
  2. Ability to explain the on-scene actions taken for evidence preservation.
  3. Ability to maintain and document an environment conducive to computer forensics.
  • Computer Fundamentals
  1. Understanding of the BIOS.
  2. Knowledge of computer hardware.
  3. Understanding of numbering systems (binary, hexadecimal, bits, bytes).
  4. Knowledge of sectors, clusters, and files.
  5. Understanding of logical and physical files.
  6. Understanding of logical and physical drives.
  • Partitioning Schemes
  1. Identification of current partitioning schemes.
  2. Understanding of primary and extended partitions.
  3. Knowledge of partitioning structures and the systems that use them.
  4. Knowledge of the GUID Partition Table (GPT) and its application.
  • Windows File Systems
  1. Understanding of file concepts.
  2. Understanding of FAT tables, root directories, and subdirectories, along with how they store data.
  3. Identification, examination, and analysis of the NTFS Master File Table (MFT).
  4. Understanding of the $MFT structure and how it stores data.
  5. Understanding of the Standard Information, File Name, and Data attributes.
  • Data Recovery
  1. Ability to validate forensic hardware, software, and examination procedures.
  2. Understanding of email headers.
  3. Ability to generate and validate forensically sterile media.
  4. Ability to generate and validate a forensic image of media.
  5. Understand hashing and hash sets (see the sketch following this outline).
  6. Understand file headers.
  7. Ability to extract file metadata from common file types.
  8. Understanding of file fragmentation.
  9. Ability to extract component files from compound files.
  10. Knowledge of encrypted files and strategies for recovery.
  11. Knowledge of Internet browser artifacts.
  12. Knowledge of search strategies for examining electronic media.
  • Windows Artifacts
  1. Understanding of the purpose and structure of the component files that make up the Windows registry.
  2. Ability to identify and extract relevant data from a dead (offline) registry.
  3. Understand the importance of restore points and the Volume Shadow Copy Service.
  4. Knowledge of the locations of common Windows artifacts.
  5. Ability to analyze the Recycle Bin.
  6. Ability to analyze link files.
  7. Ability to analyze logs.
  8. Ability to extract and view Windows event logs.
  9. Ability to locate, mount, and examine VHD files.
  10. Understand the Windows swap and hibernation files.
  • Report Writing (Presentation of findings)
  1. Ability to draw sound conclusions based on examination observations.
  2. Ability to report findings using industry-standard, technically accurate terminology.
  3. Ability to explain complex matters in simple terms so that non-technical people can understand them clearly.
  4. Ability to consider legal boundaries when undertaking a forensic examination.
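
The brief sketch below ties a few of the Data Recovery items above to working code: hashing an evidence file, checking its header (magic bytes) against known signatures, and reading basic file system metadata. It is a minimal, hedged illustration using only the Python standard library, not CCFE courseware; the evidence file path is hypothetical.

    # Minimal forensic triage sketch: hashing, file-header check, and metadata.
    # The path below is a hypothetical evidence file.
    import hashlib
    import os
    from datetime import datetime, timezone

    EVIDENCE = "evidence/photo001.jpg"

    # 1. Hashing: compute MD5 and SHA-256 to verify integrity or match hash sets
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(EVIDENCE, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            md5.update(chunk)
            sha256.update(chunk)
    print("MD5:   ", md5.hexdigest())
    print("SHA256:", sha256.hexdigest())

    # 2. File headers: compare the first bytes against a few known signatures
    SIGNATURES = {b"\xff\xd8\xff": "JPEG", b"\x89PNG\r\n\x1a\n": "PNG", b"%PDF": "PDF"}
    with open(EVIDENCE, "rb") as f:
        header = f.read(8)
    matches = [name for magic, name in SIGNATURES.items() if header.startswith(magic)]
    print("Header suggests:", matches or "unknown type")

    # 3. File system metadata: size and last-modified time as reported by the OS
    st = os.stat(EVIDENCE)
    print("Size:", st.st_size, "bytes")
    print("Modified:", datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat())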