Course Overview:

This course provides the knowledge and skills to design and implement DevOps processes and practices. Students will learn how to plan for DevOps, use source control, scale Git for an enterprise, consolidate artifacts, design a dependency management strategy, manage secrets, implement continuous integration, implement a container build strategy, design a release strategy, set up a release management workflow, implement a deployment pattern, and optimize feedback mechanisms.
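
One topic listed above, managing secrets, is commonly illustrated in Azure with Key Vault. Below is a minimal Python sketch (not part of the official courseware) assuming the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are hypothetical placeholders.

    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    # Hypothetical vault URL and secret name used purely for illustration.
    VAULT_URL = "https://example-vault.vault.azure.net"

    # DefaultAzureCredential resolves a managed identity, environment variables,
    # or a developer sign-in, so no credentials are hard-coded in the pipeline.
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=VAULT_URL, credential=credential)

    # Fetch the secret at run time instead of storing it in source control.
    secret = client.get_secret("database-connection-string")
    print(secret.name, "retrieved")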

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to manage security effectively in the cloud environment.

Attendees of AZ-400: Microsoft Azure DevOps Solutions will receive TechNow-approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 5 days

Course Outline:

  • Planning for DevOps
  • Getting started with Source Control
  • Scaling Git for enterprise DevOps
  • Consolidating Artifacts & Designing a Dependency Management Strategy
  • Implementing Continuous Integration with Azure Pipelines
  • Managing Application Config and Secrets
  • Managing Code Quality and Security Policies
  • Implementing a Container Build Strategy
  • Manage Artifact versioning, security & compliance
  • Design a Release Strategy
  • Set up a Release Management Workflow
  • Implement an appropriate deployment pattern
  • Implement process for routing system feedback to development teams
  • Implement a mobile DevOps strategy
  • Infrastructure and Configuration Azure Tools
  • Azure Deployment Models and Services
  • Create and Manage Kubernetes Service Infrastructure (see the sketch after this outline)
  • Third Party Infrastructure as Code Tools available with Azure
  • Implement Compliance and Security in your Infrastructure
  • Recommend and design system feedback mechanisms
  • Optimize feedback mechanisms
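
For the Kubernetes Service item in the outline, the sketch below uses the official Python kubernetes client to inspect a cluster. It assumes a local kubeconfig (for example, one written by az aks get-credentials) and a hypothetical default namespace; it illustrates programmatic access to the cluster rather than any specific lab from the course.

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (e.g., written by `az aks get-credentials`).
    config.load_kube_config()

    apps = client.AppsV1Api()

    # List the Deployments in the default namespace and report their readiness.
    for dep in apps.list_namespaced_deployment(namespace="default").items:
        print(dep.metadata.name, dep.status.ready_replicas, "replicas ready")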

Prerequisites:

      • AZ-900: Microsoft Azure Fundamentals
      • Fundamental knowledge about Azure, version control, Agile software development, and core software development principles. It would be helpful to have experience in an organization that delivers software.


Course Overview:

In this course, the students will design various data platform technologies into solutions that are in line with business and technical requirements. This can include on-premises, cloud, and hybrid data scenarios which incorporate relational, NoSQL, or Data Warehouse data. They will also learn how to design process architectures using a range of technologies for both streaming and batch data. The students will also explore how to design data security, including data access, data policies, and standards. They will also design Azure data solutions, which includes the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.
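
As a small illustration of the streaming side of these architectures, the Python sketch below reads events from Azure Event Hubs with the azure-eventhub package. The connection string and hub name are hypothetical placeholders, and the snippet only sketches the ingestion step of a real-time design, not a reference architecture from the course.

    from azure.eventhub import EventHubConsumerClient

    # Hypothetical connection values used purely for illustration.
    CONN_STR = "<event-hubs-namespace-connection-string>"
    EVENTHUB_NAME = "telemetry"

    def on_event(partition_context, event):
        # Each event is one message from the stream; a real design would hand it
        # to Stream Analytics, Databricks, or another processing layer.
        print(partition_context.partition_id, event.body_as_str())

    consumer = EventHubConsumerClient.from_connection_string(
        CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
    )

    with consumer:
        # starting_position="-1" reads each partition from the beginning.
        consumer.receive(on_event=on_event, starting_position="-1")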

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to manage security effectively in the cloud environment.

Attendees of DP-201: Designing an Azure Data Solution will receive TechNow-approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 3 days

Course Outline:

  • Data Platform Architecture Considerations
  • Azure Batch Processing Reference Architectures
  • Azure Real-Time Reference Architectures
  • Data Platform Security Design Considerations
  • Designing for Resiliency and Scale
  • Design for Efficiency and Operations

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
      • AZ-900: Microsoft Azure Fundamentals
      • DP-200: Implementing an Azure Data Solution


Course Overview:

In this course, the students will implement various data platform technologies into solutions that are in line with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.

The students will also explore how to implement data security, including authentication, authorization, data policies, and standards. They will also define and implement data solution monitoring for both the data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions which includes the optimization and disaster recovery of big data, batch processing, and streaming data solutions.
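
As a concrete example of the NoSQL side of the course, the Python sketch below writes a document with the azure-cosmos SDK. The account endpoint, key, database, container, and document fields are hypothetical, and the snippet illustrates only the basic write path, not a lab from the courseware.

    from azure.cosmos import CosmosClient

    # Hypothetical endpoint and key; real solutions should pull credentials from
    # Key Vault or use Azure AD authentication rather than hard-coding them.
    ENDPOINT = "https://example-account.documents.azure.com:443/"
    KEY = "<account-key>"

    client = CosmosClient(ENDPOINT, credential=KEY)
    container = client.get_database_client("SalesDb").get_container_client("Orders")

    # Upsert a document; "id" and the container's partition key field are required.
    container.upsert_item({"id": "order-1001", "customerId": "c-42", "total": 129.95})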

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to manage security effectively in the cloud environment.

Attendees of DP-200: Implementing an Azure Data Solution will receive TechNow-approved course materials and expert instruction.

Dates/Locations:

No Events

Course Duration: 4 days

Course Outline:

  • Azure for the Data Engineer
  • Working with Data Storage (see the sketch after this outline)
  • Enabling Team Based Data Science with Azure Databricks
  • Building Globally Distributed Databases with Cosmos DB
  • Working with Relational Data Stores in the Cloud
  • Performing Real-Time Analytics with Stream Analytics
  • Orchestrating Data Movement with Azure Data Factory
  • Securing Azure Data Platforms
  • Monitoring and Troubleshooting Data Storage and Processing
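
For the data storage module referenced in the outline, the sketch below uploads a file to Blob Storage with the azure-storage-blob package. The connection string, container, and blob path are hypothetical; the snippet shows the basic upload pattern rather than the course's own labs.

    from azure.storage.blob import BlobServiceClient

    # Hypothetical connection string and container; prefer managed identities or
    # Key Vault to embedding credentials in real solutions.
    service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
    container = service.get_container_client("raw-data")

    # Upload a small CSV into the container for downstream batch processing.
    with open("sales.csv", "rb") as data:
        container.upload_blob(name="2024/01/sales.csv", data=data, overwrite=True)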

Prerequisites:

In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
      • AZ-900: Microsoft Azure Fundamentals


Course Overview:

 

Install, configure, and manage Red Hat JBoss Enterprise Application Platform

Red Hat JBoss® Application Administration I teaches you the best practices for installing and configuring Red Hat JBoss Enterprise Application Platform 6. Through hands-on labs, learn the essential, real-world tasks that a system administrator needs to know to effectively deploy and manage applications on JBoss Enterprise Application Platform.
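
Much of the administration covered in the course is performed through the management CLI (jboss-cli.sh). As a small illustration, the Python sketch below shells out to the CLI to query a running standalone server's state; the install path is a hypothetical example, and the snippet only shows the idea of scripting the CLI, not a lab from the course.

    import subprocess

    # Hypothetical install location; adjust to the actual JBOSS_HOME on the host.
    JBOSS_CLI = "/opt/jboss-eap-6/bin/jboss-cli.sh"

    # Ask the running standalone server for its state over the management interface.
    result = subprocess.run(
        [JBOSS_CLI, "--connect", "--command=:read-attribute(name=server-state)"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # the CLI reports the operation outcome and the server state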

Attendees of RH-345: Red Hat JBoss Application Administration I will receive TechNow-approved course materials and expert instruction.

Dates/Locations:

Course Duration: 5 days

Course Objectives:

  • Overview of JBoss Enterprise Application Platform
  • Configure JBoss Enterprise Application Platform in standalone mode
  • Configure JBoss Enterprise Application Platform in domain mode
  • Configure servers
  • Use the CLI tool
  • The datasource subsystem
  • The logging subsystem
  • The messaging subsystem
  • The security subsystem
  • JVM configuration
  • Migrating applications to JBoss Enterprise Application Platform 6
  • The web subsystem

Prerequisites:

  • Linux System Administration

CCFE Core Competencies

  • Procedures and Legal Issues
  • Computer Fundamentals
  • Partitioning Schemes
  • Data Recovery
  • Windows File Systems
  • Windows Artifacts
  • Report Writing (Presentation of Findings)
  • Procedures and Legal Issues
  1. Knowledge of search and seizure and the rules of evidence as applicable to computer forensics.
  2. Ability to explain the on-scene actions taken for evidence preservation.
  3. Ability to maintain and document an environment suitable for conducting computer forensic examinations.
  • Computer Fundamentals
  1. Understanding of the BIOS.
  2. Understanding of computer hardware.
  3. Understanding of numbering systems (binary, hexadecimal, bits, bytes).
  4. Knowledge of sectors, clusters, and files.
  5. Understanding of logical and physical files.
  6. Understanding of logical and physical drives.
  • Partitioning schemes
  1. Identification of current partitioning schemes.
  2. Understanding of primary and extended partitions.
  3. Knowledge of partitioning scheme structures and the systems that use them.
  4. Knowledge of GUIDs and their application in partitioning.
  • Windows file system
  1. Understanding of file concepts.
  2. Understanding of FAT tables, root directories, and subdirectories, and how they store data.
  3. Identification, examination, and analysis of the NTFS master file table.
  4. Understanding of the $MFT structure and how MFT records store data.
  5. Understanding of the Standard Information, File Name, and Data attributes.
  • Data Recovery
  1. Ability to validate forensic hardware, software, and examination procedures.
  2. Understanding of email headers.
  3. Ability to generate and validate forensically sterile media.
  4. Ability to generate and validate a forensic image of media.
  5. Understand hashing and hash sets (see the sketch following this list).
  6. Understand file headers.
  7. Ability to extract file metadata from common file types.
  8. Understanding of file fragmentation.
  9. Ability to extract component files from compound files.
  10. Knowledge of encrypted files and strategies for recovery.
  11. Knowledge of Internet browser artifacts.
  12. Knowledge of search strategies for examining electronic media.
  • Windows Artifacts
  1. Understanding of the purpose and structure of the component files that make up the Windows registry.
  2. Ability to identify and extract relevant data from an offline ("dead") registry.
  3. Understand the importance of restore points and the Volume Shadow Copy Service.
  4. Knowledge of the locations of common Windows artifacts.
  5. Ability to analyze the Recycle Bin.
  6. Ability to analyze link (.lnk) files.
  7. Ability to analyze Windows logs.
  8. Ability to extract and view Windows event logs.
  9. Ability to locate, mount and examine VHD files.
  10. Understand the Windows swap and hibernation files.
  • Report Writing (Presentation of findings)
  1. Ability to draw well-supported conclusions based on examination observations.
  2. Ability to report findings using industry-standard, technically accurate terminology.
  3. Ability to explain complex matters in simple terms so that non-technical people can understand them clearly.
  4. Ability to consider legal boundaries when undertaking a forensic examination.
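
Several of the Data Recovery items above (hashing, hash sets, and validating a forensic image) reduce to computing and comparing cryptographic digests. The Python sketch below uses only the standard-library hashlib module; the file paths are hypothetical, and it illustrates the verification idea rather than any prescribed examination procedure.

    import hashlib

    def sha256_of(path, chunk_size=1024 * 1024):
        """Compute the SHA-256 digest of a potentially large file in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical case paths: hash the acquired image and its working copy,
    # then confirm the copy is a bit-for-bit match before examination.
    image_hash = sha256_of("/cases/2024-001/drive.dd")
    copy_hash = sha256_of("/cases/2024-001/drive-copy.dd")
    print("verified" if image_hash == copy_hash else "hash mismatch")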