Course Overview:

In this course, the students will implement various data platform technologies into solutions that are in line with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.

The students will also explore how to implement data security, including authentication, authorization, data policies, and standards. They will also define and implement data solution monitoring for both the data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to effectively manage security in the cloud environment.

Attendees to DP-200: Implementing an Azure Data Solution will receive TechNow approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 4 days

Course Outline:

  • Azure for the Data Engineer
  • Working with Data Storage
  • Enabling Team Based Data Science with Azure Databricks
  • Building Globally Distributed Databases with Cosmos DB
  • Working with Relational Data Stores in the Cloud
  • Performing Real-Time Analytics with Stream Analytics
  • Orchestrating Data Movement with Azure Data Factory
  • Securing Azure Data Platforms
  • Monitoring and Troubleshooting Data Storage and Processing

Prerequisites:

      • In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following course:
      • AZ-900: Microsoft Azure Fundamentals

Course Overview:

In this course, the students will design various data platform technologies into solutions that are in line with business and technical requirements. This can include on-premises, cloud, and hybrid data scenarios which incorporate relational, NoSQL, or Data Warehouse data. They will also learn how to design process architectures using a range of technologies for both streaming and batch data.

The students will also explore how to design data security, including data access, data policies, and standards. They will also design Azure data solutions, including the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.

TechNow has worked with worldwide enterprise infrastructures for over 20 years and has developed demos and labs that exemplify the techniques required to demonstrate cloud technologies and to effectively manage security in the cloud environment.

Attendees to DP-201: Designing an Azure Data Solution will receive TechNow approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 3 days

Course Outline:

  • Data Platform Architecture Considerations
  • Azure Batch Processing Reference Architectures
  • Azure Real-Time Reference Architectures
  • Data Platform Security Design Considerations
  • Designing for Resiliency and Scale
  • Design for Efficiency and Operations

Prerequisites:

      • In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
      • AZ-900: Microsoft Azure Fundamentals
      • DP-200: Implementing an Azure Data Solution

Course Overview:

The HCISPP is the only certification that combines cybersecurity skills with privacy best practices and techniques. It demonstrates that you have the knowledge and ability to implement, manage, and assess security and privacy controls to protect healthcare organizations using policies and procedures established by the cybersecurity experts at (ISC)2. TechNow's HCISPP Certification Boot Camp is a comprehensive review of healthcare cybersecurity and privacy best practices.

Attendees to TN-8155: HCISPP Certification Preparation Seminar will receive TechNow approved course materials and expert instruction.

Date/Locations:

No Events

Course Duration: 5 days

Course Objectives:

  • Strategically focus your preparation for HCISPP Certification
  • Cover a broad spectrum of topics in the 7 domains of the HCISPP Common Body of Knowledge (CBK)
  • Gain knowledge of the healthcare industry, including third-party relationships and health data management concepts
  • Identify applicable regulations, compliance frameworks, privacy principles and policies to protect information security
  • Develop risk management methodology and identify control assessment procedures

Audience:

  • The HCISPP certification is ideal for security professionals responsible for safeguarding protected health information (PHI). Take this HCISPP training course to prepare to manage and implement security controls for healthcare information. HCISPPs are instrumental to a variety of job functions: Compliance Officer, Information Security Manager, Privacy Officer, Compliance Auditor, Risk Analyst, Medical Records Supervisor, IT Manager, Privacy & Security Consultants, and Health Information Manager.

Course Overview:

Install, configure, and manage Red Hat JBoss Enterprise Application Platform

Red Hat JBoss® Application Administration I teaches you the best practices for installing and configuring Red Hat JBoss Enterprise Application Platform 6. Through hands-on labs, learn the essential, real-world tasks that a system administrator needs to know to effectively deploy and manage applications on JBoss Enterprise Application Platform.

Attendees to RH-345: Red Hat JBoss Application Administration I will receive TechNow approved course materials and expert instruction.

Dates/Locations:

Course Duration: 5 days

Course Objectives:

  • Overview of JBoss Enterprise Application Platform
  • Configure JBoss Enterprise Application Platform in standalone mode
  • Configure JBoss Enterprise Application Platform in domain mode
  • Configure servers
  • Use the CLI tool (see the sketch after this list)
  • The datasource subsystem
  • The logging subsystem
  • The messaging subsystem
  • The security subsystem
  • JVM configuration
  • Migrating applications to JBoss Enterprise Application Platform 6
  • The web subsystem
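
As a taste of the "Use the CLI tool" objective above, the sketch below shows one way to script the EAP 6 management CLI from Python. It is an illustration only, not part of the Red Hat courseware: the installation path, the WAR path, and the assumption that a standalone server is already running with default management settings are placeholders to adjust for your environment.

import subprocess

# Hypothetical install location; substitute your own EAP 6 home.
JBOSS_CLI = "/opt/jboss-eap-6.4/bin/jboss-cli.sh"

def run_cli(command: str) -> str:
    """Run one management CLI command against a running standalone server."""
    result = subprocess.run(
        [JBOSS_CLI, "--connect", "--command=" + command],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    # Confirm the server is up by reading its state attribute.
    print(run_cli(":read-attribute(name=server-state)"))
    # Deploy an application archive (the path is illustrative).
    print(run_cli("deploy /tmp/example-app.war"))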

Prerequisites:

  • Linux System Administration

CCFE Core Competencies

  • Procedures and Legal Issues
  • Computer Fundamentals
  • Partitioning Schemes
  • Data Recovery
  • Windows File Systems
  • Windows Artifacts
  • Report Writing (Presentation of Findings)
  • Procedures and Legal Issues
  1. Knowledge of search and seizure and rules of evidence as applicable to computer forensics.
  2. Ability to explain the on-scene actions taken to preserve evidence.
  3. Ability to maintain and document an environment for conducting computer forensic examinations.
  • Computer Fundamentals
  1. Understanding of the BIOS.
  2. Knowledge of computer hardware.
  3. Understanding of numbering systems (binary, hexadecimal, bits, bytes).
  4. Knowledge of sectors, clusters, and files.
  5. Understanding of logical and physical files.
  6. Understanding of logical and physical drives.
  • Partitioning Schemes
  1. Identification of current partitioning schemes.
  2. Understanding of primary and extended partitions.
  3. Knowledge of partitioning scheme structures and the systems that use them.
  4. Knowledge of the GUID Partition Table (GPT) and its application.
  • Windows File Systems
  1. Understanding of basic file concepts.
  2. Understanding of FAT tables, root directories, and subdirectories, along with how they store data.
  3. Identification, examination, and analysis of the NTFS Master File Table.
  4. Understanding of the $MFT structure and how it stores data.
  5. Understanding of the Standard Information, File Name, and Data attributes.
  • Data Recovery
  1. Ability to validate forensic hardware, software, and examination procedures.
  2. Understanding of email headers.
  3. Ability to generate and validate forensically sterile media.
  4. Ability to generate and validate a forensic image of media.
  5. Understand hashing and hash sets (a worked example follows at the end of this section).
  6. Understand file headers.
  7. Ability to extract file metadata from common file types.
  8. Understanding of file fragmentation.
  9. Ability to extract component files from compound files.
  10. Knowledge of encrypted files and strategies for recovery.
  11. Knowledge of Internet browser artifacts.
  12. Knowledge of search strategies for examining electronic media.
  • Windows Artifacts
  1. Understanding of the purpose and structure of the component files that make up the Windows registry.
  2. Ability to identify and extract relevant data from a dead (offline) registry.
  3. Understanding of the importance of restore points and the Volume Shadow Copy Service.
  4. Knowledge of the locations of common Windows artifacts.
  5. Ability to analyze the Recycle Bin.
  6. Ability to analyze link files.
  7. Ability to analyze logs.
  8. Ability to extract and view Windows logs.
  9. Ability to locate, mount, and examine VHD files.
  10. Understanding of the Windows swap and hibernation files.
  • Report Writing (Presentation of Findings)
  1. Ability to draw sound conclusions based on examination observations.
  2. Ability to report findings using industry-standard, technically accurate terminology.
  3. Ability to explain complex technical findings in simple terms that non-technical people can clearly understand.
  4. Ability to consider legal boundaries when undertaking a forensic examination.
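
The Data Recovery items above call for the ability to generate hashes and validate forensic images (see the note in that list). As a minimal sketch, not taken from the CCFE materials, the Python snippet below hashes a file in chunks with the standard library's hashlib module and compares the result to a previously recorded value; the image path and the recorded hash are placeholders.

import hashlib

def sha256_of_file(path: str, chunk_size: int = 1024 * 1024) -> str:
    """Hash a file in fixed-size chunks so large images need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    image_path = "evidence/disk001.dd"               # placeholder path to a forensic image
    recorded_hash = "replace-with-acquisition-hash"  # placeholder value
    computed = sha256_of_file(image_path)
    print("SHA-256:", computed)
    # Matching the freshly computed hash against the hash recorded at acquisition
    # is the basic check that the image has not changed since it was taken.
    print("verified" if computed == recorded_hash else "HASH MISMATCH")

The same pattern with hashlib.md5 covers legacy MD5 hash sets.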