Merge branch 'main' into docs2
willtome authored Sep 18, 2023
2 parents 2bbdede + 2cd3ec6 commit 8fd40f6
Showing 9 changed files with 302 additions and 3 deletions.
11 changes: 8 additions & 3 deletions cloud/README.md
Original file line number Diff line number Diff line change
@@ -10,7 +10,7 @@
- [Configure Credentials](#configure-credentials)
- [Add Workshop Credential Password](#add-workshop-credential-password)
- [Remove Inventory Variables](#remove-inventory-variables)
- [Getting your Public Key for Create Infra Job](#getting-your-public-key-for-create-infra-job)
- [Getting your Public Key for Create Keypair Job](#getting-your-public-key-for-create-keypair-job)
- [Suggested Usage](#suggested-usage)
- [Known Issues](#known-issues)

@@ -20,8 +20,11 @@ This category of demos shows examples of multi-cloud provisioning and management
### Jobs

- [**Cloud / Create Infra**](create_infra.yml) - Creates a VPC with required routing and firewall rules for provisioning VMs
- [**Cloud / Create Keypair**](aws_key.yml) - Creates a keypair for connecting to EC2 instances
- [**Cloud / Create VM**](create_vm.yml) - Create a VM based on a [blueprint](blueprints/) in the selected cloud provider
- [**Cloud / Destroy VM**](destroy_vm.yml) - Destroy a VM that has been created in a cloud provider. VM must be imported into dynamic inventory to be deleted.
- [**Cloud / Snapshot EC2**](snapshot_ec2.yml) - Snapshot a VM that has been created in a cloud provider. The VM must be imported into dynamic inventory before it can be snapshotted.
- [**Cloud / Restore EC2 from Snapshot**](restore_ec2.yml) - Restore a VM that has been created in a cloud provider. By default, volumes are restored from their latest snapshot. The VM must be imported into dynamic inventory before it can be restored.

### Inventory

@@ -46,7 +49,7 @@ After running the setup job template, there are a few steps required to make the

1) Remove Workshop Inventory variables on the Details page of the inventory. Required until [RFE](https://github.com/ansible/workshops/issues/1597) is complete

### Getting your Public Key for Create Infra Job
### Getting your Public Key for Create Keypair Job

1) Connect to the command line of your Controller server. This is easiest to do by opening the VS Code Web Editor from the landing page where you found the Controller login details.
2) Open a Terminal Window in the VS Code Web Editor.
@@ -56,9 +59,11 @@ After running the setup job template, there are a few steps required to make the

## Suggested Usage

**Cloud / Create Infra** - The Create Infra job builds cloud infrastructure based on the provider definition in the included `demo.cloud` collection.
**Cloud / Create Keypair** - The Create Keypair job creates an EC2 keypair which can be used when creating EC2 instances to enable SSH access.

**Cloud / Create VM** - The Create VM job builds a VM in the given provider based on the included `demo.cloud` collection. VM [blueprints](blueprints/) define variables for each provider that override the defaults in the collection. When creating VMs, it is recommended to follow naming conventions that can be used as host patterns (e.g. VM names `win1`, `win2`, `win3` match the host pattern `win*`).
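The naming advice above can be sketched as a play (hypothetical host names; `ansible.windows.win_ping` assumes Windows targets with WinRM already configured):

```yaml
# Hypothetical sketch: with VMs named win1, win2, win3, a single
# host pattern targets all of them — no static group maintenance needed.
- name: Ping all demo Windows VMs
  hosts: "win*"
  gather_facts: false
  tasks:
    - name: Check connectivity
      ansible.windows.win_ping:
```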

**Cloud / AWS / Patch EC2 Workflow** - Create a VPC and one or more Linux VMs in AWS using the `Cloud / Create Infra` and `Cloud / Create VM` templates. Run the workflow and observe the instance snapshots followed by the patching operation. Optionally, use the survey to force a patch failure in order to demonstrate the restore path. At this time, the workflow does not support patching Windows instances.

## Known Issues
Azure does not work without a custom execution environment that includes the Azure dependencies.
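A custom execution environment can be sketched with `ansible-builder`; the base image below is an assumption for illustration, not a tested build definition:

```yaml
# execution-environment.yml — untested sketch. The base image name is an
# assumption; the azure.azcollection Python requirements must also be
# satisfied (see that collection's documentation).
version: 3
images:
  base_image:
    name: quay.io/ansible/awx-ee:latest
dependencies:
  galaxy:
    collections:
      - azure.azcollection
```

Building with something like `ansible-builder build -t custom-azure-ee .` and registering the resulting image in the controller should allow the Azure demos to run.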
10 changes: 10 additions & 0 deletions cloud/restore_ec2.yml
@@ -0,0 +1,10 @@
---
- name: Restore ec2 instance from snapshot
hosts: "{{ _hosts | default(omit) }}"
gather_facts: false

tasks:
- name: Include restore from snapshot role
ansible.builtin.include_role:
name: "demo.cloud.aws"
tasks_from: restore_vm
146 changes: 146 additions & 0 deletions cloud/setup.yml
@@ -249,6 +249,14 @@ controller_templates:
variable: create_vm_aws_keypair_name
required: true
default: aws-test-key
- question_name: AWS Instance Type (defaults to blueprint value)
type: text
variable: create_vm_aws_instance_size
required: false
- question_name: AWS Image Filter (defaults to blueprint value)
type: text
variable: create_vm_aws_image_filter
required: false

- name: Cloud / AWS / Delete VM
job_type: run
@@ -367,6 +375,91 @@
variable: aws_keypair_owner
required: true

- name: Cloud / AWS / Snapshot EC2
job_type: run
organization: Default
credentials:
- AWS
project: Ansible official demo project
playbook: cloud/snapshot_ec2.yml
inventory: Demo Inventory
notification_templates_started: Telemetry
notification_templates_success: Telemetry
notification_templates_error: Telemetry
survey_enabled: true
survey:
name: ''
description: ''
spec:
- question_name: AWS Region
type: multiplechoice
variable: aws_region
required: true
default: us-east-1
choices:
- us-east-1
- us-east-2
- us-west-1
- us-west-2
- question_name: Specify target hosts
type: text
variable: _hosts
required: false

- name: Cloud / AWS / Restore EC2 from Snapshot
job_type: run
organization: Default
credentials:
- AWS
project: Ansible official demo project
playbook: cloud/restore_ec2.yml
inventory: Demo Inventory
notification_templates_started: Telemetry
notification_templates_success: Telemetry
notification_templates_error: Telemetry
survey_enabled: true
survey:
name: ''
description: ''
spec:
- question_name: AWS Region
type: multiplechoice
variable: aws_region
required: true
default: us-east-1
choices:
- us-east-1
- us-east-2
- us-west-1
- us-west-2
- question_name: Specify target hosts
type: text
variable: _hosts
required: false

- name: "LINUX / Patching"
job_type: check
inventory: "Demo Inventory"
project: "Ansible official demo project"
playbook: "linux/patching.yml"
execution_environment: Default execution environment
notification_templates_started: Telemetry
notification_templates_success: Telemetry
notification_templates_error: Telemetry
use_fact_cache: true
ask_job_type_on_launch: true
credentials:
- "Demo Credential"
survey_enabled: true
survey:
name: ''
description: ''
spec:
- question_name: Server Name or Pattern
type: text
variable: _hosts
required: true

controller_workflows:
- name: Deploy Cloud Stack in AWS
description: A workflow to deploy a cloud stack
@@ -475,3 +568,56 @@
feedback: Failed to create AWS instance
- identifier: Tag Report
unified_job_template: Cloud / AWS / Tags Report

- name: Cloud / AWS / Patch EC2 Workflow
description: A workflow to patch ec2 instances with snapshot and restore on failure.
organization: Default
notification_templates_started: Telemetry
notification_templates_success: Telemetry
notification_templates_error: Telemetry
survey_enabled: true
survey:
name: ''
description: ''
spec:
- question_name: AWS Region
type: multiplechoice
variable: aws_region
required: true
default: us-east-1
choices:
- us-east-1
- us-east-2
- us-west-1
- us-west-2
- question_name: Specify target hosts
type: text
variable: _hosts
required: true
default: os_linux
simplified_workflow_nodes:
- identifier: Project Sync
unified_job_template: Ansible official demo project
success_nodes:
- Take Snapshot
- identifier: Inventory Sync
unified_job_template: AWS Inventory
success_nodes:
- Take Snapshot
- identifier: Take Snapshot
unified_job_template: Cloud / AWS / Snapshot EC2
success_nodes:
- Patch Instance
- identifier: Patch Instance
unified_job_template: LINUX / Patching
job_type: run
failure_nodes:
- Restore from Snapshot
- identifier: Restore from Snapshot
unified_job_template: Cloud / AWS / Restore EC2 from Snapshot
failure_nodes:
- Ticket - Restore Failed
- identifier: Ticket - Restore Failed
unified_job_template: 'SUBMIT FEEDBACK'
extra_data:
feedback: Cloud / AWS / Patch EC2 Workflow | Failed to restore ec2 from snapshot
10 changes: 10 additions & 0 deletions cloud/snapshot_ec2.yml
@@ -0,0 +1,10 @@
---
- name: Snapshot ec2 instance
hosts: "{{ _hosts | default(omit) }}"
gather_facts: false

tasks:
- name: Include snapshot role
ansible.builtin.include_role:
name: "demo.cloud.aws"
tasks_from: snapshot_vm
@@ -21,3 +21,4 @@ aws_env_tag: prod
aws_purpose_tag: ansible_demo
aws_ansiblegroup_tag: cloud
aws_ec2_wait: true
aws_snapshots: {}
@@ -0,0 +1,62 @@
---
- name: AWS | RESTORE VM
delegate_to: localhost
block:
- name: AWS | RESTORE VM | stop vm
amazon.aws.ec2_instance:
region: "{{ aws_region }}"
instance_ids: "{{ instance_id }}"
state: stopped
wait: true

- name: AWS | RESTORE VM | get volumes
register: r_vol_info
amazon.aws.ec2_vol_info:
region: "{{ aws_region }}"
filters:
attachment.instance-id: "{{ instance_id }}"

- name: AWS | RESTORE VM | detach volumes
loop: "{{ r_vol_info.volumes }}"
loop_control:
loop_var: volume
label: "{{ volume.id }}"
amazon.aws.ec2_vol:
region: "{{ aws_region }}"
id: "{{ volume.id }}"
instance: None

- name: AWS | RESTORE VM | attach snapshots from stat
when: inventory_hostname in aws_snapshots
loop: "{{ aws_snapshots[inventory_hostname] }}"
loop_control:
loop_var: snap
label: "{{ snap.snapshot_id }}"
amazon.aws.ec2_vol:
region: "{{ aws_region }}"
instance: "{{ instance_id }}"
snapshot: "{{ snap.snapshot_id }}"
device_name: "{{ snap.device }}"

- name: AWS | RESTORE VM | get all snapshots
when: inventory_hostname not in aws_snapshots
register: r_snapshots
amazon.aws.ec2_snapshot_info:
region: "{{ aws_region }}"
filters:
"tag:Name": "{{ inventory_hostname }}"

- name: AWS | RESTORE VM | create volume from latest snapshot
when: inventory_hostname not in aws_snapshots
amazon.aws.ec2_vol:
region: "{{ aws_region }}"
instance: "{{ instance_id }}"
      snapshot: "{{ (r_snapshots.snapshots | sort(attribute='start_time', reverse=true) | first).snapshot_id }}"
device_name: "/dev/sda1"

- name: AWS | RESTORE VM | start vm
amazon.aws.ec2_instance:
region: "{{ aws_region }}"
instance_ids: "{{ instance_id }}"
state: started
wait: true
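The restore path above picks the newest snapshot for each instance; the selection logic can be sketched in plain Python (hypothetical sample data):

```python
from datetime import datetime, timezone

# Hypothetical ec2_snapshot_info-style results (only relevant fields shown).
snapshots = [
    {"snapshot_id": "snap-aaa", "start_time": datetime(2023, 9, 1, tzinfo=timezone.utc)},
    {"snapshot_id": "snap-ccc", "start_time": datetime(2023, 9, 17, tzinfo=timezone.utc)},
    {"snapshot_id": "snap-bbb", "start_time": datetime(2023, 9, 9, tzinfo=timezone.utc)},
]

# Newest snapshot wins, equivalent to the Jinja2 expression
#   (r_snapshots.snapshots | sort(attribute='start_time', reverse=true) | first)
latest = max(snapshots, key=lambda s: s["start_time"])
print(latest["snapshot_id"])  # snap-ccc
```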
@@ -0,0 +1,42 @@
---
- name: AWS | SNAPSHOT VM
delegate_to: localhost
block:
- name: AWS | SNAPSHOT VM | assert id
ansible.builtin.assert:
that: instance_id is defined
fail_msg: "instance_id is required for snapshot operations"

- name: AWS | SNAPSHOT VM | include vars
ansible.builtin.include_vars:
file: snapshot_vm.yml

- name: AWS | SNAPSHOT VM | get volumes
register: r_vol_info
amazon.aws.ec2_vol_info:
region: "{{ aws_region }}"
filters:
attachment.instance-id: "{{ instance_id }}"

- name: AWS | SNAPSHOT VM | take snapshots
loop: "{{ r_vol_info.volumes }}"
loop_control:
loop_var: volume
label: "{{ volume.id }}"
register: r_snapshots
amazon.aws.ec2_snapshot:
region: "{{ aws_region }}"
volume_id: "{{ volume.id }}"
description: "Snapshot taken by Red Hat Product demos"
snapshot_tags: "{{ tags }}"

- name: AWS | SNAPSHOT VM | format snapshot stat
ansible.builtin.set_fact:
snapshot_stat:
- key: "{{ inventory_hostname }}"
value: "{{ r_snapshots.results | json_query(aws_ec2_snapshot_query) }}"

- name: AWS | SNAPSHOT VM | record snapshot with host key
ansible.builtin.set_stats:
data:
aws_snapshots: "{{ snapshot_stat | items2dict }}"
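The `items2dict` step converts the single-entry key/value list built by `set_fact` into a mapping keyed by hostname; the equivalent transformation in plain Python (hypothetical IDs):

```python
# snapshot_stat as built by set_fact above: a one-element key/value list.
snapshot_stat = [
    {
        "key": "node1",
        "value": [
            {"snapshot_id": "snap-0e98", "vol_id": "vol-0bd5", "device": "/dev/sda1"},
        ],
    }
]

# items2dict turns [{"key": k, "value": v}, ...] into {k: v, ...},
# so set_stats publishes aws_snapshots keyed by inventory hostname.
aws_snapshots = {item["key"]: item["value"] for item in snapshot_stat}
print(aws_snapshots["node1"][0]["device"])  # /dev/sda1
```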
@@ -0,0 +1,10 @@
# aws_ec2_snapshot_query shapes each host's aws_snapshots entry with this model:
# [
# {
# "snapshot_id": "snap-0e981f05704e19ffd",
# "vol_id": "vol-0bd55f313bb7bcdd8",
# "device": "/dev/sda1"
# },
# ...
# ]
aws_ec2_snapshot_query: "[].{snapshot_id: snapshot_id, vol_id: volume.id, device: volume.attachment_set[?instance_id=='{{ instance_id }}'].device | [0]}"
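The JMESPath expression reshapes each registered `ec2_snapshot` result into the model above; an equivalent plain-Python sketch (hypothetical result data; the real query runs through the `json_query` filter):

```python
instance_id = "i-0abc"  # hypothetical

# Shape of registered ec2_snapshot results (only relevant fields shown).
results = [
    {
        "snapshot_id": "snap-0e98",
        "volume": {
            "id": "vol-0bd5",
            "attachment_set": [
                {"instance_id": "i-0abc", "device": "/dev/sda1"},
                {"instance_id": "i-0def", "device": "/dev/xvdf"},
            ],
        },
    }
]

def reshape(results, instance_id):
    """Mirror aws_ec2_snapshot_query: keep the device attached to instance_id."""
    out = []
    for r in results:
        devices = [a["device"] for a in r["volume"]["attachment_set"]
                   if a["instance_id"] == instance_id]
        out.append({
            "snapshot_id": r["snapshot_id"],
            "vol_id": r["volume"]["id"],
            "device": devices[0] if devices else None,
        })
    return out

print(reshape(results, instance_id))
# [{'snapshot_id': 'snap-0e98', 'vol_id': 'vol-0bd5', 'device': '/dev/sda1'}]
```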
13 changes: 13 additions & 0 deletions linux/patching.yml
@@ -29,9 +29,22 @@
- ansible_local.insights.system_id is defined

- name: Deploy report server
when: not ansible_check_mode
delegate_to: "{{ report_server }}"
run_once: true # noqa: run-once[task]
block:
- name: Install firewall dependencies
ansible.builtin.dnf:
name:
- firewalld
- python3-firewall
state: present

- name: Start firewalld
ansible.builtin.service:
name: firewalld
state: started

- name: Build report server
ansible.builtin.include_role:
name: "{{ item }}"
