Using the pyafc Package
To install the latest version of the pyafc package, follow the instructions in the Installing pyafc guide.
pyafc Package Structure
```
pyafc
│   README.md
│   CONTRIBUTING.md
│   ...
│
└───pyafc
│   │   DESIGN.md
│   └───afc
│   └───common
│   └───dss
│   └───fabric
│   │   ...
│
└───docs
│   │   ...
│
└───workflows
    │   ...
```
The pyafc package is a directory containing files and subdirectories. The top-level pyafc directory contains informative files such as the README, licensing information, contribution guidelines, and release notes, along with the code-containing directories pyafc and workflows.
Using the pyafc Package
After installing the package, you'll be able to run and create your own workflows. Example workflows are available in the GitHub repository; you can pull them and use them as templates for custom workflows.
First clone the repository:
```
ubuntu-vm$ git clone https://github.com/aruba/pyafc.git
```
Change into the newly cloned repository and enter the /workflows/ directory to view the example workflow files:
```
ubuntu-vm$ cd pyafc/
ubuntu-vm$ cd workflows/
ubuntu-vm$ pwd
/home/sandbox/pyafc/workflows
ubuntu-vm$ ls
__init__.py              create_fabric.py         create_syslog.py         delete_ntp.py
assign_fabric.py         create_ip_interface.py   create_vlan.py           delete_prefix_list.py
configure_overlay.py     create_leaf_spine.py     create_vrf.py            delete_radius.py
configure_ports.py       create_ntp.py            create_vsx.py            delete_resource_pool.py
configure_underlay.py    create_ospf_area.py      delete_aspath_list.py    delete_route_map.py
create_aspath_list.py    create_ospf_interface.py delete_community_list.py delete_sflow.py
create_community_list.py create_ospf_router.py    delete_dhcp_relay.py     delete_snmp.py
create_dhcp_relay.py     create_prefix_list.py    delete_dns.py            delete_stp.py
create_dns.py            create_radius.py         delete_dss_eg.py         delete_syslog.py
create_dss_eg.py         create_resource_pool.py  delete_dss_qualifier.py  delete_vlan.py
create_dss_policy.py     create_route_map.py      delete_dss_rule.py       delete_vrf.py
create_dss_qualifier.py  create_sflow.py          delete_evpn.py           inputs.yml
create_dss_rule.py       create_snmp.py           delete_fabric.py         run_discovery.py
create_evpn.py           create_stp.py            delete_ip_interface.py   update_bgp.py
```
These workflows use a YAML-formatted input file, inputs.yml, that contains the data the Python scripts use to configure features within the HPE Aruba Networking Fabric Composer. Each workflow requires an Afc object, which serves as the primary connection to the HPE Aruba Networking Fabric Composer.
HPE ANFC REST API
Every module in pyafc is built on the HPE ANFC REST API; consult the API Reference guide to determine the data required for a particular feature or module.
```python
# (C) Copyright 2019-2025 Hewlett Packard Enterprise Development LP.
# Apache License 2.0
import yaml

from pyafc.afc import afc
from pyafc.fabric import fabric

filename = "inputs.yml"
with open(filename, "r") as stream:
    input_data = yaml.load(stream, Loader=yaml.FullLoader)

data = {
    "ip": input_data["afc_ip"],
    "username": input_data["afc_username"],
    "password": input_data["afc_password"],
}

# Create the Afc connection object
afc_instance = afc.Afc(data=data)
```
Once you have your Afc object, you can begin configuring features of the HPE Aruba Networking Fabric Composer using classes and functions. The example create_fabric.py file provided in the repository shows how simple the pyafc package is to use.
Example of data within inputs.yml:
```yaml
# Fabric properties
fabric_name: 'Test-Fabric'
fabric_timezone: "Europe/London"
```
```python
# (C) Copyright 2019-2025 Hewlett Packard Enterprise Development LP.
# Apache License 2.0
import yaml

from pyafc.afc import afc
from pyafc.fabric import fabric

filename = "inputs.yml"
with open(filename, "r") as stream:
    input_data = yaml.load(stream, Loader=yaml.FullLoader)

data = {
    "ip": input_data["afc_ip"],
    "username": input_data["afc_username"],
    "password": input_data["afc_password"],
}

fabric_data = {"timezone": input_data["fabric_timezone"]}
fabric_name = input_data["fabric_name"]

afc_instance = afc.Afc(data=data)

# Create fabric
fabric_instance = fabric.Fabric(afc_instance.client, name=fabric_name, **fabric_data)
message, status, changed = fabric_instance.create_fabric(
    name=fabric_name, **fabric_data
)
print(f"Message: {message}\nStatus: {status}\nChanged: {changed}")
```
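As the example shows, workflow calls such as create_fabric() return a (message, status, changed) tuple. A small hypothetical helper (not part of pyafc, and assuming status is truthy on success) can condense that tuple into a single log-friendly line:

```python
# Hypothetical helper, not part of pyafc: summarizes the
# (message, status, changed) tuple returned by workflow calls such as
# create_fabric(). Assumes `status` is truthy on success.
def summarize(task: str, message: str, status: bool, changed: bool) -> str:
    if not status:
        return f"{task} FAILED: {message}"
    state = "changed" if changed else "no change needed"
    return f"{task} succeeded ({state}): {message}"


print(summarize("Create fabric", "Fabric created", True, True))
```

The same helper works for any workflow in the repository that follows this return convention.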
Creating Custom Workflows
Similar to the provided example, you'll need to create your Afc object to establish a connection to the HPE Aruba Networking Fabric Composer. Alternatively, if you're hesitant to store credentials in plaintext, you can use Python libraries to either gather credentials at runtime or store them in an encrypted file using Vault. Once you have your Afc object, you can begin configuring features of the HPE Aruba Networking Fabric Composer using classes and functions.
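A minimal sketch of the runtime-gathering approach, using only the Python standard library (the environment-variable names AFC_USERNAME and AFC_PASSWORD are illustrative assumptions, not part of pyafc):

```python
import os
from getpass import getpass


def build_afc_credentials(ip: str) -> dict:
    """Build the `data` dict expected by afc.Afc(data=...) without
    storing credentials in inputs.yml. Environment variables are
    checked first; otherwise the user is prompted interactively.
    The variable names AFC_USERNAME/AFC_PASSWORD are illustrative."""
    username = os.environ.get("AFC_USERNAME") or input("AFC username: ")
    password = os.environ.get("AFC_PASSWORD") or getpass("AFC password: ")
    return {"ip": ip, "username": username, "password": password}
```

The returned dict can be passed directly as afc.Afc(data=build_afc_credentials("10.10.10.10")), keeping only the non-sensitive values in inputs.yml.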