When using CFEngine with the Masterfiles Policy Framework there are two standard “stages” involved in periodic maintenance: updating the policy (update.cf) and evaluating the policy (promises.cf). In a standard install the cf-execd component periodically runs first the update policy and then the policy proper.
We have talked in the past about Extending the CFEngine Policy Update Procedure as well as Writing a cfbs module for your custom policy update.
While both of these previous strategies are very useful, I have a few different itches to scratch this time:
- only copy files to certain clients depending on what role they play
- copy files which don’t match the `input_name_patterns` or that have no extension as part of their name
- what happens if the policy server is not available?
For these three purposes the easiest approach is to write a new bundle and have it evaluated after the standard update policy runs. This can easily be done using an Augment available with the MPF: Evaluate additional bundles during update. Specifying a bundle name in this augment lets us append additional bundles to the end of the standard update policy.
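In other words, the augment is just a def.json entry appending a bundle name to the update bundle sequence. A minimal sketch (the bundle name my_extra_update_bundle here is hypothetical):

```json
{
  "variables": {
    "default:def.control_common_update_bundlesequence_end": {
      "value": [ "my_extra_update_bundle" ]
    }
  }
}
```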
How does the update policy copy files to clients?
The update policy only copies certain files from the policy server to each client.
It decides which files to copy based on an internal variable called input_name_patterns.
We have augments to replace or extend this variable:

- Override files considered for copy during policy updates
- Extend files considered for copy during policy updates
And while we could add new extensions to this list, that strategy would not solve two of our issues: files without extensions and files that should only be distributed to certain clients.
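To see why extensionless files are skipped, here is a small illustration. The two file names are the example files we create later, and the patterns are an invented subset standing in for the real input_name_patterns list, which lives in the update policy:

```shell
# Invented subset of regex patterns standing in for input_name_patterns;
# the update policy only copies files whose names match one of these.
considered_for_copy() {
  for pattern in '.*\.cf' '.*\.json' '.*\.txt'; do
    if echo "$1" | grep -Eqx "$pattern"; then
      echo "$1: copied"
      return
    fi
  done
  echo "$1: skipped"
}

considered_for_copy promises.cf            # -> promises.cf: copied
considered_for_copy this-is-kind-of-secret # -> this-is-kind-of-secret: skipped
```

No pattern based on extensions can ever match a name with no extension, which is why extending the list doesn't help here.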
Custom update policy
I think the easiest way to accomplish this task is to use CFEngine Build: add a JSON file with our augment, a policy file, and some of those tasty special files to transfer to special hosts.
Adding this to an existing project is easy. First, create a sub-folder for this work and some files to distribute:

```shell
mkdir custom-update
cd custom-update
mkdir -p files/special files/common
echo "only tell special clients" > files/special/this-is-kind-of-secret
echo "yeah, I am an open book (file)" > files/common/this-file-has-no-secrets
```

We add both the policy file to update_inputs and the bundle name to control_common_update_bundlesequence_end in custom-update/def.json:
```json
{
  "variables": {
    "default:def.update_inputs": {
      "value": [
        "services/cfbs/custom-update/custom_update_bundle.cf"
      ]
    },
    "default:def.control_common_update_bundlesequence_end": {
      "value": [
        "custom_update_bundle"
      ]
    }
  }
}
```

And finally, some policy.
One important thing to remember is that the update policy does NOT use the standard library, so if you want to leverage some helpful bundles and bodies you will have to dig a bit into the internals. Most common items are available with a u_ prefix, e.g. u_results, u_mog, etc.
```cf3
bundle agent custom_update_bundle
{
  classes:
      "special_host" expression => fileexists("/etc/cfengine-special-host.flag");

  vars:
      "files_dir" string => "${sys.masterdir}/services/cfbs/custom-update/files";

  files:
    special_host::
      "${sys.inputdir}/files/special"
        perms => u_mog("0644", "root", "root"),
        copy_from => u_remote_dcp_missing_ok("${files_dir}/special", "${sys.policy_hub}"),
        depth_search => u_recurse("inf");

    any::
      "${sys.inputdir}/files/common"
        perms => u_mog("0644", "root", "root"),
        copy_from => u_remote_dcp_missing_ok("${files_dir}/common", "${sys.policy_hub}"),
        depth_search => u_recurse("inf");
}
```

At this point, just add the custom-update folder as a module, being careful to NOT accept the default of adding the custom_update_bundle bundle to the (default policy) bundle sequence.
```
$ cfbs add custom-update/
Which bundle should be evaluated (added to bundle sequence)?
1. ./custom-update/custom_update_bundle.cf:custom_update_bundle (default)
2. (None)
[1/2]: 2
```

Adding the custom-update directory as a module has the benefit of copying ALL of the files inside to the masterfiles policy set, which includes our odd special and common files with no extensions.
Now we can configure a policy server to use this policy, bootstrap our special and common clients, and see the results!
Set up an environment
I like to use lower-resource hardware, so setting up an environment with 3 hosts (1 policy server and 2 clients) calls for the use of containers.
A community member contributed instructions for Installing Community Using Containers. I had to adjust slightly to use Red Hat 10 instead of Red Hat 9, as our latest LTS (3.27.0) requires a newer SELinux policy version to install. (I have a pending pull request to fix that detail.)
```
torizon@verdin-imx8mp-06849243:~$ docker-compose -f cfengine-community-3.yaml up -d
[+] Running 4/4
 ✔ Network cfengine-demo_control-plane  Created  0.1s
 ✔ Container cfengine-policy-server     Healthy  0.1s
 ✔ Container cfengine-demo-client-1-1   Started  0.1s
 ✔ Container cfengine-demo-client-2-1   Started  0.1s
```

Install and configure masterfiles-stage
Since I am using a Build project and community edition, I need to install and set up masterfiles-stage manually.
In order to synchronize and build a Build project from a git repository, we must install git and rsync on the policy server. I did this manually with:

```shell
sudo dnf install -y rsync git
```

Following the instructions, I download the install script and run it:
```
[root@ba1a7e623d16 ~]# ./install-masterfiles-stage.sh
Now, edit /opt/cfengine/dc-scripts/params.sh to configure your upstream repository.
Then, run '/var/cfengine/httpd/htdocs/api/dc-scripts/masterfiles-stage.sh --DEBUG' to test deployment
```

In GitHub I add a fine-grained access token.
I use this token value as the GIT_PASSWORD variable in params.sh mentioned above.
GIT_URL is set to https://github.com/cfengine/play-cfbs and GIT_REFSPEC is set to extend-update-policy.
Finally, VCS_TYPE is set to GIT_CFBS so that masterfiles-stage knows to grab a git repo as well as run cfbs build on the repo to deploy the policy.
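Putting that together, the relevant part of params.sh ends up looking roughly like this (a sketch showing only the variables discussed above; the token value is a placeholder, and other variables in the file keep their defaults):

```shell
# Hypothetical excerpt of /opt/cfengine/dc-scripts/params.sh;
# only the variables discussed above are shown.
GIT_URL="https://github.com/cfengine/play-cfbs"
GIT_REFSPEC="extend-update-policy"
GIT_PASSWORD="<fine-grained-access-token>"   # placeholder, not a real token
VCS_TYPE="GIT_CFBS"                          # fetch with git, then run 'cfbs build'
```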
After modifying params.sh and running the masterfiles-stage script again, my policy is in place and the clients will pick up the policy.
Video
The video recording is available on YouTube:
At the end of every webinar, we stop the recording for a nice and relaxed, off-the-record chat with attendees. Join the next webinar to not miss this discussion.
Post-recording discussion about AI
One of us wondered if others also often felt awkward talking about AI, and we agreed. We also agreed that a lot is going on in the AI/LLM realm and that learning, considering, and reflecting on this topic is worth the effort. Some recent articles we found helpful are:
- https://www.media.mit.edu/projects/your-brain-on-chatgpt/overview/
- https://shumer.dev/something-big-is-happening
- https://www.cato.org/commentary/something-big-happening-ai-thats-only-thing-matt-shumer-got-right
Links
- Build project for this episode: extend-update-policy
- Connect w/ Cody, Craig, or Nick
- All Episodes