I've been busy, and lost my writing 'mojo' for a while, but it's time to update what's going on with this experience.
I LOVE IT!
I truly do.
I think I've converted all of my little utility scripts to PowerShell, and things are working rather smoothly.
In the past couple of weeks I have not dropped into bash or zsh on any of my primary Linux machines, and I've even got the environment set up on my Windows machine.
While many of the tools are tailored to my liking, anybody reading this will probably find the challenges I met and the solutions I chose more interesting than the specific implementation of each utility script.
This post is going to be a bit different from past ones: instead of examining problems and solutions in detail, I'll give a short description of each problem and the solution I implemented.
If the topic is interesting enough, I'll expand on it in future posts.
Managing system configuration
Especially in Linux/Unix, configuration is stored in textual files in different well-known locations.
The Edit-MyConfig script helps with editing these plain-text files by specifying shortcut names. One of these configurations is the JSON file used by the script itself to determine which config file to open in an editor.
src/config.json contains definitions for:
- myconfig (in essence it's this list)
- mymodules (see Missing Modules below)
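For illustration, a hypothetical sketch of what src/config.json could look like (the shortcut names, paths, and module names here are made up, not the file's actual contents):

```json
{
  "myconfig": {
    "gitconfig": "~/.gitconfig",
    "sshconfig": "~/.ssh/config",
    "profile": "~/.config/powershell/Microsoft.PowerShell_profile.ps1"
  },
  "mymodules": [
    "posh-git",
    "PSReadLine"
  ]
}
```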
Working on multiple servers
The scripts themselves, once installed, are not designed to make changes to the environment, and the script package is designed to work in the most 'plain vanilla' environment of (a modern version of) PowerShell.
This introduced some problems.
The environment works best in conjunction with other well-known, publicly available (and free) modules, but these might not be installed.
The profile.d/Test-MyModules.ps1 script reads the list of modules from src/modules.json (easily edited with Edit-MyConfig).
If a module is detected as missing, a warning will show up (but only once per session).
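A minimal sketch of how such a check could work (the session-guard variable, file path, and JSON shape are assumptions, not the actual implementation):

```powershell
# Warn once per session about modules that are listed but not installed.
if (-not $script:MyModulesChecked) {
    $wanted = Get-Content "$PSScriptRoot/../src/modules.json" -Raw | ConvertFrom-Json
    foreach ($name in $wanted) {
        if (-not (Get-Module -ListAvailable -Name $name)) {
            Write-Warning "Module '$name' is not installed."
        }
    }
    # Guard variable: the check runs only once per session.
    $script:MyModulesChecked = $true
}
```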
I've added a SystemName class to manage system identification; this in turn is used by Reload-MyScripts to make sure only the generic and system-specific scripts are available, hiding scripts that are not relevant on the current system.
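A rough sketch of the idea (the class internals, directory layout, and function body are illustrative assumptions, not the actual code):

```powershell
class SystemName {
    [string] $Name
    SystemName() {
        # Identify the current system; the hostname is just one possible key.
        $this.Name = [System.Net.Dns]::GetHostName().ToLower()
    }
}

function Reload-MyScripts {
    $system = [SystemName]::new()
    # Dot-source the generic scripts, then only the scripts for this system.
    Get-ChildItem "$PSScriptRoot/scripts/generic/*.ps1" | ForEach-Object { . $_.FullName }
    $systemDir = "$PSScriptRoot/scripts/$($system.Name)"
    if (Test-Path $systemDir) {
        Get-ChildItem "$systemDir/*.ps1" | ForEach-Object { . $_.FullName }
    }
}
```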
Future plans: to allow system-specific scripts and configuration in the git repo that are only relevant to specific machines/environments, with local configuration to toggle these as needed.
I have not decided yet how to implement this.
Controlling multiple machines
Regardless of my excursion into my 'PowerShell Everywhere' experience, I had also started using Ansible a few weeks prior to starting the blog.
Just like my wish to make sudo easier to use in PowerShell, I saw great potential in utilizing Ansible together with PowerShell, and that is exactly what I did.
Invoke-ViaAnsible constructs a PowerShell call that will be executed via Ansible, making working with remote systems much easier on me.
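A sketch of how such a call could be constructed (the parameters, the use of Ansible's shell module, and the pwsh override are my assumptions, not the actual implementation):

```powershell
function Invoke-ViaAnsible {
    param(
        [Parameter(Mandatory)] [string] $Hosts,       # Ansible host pattern
        [Parameter(Mandatory)] [string] $ScriptBlock  # PowerShell code to run remotely
    )
    # Run the snippet on the remote hosts via Ansible's shell module,
    # telling Ansible to use pwsh as the remote shell.
    ansible $Hosts -m shell -a $ScriptBlock -e 'ansible_shell_executable=/usr/bin/pwsh'
}

# Example: check PowerShell versions across a host group.
# Invoke-ViaAnsible -Hosts 'webservers' -ScriptBlock '$PSVersionTable.PSVersion'
```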
As I mentioned earlier, I've found myself using PowerShell exclusively in the past couple of months, mainly after creating my docker-compose aliases to match my set of functions and aliases from zsh.
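In PowerShell, a zsh-style alias that takes arguments is best reproduced as a thin function; for illustration (these particular names mirror common zsh docker-compose abbreviations, not necessarily mine):

```powershell
# Thin wrappers that forward any extra arguments to docker-compose.
function dcup { docker-compose up -d @args }
function dcdn { docker-compose down @args }
function dclf { docker-compose logs -f @args }
```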
I've also added a snippet from the PowerShell Cookbook into profile.d: output sent to Out-Default is captured (up to 500 objects) and can be accessed later in the session.
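The idea can be sketched as a proxy around Out-Default (a simplification of the technique, not the Cookbook's actual code; the $LastOut variable name is made up):

```powershell
function Out-Default {
    [CmdletBinding()]
    param([Parameter(ValueFromPipeline)] $InputObject)
    begin { $global:LastOut = [System.Collections.Generic.List[object]]::new() }
    process {
        # Keep a bounded copy of what flows to the default formatter.
        if ($global:LastOut.Count -lt 500) { $global:LastOut.Add($InputObject) }
        # Forward to the real cmdlet so normal console output still happens.
        Microsoft.PowerShell.Core\Out-Default -InputObject $InputObject
    }
}
```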
So what now?
Now I'm really using the system, and I'll come back here to share any insights about using PowerShell everywhere.
One thing I've already noticed is that I still type the POSIX commands for working with the file system (mv, etc.). This is one habit I'm going to need to overcome, as I find the string-output nature of these commands very underpowered.
Also, a lot of my docker commands and aliases currently lack good auto-complete functionality; I'll need to fix that as well.
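PowerShell's built-in Register-ArgumentCompleter can cover the simple cases. For example, completing running container names for a hypothetical dex wrapper (assuming dex is a function with a -Name parameter; both names are made up for illustration):

```powershell
# Complete running container names for the -Name parameter of 'dex'.
Register-ArgumentCompleter -CommandName dex -ParameterName Name -ScriptBlock {
    param($commandName, $parameterName, $wordToComplete)
    docker ps --format '{{.Names}}' |
        Where-Object { $_ -like "$wordToComplete*" } |
        ForEach-Object { [System.Management.Automation.CompletionResult]::new($_) }
}
```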
So these will probably be my main priorities, unless I procrastinate and find some other rabbit to chase down the hole - we'll see.
* Source Code
Latest commit at the time of writing: