How to fix PlatformIO ESP8266/ESP32 fatal error: SPI.h: No such file or directory

Problem:

You are trying to compile your PlatformIO application for the ESP8266 or ESP32 but you’re seeing an error message like

In file included from .pio/libdeps/d1_mini/TFT_eSPI/TFT_eSPI.cpp:17:0:
.pio/libdeps/d1_mini/TFT_eSPI/TFT_eSPI.h:32:17: fatal error: SPI.h: No such file or directory

*************************************************************
* Looking for SPI.h dependency? Check our library registry!
*
* CLI  > platformio lib search "header:SPI.h"
* Web  > https://platformio.org/lib/search?query=header:SPI.h
*
*************************************************************

 #include <SPI.h>

This problem commonly occurs when using the TFT_eSPI library.

Solution:

First, ensure that your platformio.ini has

framework = arduino

If you’re using a different framework, SPI.h won’t be available since it’s part of the Arduino framework!

Secondly, add this line to your platformio.ini:

lib_ldf_mode = deep+

and recompile your source code. This will reconfigure the library dependency finder (LDF) so that it also finds dependencies of dependency libraries:

Dependency Graph
|-- <TFT_eSPI> 2.3.52
|   |-- <SPI> 1.0
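
For reference, here is a minimal platformio.ini sketch combining both settings (the environment name, board and library entry are just examples, adjust them for your hardware):

[env:d1_mini]
platform = espressif8266
board = d1_mini
framework = arduino
lib_ldf_mode = deep+
lib_deps = TFT_eSPI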


Posted by Uli Köhler in ESP8266/ESP32, PlatformIO

How to generate filename with date and time in PowerShell

You can use Get-Date like this to generate a date & time string that only contains characters allowed in filenames:

Get-Date -UFormat "%Y-%m-%d_%H-%M-%S"

Example output:

2020-12-11_01-12-26

In order to generate a complete filename, surround that with $() and prepend/append other parts of your desired filename:

mysqldump-$(Get-Date -UFormat "%Y-%m-%d_%H-%M-%S").sql
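
As a usage sketch, you can also store the generated filename in a variable first (the folder name is just a placeholder):

$Filename = "backup-$(Get-Date -UFormat '%Y-%m-%d_%H-%M-%S').zip"
Compress-Archive -Path .\myfolder -DestinationPath $Filename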


Posted by Uli Köhler in PowerShell, Windows

How to list gitlab-rake tasks

You can list the tasks available in your GitLab’s gitlab-rake using

gitlab-rake -T

In case you’re running a docker-compose based configuration, you can use

docker-compose exec gitlab gitlab-rake -T

where gitlab is the name of the docker-compose service in docker-compose.yml.

This will show you a list like

rake about                                                      # List versions of all Rails frameworks and the environment
rake acts_as_taggable_on_engine:install:migrations              # Copy migrations from acts_as_taggable_on_engine to app...
rake acts_as_taggable_on_engine:tag_names:collate_bin           # Forcing collate of tag names to utf8_bin
rake acts_as_taggable_on_engine:tag_names:collate_ci            # Forcing collate of tag names to utf8_general_ci
rake app:template                                               # Applies the template supplied by LOCATION=(/path/to/te...
rake app:update                                                 # Update configs and some other initially generated file...
rake assets:clean[keep]                                         # Remove old compiled assets
rake assets:clobber                                             # Remove compiled assets
rake assets:environment                                         # Load asset compile environment
rake assets:precompile                                          # Compile all the assets named in config.assets.precompile
rake brakeman                                                   # Security check via brakeman
rake cache:clear:redis                                          # GitLab | Cache | Clear redis cache
rake cache_digests:dependencies                                 # Lookup first-level dependencies for TEMPLATE (like mes...
rake cache_digests:nested_dependencies                          # Lookup nested dependencies for TEMPLATE (like messages...
rake ci:cleanup:builds                                          # GitLab | CI | Clean running builds
rake clean                                                      # Remove any temporary products
rake clobber                                                    # Remove any generated files
rake config_lint                                                # Checks syntax for shell scripts and nginx config files...
rake danger_local                                               # Run local Danger rules
rake db:create                                                  # Creates the database from DATABASE_URL or config/datab...
rake db:drop                                                    # Drops the database from DATABASE_URL or config/databas...
rake db:environment:set                                         # Set the environment value for the database
rake db:fixtures:load                                           # Loads fixtures into the current environment's database
rake db:migrate                                                 # Migrate the database (options: VERSION=x, VERBOSE=fals...
rake db:migrate:status                                          # Display status of migrations
rake db:obsolete_ignored_columns                                # Show a list of obsolete `ignored_columns`
rake db:prepare                                                 # Runs setup if database does not exist, or runs migrati...
rake db:rollback                                                # Rolls the schema back to the previous version (specify...
rake db:schema:cache:clear                                      # Clears a db/schema_cache.yml file
rake db:schema:cache:dump                                       # Creates a db/schema_cache.yml file
rake db:schema:dump                                             # Creates a db/schema.rb file that is portable against a...
rake db:schema:load                                             # Loads a schema.rb file into the database
rake db:seed                                                    # Loads the seed data from db/seeds.rb
rake db:seed:replant                                            # Truncates tables of each database for current environm...
rake db:seed_fu                                                 # Loads seed data for the current environment
rake db:setup                                                   # Creates the database, loads the schema, and initialize...
rake db:structure:dump                                          # Dumps the database structure to db/structure.sql
rake db:structure:load                                          # Recreates the databases from the structure.sql file
rake db:version                                                 # Retrieves the current schema version number
rake dev:load                                                   # GitLab | Dev | Eager load application
rake dev:setup                                                  # GitLab | Dev | Setup developer environment (db, fixtures)
rake downtime_check                                             # Checks if migrations in a branch require downtime
rake file_hooks:validate                                        # Validate existing file hooks
rake gemojione:aliases                                          # Generates Emoji SHA256 digests
rake gemojione:install_assets                                   # Install Emoji Image Assets
rake gettext:add_language[language]                             # add a new language
rake gettext:find                                               # Update pot/po files
rake gettext:lint                                               # Lint all po files in `locale/
rake gettext:pack                                               # Create mo-files
rake gettext:po_to_json                                         # Convert PO files to JS files
rake gettext:regenerate                                         # Regenerate gitlab.pot file
rake gettext:store_model_attributes                             # write the model attributes to <locale_path>/model_attr...
rake gitlab:app:check                                           # GitLab | App | Check the configuration of the GitLab R...
rake gitlab:artifacts:check                                     # GitLab | Artifacts | Check integrity of uploaded job a...
rake gitlab:artifacts:migrate                                   # GitLab | Artifacts | Migrate files for artifacts to co...
rake gitlab:assets:check_page_bundle_mixins_css_for_sideeffects # GitLab | Assets | Check that scss mixins do not introd...
rake gitlab:assets:clean                                        # GitLab | Assets | Clean up old compiled frontend assets
rake gitlab:assets:compile                                      # GitLab | Assets | Compile all frontend assets
rake gitlab:assets:compile_webpack_if_needed                    # GitLab | Assets | Compile all Webpack assets
rake gitlab:assets:fix_urls                                     # GitLab | Assets | Fix all absolute url references in CSS
rake gitlab:assets:purge                                        # GitLab | Assets | Remove all compiled frontend assets
rake gitlab:assets:purge_modules                                # GitLab | Assets | Uninstall frontend dependencies
rake gitlab:assets:vendor                                       # GitLab | Assets | Compile vendor assets
rake gitlab:backup:create                                       # GitLab | Backup | Create a backup of the GitLab system
rake gitlab:backup:restore                                      # GitLab | Backup | Restore a previously created backup
rake gitlab:check                                               # GitLab | Check the configuration of GitLab and its env...
rake gitlab:cleanup:block_removed_ldap_users                    # GitLab | Cleanup | Block users that have been removed ...
rake gitlab:cleanup:moved                                       # GitLab | Cleanup | Delete moved repositories
rake gitlab:cleanup:orphan_job_artifact_files                   # GitLab | Cleanup | Clean orphan job artifact files
rake gitlab:cleanup:orphan_lfs_file_references                  # GitLab | Cleanup | Clean orphan LFS file references
rake gitlab:cleanup:orphan_lfs_files                            # GitLab | Cleanup | Clean orphan LFS files
rake gitlab:cleanup:project_uploads                             # GitLab | Cleanup | Clean orphaned project uploads
rake gitlab:cleanup:remote_upload_files                         # GitLab | Cleanup | Clean orphan remote upload files th...
rake gitlab:cleanup:sessions:active_sessions_lookup_keys        # GitLab | Cleanup | Sessions | Clean ActiveSession look...
rake gitlab:container_registry:configure                        # GitLab | Container Registry | Configure
rake gitlab:db:clean_structure_sql                              # This adjusts and cleans db/structure.sql - it runs aft...
rake gitlab:db:composite_primary_keys_add                       # GitLab | DB | Adds primary keys to tables that only ha...
rake gitlab:db:composite_primary_keys_drop                      # GitLab | DB | Removes previously added composite prima...
rake gitlab:db:configure                                        # GitLab | DB | Configures the database by running migra...
rake gitlab:db:create_dynamic_partitions                        # Create missing dynamic database partitions
rake gitlab:db:downtime_check[ref]                              # GitLab | DB | Checks if migrations require downtime or...
rake gitlab:db:drop_tables                                      # GitLab | DB | Drop all tables
rake gitlab:db:dump_custom_structure                            # This dumps GitLab specific database details - it runs ...
rake gitlab:db:load_custom_structure                            # This loads GitLab specific database details - runs aft...
rake gitlab:db:mark_migration_complete[version]                 # GitLab | DB | Manually insert schema migration version
rake gitlab:db:reindex[index_name]                              # reindex a regular (non-unique) index without downtime ...
rake gitlab:db:setup_ee                                         # GitLab | DB | Sets up EE specific database functionality
rake gitlab:db:unattended                                       # GitLab | DB | Run database migrations and print `unatt...
rake gitlab:doctor:secrets                                      # GitLab | Check if the database encrypted values can be...
rake gitlab:env:info                                            # GitLab | Env | Show information about GitLab and its e...
rake gitlab:exclusive_lease:clear[scope]                        # GitLab | Exclusive Lease | Clear existing exclusive le...
rake gitlab:external_diffs:force_object_storage                 # Override external diffs in file storage to be in objec...
rake gitlab:features:enable_rugged                              # GitLab | Features | Enable direct Git access via Rugge...
rake gitlab:generate_sample_prometheus_data[environment_id]     # GitLab | Generate Sample Prometheus Data
rake gitlab:git:fsck                                            # GitLab | Git | Check all repos integrity
rake gitlab:gitaly:check                                        # GitLab | Gitaly | Check the health of Gitaly
rake gitlab:gitaly:install[dir,storage_path,repo]               # GitLab | Gitaly | Install or upgrade gitaly
rake gitlab:gitlab_shell:check                                  # GitLab | GitLab Shell | Check the configuration of Git...
rake gitlab:import:all_users_to_all_groups                      # GitLab | Import | Add all users to all groups (admin u...
rake gitlab:import:all_users_to_all_projects                    # GitLab | Import | Add all users to all projects (admin...
rake gitlab:import:repos[import_path]                           # GitLab | Import | Import bare repositories from reposi...
rake gitlab:import:user_to_groups[email]                        # GitLab | Import | Add a specific user to all groups (a...
rake gitlab:import:user_to_projects[email]                      # GitLab | Import | Add a specific user to all projects ...
rake gitlab:import_export:bump_version                          # GitLab | Import/Export | Bumps the Import/Export versi...
rake gitlab:import_export:data                                  # GitLab | Import/Export | Display exported DB structure
rake gitlab:import_export:export                                # GitLab | Import/Export | EXPERIMENTAL | Export large p...
rake gitlab:import_export:import                                # GitLab | Import/Export | EXPERIMENTAL | Import large p...
rake gitlab:import_export:version                               # GitLab | Import/Export | Show Import/Export version
rake gitlab:incoming_email:check                                # GitLab | Incoming Email | Check the configuration of R...
rake gitlab:ldap:rename_provider[old_provider,new_provider]     # GitLab | LDAP | Rename provider
rake gitlab:lfs:check                                           # GitLab | LFS | Check integrity of uploaded LFS objects
rake gitlab:lfs:migrate                                         # GitLab | LFS | Migrate LFS objects to remote storage
rake gitlab:orphans:check                                       # Gitlab | Orphans | Check for orphaned namespaces and r...
rake gitlab:orphans:check_namespaces                            # GitLab | Orphans | Check for orphaned namespaces in th...
rake gitlab:orphans:check_repositories                          # GitLab | Orphans | Check for orphaned repositories in ...
rake gitlab:packages:events:generate                            # GitLab | Packages | Events | Generate hll counter even...
rake gitlab:packages:migrate                                    # GitLab | Packages | Migrate packages files to remote s...
rake gitlab:praefect:replicas[project_id]                       # GitLab | Praefect | Check replicas
rake gitlab:seed:group_seed[subgroups_depth,username]           # Seed groups with sub-groups/projects/epics/milestones ...
rake gitlab:seed:issues                                         # GitLab | Seed | Seeds issues
rake gitlab:setup                                               # GitLab | Setup production application
rake gitlab:shell:build_missing_projects                        # GitLab | Shell | Build missing projects
rake gitlab:shell:install[repo]                                 # GitLab | Shell | Install or upgrade gitlab-shell
rake gitlab:shell:setup                                         # GitLab | Shell | Setup gitlab-shell
rake gitlab:sidekiq:check                                       # GitLab | Sidekiq | Check the configuration of Sidekiq
rake gitlab:snippets:list_non_migrated                          # GitLab | Show non migrated snippets
rake gitlab:snippets:migrate[ids]                               # GitLab | Migrate specific snippets to git
rake gitlab:snippets:migration_status                           # GitLab | Show whether there are snippet background mig...
rake gitlab:storage:hashed_attachments                          # Gitlab | Storage | Summary of project attachments usin...
rake gitlab:storage:hashed_projects                             # Gitlab | Storage | Summary of existing projects using ...
rake gitlab:storage:legacy_attachments                          # Gitlab | Storage | Summary of project attachments usin...
rake gitlab:storage:legacy_projects                             # Gitlab | Storage | Summary of existing projects using ...
rake gitlab:storage:list_hashed_attachments                     # Gitlab | Storage | List existing project attachments u...
rake gitlab:storage:list_hashed_projects                        # Gitlab | Storage | List existing projects using Hashed...
rake gitlab:storage:list_legacy_attachments                     # Gitlab | Storage | List existing project attachments u...
rake gitlab:storage:list_legacy_projects                        # Gitlab | Storage | List existing projects using Legacy...
rake gitlab:storage:migrate_to_hashed                           # GitLab | Storage | Migrate existing projects to Hashed...
rake gitlab:storage:rollback_to_legacy                          # GitLab | Storage | Rollback existing projects to Legac...
rake gitlab:tcp_check[host,port]                                # GitLab | Check TCP connectivity to a specific host and...
rake gitlab:test                                                # GitLab | Run all tests
rake gitlab:two_factor:disable_for_all_users                    # GitLab | 2FA | Disable Two-factor authentication (2FA)...
rake gitlab:two_factor:rotate_key:apply                         # GitLab | 2FA | Rotate Key | Encrypt user OTP secrets w...
rake gitlab:two_factor:rotate_key:rollback                      # GitLab | 2FA | Rotate Key | Rollback to secrets encryp...
rake gitlab:update_project_templates                            # GitLab | Update project templates
rake gitlab:update_templates                                    # GitLab | Update templates
rake gitlab:uploads:check                                       # GitLab | Uploads | Check integrity of uploaded files
rake gitlab:uploads:migrate                                     # GitLab | Uploads | Migrate the uploaded files of speci...
rake gitlab:uploads:migrate:all                                 # GitLab | Uploads | Migrate all uploaded files to objec...
rake gitlab:uploads:migrate_to_local                            # GitLab | Uploads | Migrate the uploaded files of speci...
rake gitlab:uploads:migrate_to_local:all                        # GitLab | Uploads | Migrate all uploaded files to local...
rake gitlab:uploads:sanitize:remove_exif                        # GitLab | Uploads | Remove EXIF from images
rake gitlab:usage_data:dump_sql_in_json                         # GitLab | UsageData | Generate raw SQLs for usage ping ...
rake gitlab:usage_data:dump_sql_in_yaml                         # GitLab | UsageData | Generate raw SQLs for usage ping ...
rake gitlab:web_hook:add                                        # GitLab | Webhook | Adds a webhook to the projects
rake gitlab:web_hook:list                                       # GitLab | Webhook | List webhooks
rake gitlab:web_hook:rm                                         # GitLab | Webhook | Remove a webhook from the projects
rake gitlab:workhorse:install[dir,repo]                         # GitLab | Workhorse | Install or upgrade gitlab-workhorse
rake gitlab:x509:update_signatures                              # GitLab | X509 | Update signatures when certificate sto...
rake grape:path_helpers                                         # Print route helper methods
rake grape:routes                                               # Print compiled grape routes
rake hipchat:send[message]                                      # Sends a HipChat message as a particular user
rake import:github[token,gitlab_username,project_path]          # GitLab | Import | Import a GitHub project - Example: i...
rake jira:generate_consumer_key                                 # Generate a consumer key for your application
rake jira:generate_public_cert                                  # Run the system call to generate a RSA public certificate
rake log:clear                                                  # Truncates all/specified *.log files in log/ to zero by...
rake metrics:setup_common_metrics                               # GitLab | Metrics | Setup common metrics
rake middleware                                                 # Prints out your Rack middleware stack
rake migrate_iids                                               # GitLab | Build internal ids for issues and merge requests
rake postgresql_md5_hash                                        # GitLab | Generate PostgreSQL Password Hash
rake raven:test[dsn]                                            # Send a test event to the remote Sentry server
rake restart                                                    # Restart app by touching tmp/restart.txt
rake secret                                                     # Generate a cryptographically secure secret key (this i...
rake setup                                                      # GitLab | Setup gitlab db
rake stats                                                      # Report code statistics (KLOCs, etc) from the applicati...
rake test                                                       # GitLab | Run all tests
rake test:db                                                    # Run tests quickly, but also reset db
rake test:system                                                # Run system tests only
rake time:zones[country_or_offset]                              # List all time zones, list by two-letter country code (...
rake tmp:clear                                                  # Clear cache, socket and screenshot files from tmp/ (na...
rake tmp:create                                                 # Creates tmp directories for cache, sockets, and pids
rake tokens:reset_all_email                                     # Reset all GitLab incoming email tokens
rake tokens:reset_all_feed                                      # Reset all GitLab feed tokens
rake yarn                                                       # Install Node dependencies with Yarn
rake yarn:available                                             # Ensure Yarn is installed
rake yarn:check                                                 # Ensure Node dependencies are installed
rake yarn:clobber                                               # Remove Node dependencies
rake yarn:install                                               # Install Node dependencies with Yarn / Install all Java...
rake zeitwerk:check                                             # Checks project structure for Zeitwerk compatibility
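
Once you have found the task you need, you can run it the same way, for example to show general information about your GitLab environment:

docker-compose exec gitlab gitlab-rake gitlab:env:info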


Posted by Uli Köhler in Ruby

How to export gitlab backup in Gitlab Docker container

GitLab provides an integrated feature to export a TAR file containing all data from the current GitLab instance.

How to make a GitLab backup?

In order to run this for a GitLab instance running on docker-compose, use

docker-compose exec gitlab gitlab-backup create STRATEGY=copy

where gitlab is the container running the gitlab image, e.g. gitlab/gitlab-ce:latest.

In case you are running a docker-based setup without docker-compose, run

docker exec my-gitlab-container gitlab-backup create STRATEGY=copy

where my-gitlab-container is the ID or name of the container.

Where to find the backups?

By default, GitLab stores the backups in /var/opt/gitlab/backups. In case you need to change this, look for the following line in /etc/gitlab/gitlab.rb:

gitlab_rails['backup_path'] = "/var/opt/gitlab/backups"

In my docker-compose configuration, I map out the entire /var/opt/gitlab directory:

volumes:
   - './data:/var/opt/gitlab'

hence I can find the backups in ./data/backups:

$ ls data/backups/
1607642274_2020_12_10_13.6.3_gitlab_backup.tar
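
As a sketch, you could wrap this into a small backup.sh that creates a backup and then copies the newest archive to another location (the target path /mnt/backup-storage is just an example):

#!/bin/bash
# -T is required when running without a TTY, e.g. from cron or a systemd timer
docker-compose exec -T gitlab gitlab-backup create STRATEGY=copy
cp "$(ls -t data/backups/*_gitlab_backup.tar | head -n1)" /mnt/backup-storage/
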
Posted by Uli Köhler in Docker

How to create a systemd backup timer & service in 10 seconds

In our previous post Create a systemd service for your docker-compose project in 10 seconds we introduced a script that automatically creates a systemd service to start a docker-compose-based project. In this post, we’ll show how to automatically create a systemd service and timer that runs a backup script for such a project every day.

First, you need to create a file named backup.sh in the directory where docker-compose.yml is located. This file will be run by the systemd service every day. What that file contains is entirely up to you and we will provide examples in future blogposts.
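
As a first idea, backup.sh could for example dump a MariaDB database running in the same docker-compose project (a sketch assuming a service named mariadb and a root password stored in .env):

#!/bin/bash
source .env && docker-compose exec -T mariadb mysqldump -uroot -p${MARIADB_ROOT_PASSWORD} --all-databases > mariadb-dump-$(date +%F_%H-%M-%S).sql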

Secondly, run

wget -qO- https://techoverflow.net/scripts/create-backup-service.sh | sudo bash /dev/stdin

from the directory where docker-compose.yml is located. Note that the script will use the directory name as a name for the service and timer that is created. For example, running the script in /var/lib/redmine-mydomain will cause redmine-mydomain-backup to be used as the service name.

Example output from the script:

Creating systemd service... /etc/systemd/system/redmine-mydomain-backup.service
Creating systemd timer... /etc/systemd/system/redmine-mydomain-backup.timer
Enabling & starting redmine-mydomain-backup.timer
Created symlink /etc/systemd/system/timers.target.wants/redmine-mydomain-backup.timer → /etc/systemd/system/redmine-mydomain-backup.timer.

The script will create /etc/systemd/system/redmine-mydomain-backup.service containing the specification of what exactly to run:

[Unit]
Description=redmine-mydomain-backup

[Service]
Type=oneshot
ExecStart=/bin/bash backup.sh
WorkingDirectory=/var/lib/redmine-mydomain

and /etc/systemd/system/redmine-mydomain-backup.timer containing the logic for when the .service is started:

[Unit]
Description=redmine-mydomain-backup

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target

and will automatically start and enable the timer. This means: no further steps are needed after running this script!

In order to show the current status of the timer, use e.g.

sudo systemctl status redmine-mydomain-backup.timer

Example output:

● redmine-mydomain-backup.timer - redmine-mydomain-backup
     Loaded: loaded (/etc/systemd/system/redmine-mydomain-backup.timer; enabled; vendor preset: enabled)
     Active: active (waiting) since Thu 2020-12-10 02:50:31 CET; 19min ago
    Trigger: Fri 2020-12-11 00:00:00 CET; 20h left
   Triggers: ● redmine-mydomain-backup.service

Dec 10 02:50:31 myserverhostname systemd[1]: Started redmine-mydomain-backup.

In the

Trigger: Fri 2020-12-11 00:00:00 CET; 20h left

line you can see when the service will be run next. By default, the script generates a timer that runs OnCalendar=daily, which means the service will be run at 00:00:00 every day. Check out the systemd.time manpage for further information on the syntax you can use to specify other timeframes.
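
For example, if you would rather run the backup at 03:30 at night instead of midnight, you could edit the generated .timer file and change its [Timer] section like this (just a sketch, see systemd.time for the full syntax):

[Timer]
OnCalendar=*-*-* 03:30:00
Persistent=true

Afterwards, run sudo systemctl daemon-reload so systemd picks up the modified unit file.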

In order to run the backup immediately (it will still run daily after doing this), do

sudo systemctl start redmine-mydomain-backup.service

(note that you need to run systemctl start on the .service! Running systemctl start on the .timer will only start the timer itself and not run the service immediately).

In order to view the logs, use

sudo journalctl -xfu redmine-mydomain-backup.service

(just like above, you need to run journalctl -xfu on the .service, not on the .timer).

In order to disable automatic backups, use e.g.

sudo systemctl disable redmine-mydomain-backup.timer

Source code:

#!/bin/bash
# Create a systemd service & timer that runs the given backup daily
# by Uli Köhler - https://techoverflow.net
# Licensed as CC0 1.0 Universal
export SERVICENAME=$(basename $(pwd))-backup

export SERVICEFILE=/etc/systemd/system/${SERVICENAME}.service
export TIMERFILE=/etc/systemd/system/${SERVICENAME}.timer

echo "Creating systemd service... $SERVICEFILE"
sudo cat >$SERVICEFILE <<EOF
[Unit]
Description=$SERVICENAME

[Service]
Type=oneshot
ExecStart=/bin/bash backup.sh
WorkingDirectory=$(pwd)
EOF

echo "Creating systemd timer... $TIMERFILE"
sudo cat >$TIMERFILE <<EOF
[Unit]
Description=$SERVICENAME

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
EOF

echo "Enabling & starting $SERVICENAME.timer"
sudo systemctl enable $SERVICENAME.timer
sudo systemctl start $SERVICENAME.timer


Posted by Uli Köhler in Docker, Linux

How to use pg_dump in Gitlab Docker container

When using the official gitlab Docker container, you can use this command to create an SQL dump using pg_dump:

docker exec -t -u gitlab-psql [container name] pg_dump -h /var/opt/gitlab/postgresql/ -d gitlabhq_production > gitlab-dump.sql

This will save the SQL dump of the database into gitlab-dump.sql.

In case you’re using a docker-compose based setup, use this command:

docker-compose exec -u gitlab-psql gitlab pg_dump -h /var/opt/gitlab/postgresql/ -d gitlabhq_production > gitlab-dump.sql

Note that gitlab in this command is the container name.
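
If you want a compressed dump instead, you can pipe the output through gzip, for example:

docker-compose exec -u gitlab-psql gitlab pg_dump -h /var/opt/gitlab/postgresql/ -d gitlabhq_production | gzip > gitlab-dump.sql.gz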

Posted by Uli Köhler in Docker, Linux

How to run psql in Gitlab Docker image

When using the official gitlab Docker container, you can use this command to run psql:

docker exec -t -u gitlab-psql [container name] psql -h /var/opt/gitlab/postgresql/ -d gitlabhq_production

In case you’re using a docker-compose based setup, use this command:

docker-compose exec -u gitlab-psql gitlab psql -h /var/opt/gitlab/postgresql/ -d gitlabhq_production

Note that gitlab in this command is the container name.
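
You can also run a single SQL statement non-interactively using -c, for example (the query is just an illustration):

docker-compose exec -u gitlab-psql gitlab psql -h /var/opt/gitlab/postgresql/ -d gitlabhq_production -c 'SELECT COUNT(*) FROM users;'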

Posted by Uli Köhler in Databases, Docker, Linux

How to fix XCP-NG XENAPI_MISSING_PLUGIN(xscontainer) or Error on getting the default coreOS cloud template

Problem:

When creating a CoreOS container on your XCP-NG host, XCP-NG center or XenOrchestra tells you

Cloud config: Error on getting the default coreOS cloud template

with the error message

XENAPI_MISSING_PLUGIN(xscontainer)
This is a XenServer/XCP-ng error

Solution:

Log into the host’s console as root using SSH or the console in XCP-NG center or XenOrchestra and run

yum install xscontainer

After that, reload the page (F5) you use to create your container. No host restart is required.

Note that if you have multiple hosts, you need to yum install xscontainer for each host individually.
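
For example, a quick way to do that from your workstation is a small shell loop (the host names are placeholders and root SSH access is assumed):

for host in xcp-host1 xcp-host2; do ssh root@$host "yum install -y xscontainer"; done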

Posted by Uli Köhler in Docker, Virtualization

What does ‘offline’ mean for flyback regulators/controllers?

If you are working on power supplies, you will often see ICs that are labeled offline flyback controllers/regulators (for example, the LT3799).

What does offline mean in this context?

Offline means that the flyback regulator operates off the AC line, i.e. it is intended for use with the 115VAC / 230VAC grid. There are no online flyback controllers; the term merely distinguishes these parts from the many other flyback controllers that are not suitable for such a high (and typically rectified AC) input voltage and can only be used for DC/DC conversion.

Typically, offline controllers require a bridge rectifier and some additional circuitry like EMI filters to be connected to the AC grid, so be sure to check their datasheet and/or application notes on how exactly they are intended to be used.

Note that designing high-voltage circuits such as offline flyback power supplies can be dangerous and should only be done by experienced engineers.

Posted by Uli Köhler in Electronics

How to run Nextcloud cron job manually using docker-compose

For docker-compose based Nextcloud installations, this is the command to run the cron job manually:

docker-compose exec -u www-data nextcloud php cron.php

You need to run this from the directory where docker-compose.yml is located.
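
If you want to run it automatically every 5 minutes, you could add a crontab entry like this (the project directory is just an example; -T is required since cron provides no TTY):

*/5 * * * * cd /var/lib/nextcloud-mydomain && /usr/local/bin/docker-compose exec -T -u www-data nextcloud php cron.php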

Posted by Uli Köhler in Linux, Nextcloud

How to fix Nextcloud nextcloudcmd CLI “skipped due to earlier error, trying again in …”

Problem:

Your Nextcloud CLI client fails for some files (upload or download) with an error message like this

"Server replied "413 Request Entity Too Large" to "PUT https://example.com/remote.php/dav/uploads/username/XXXXXXXX/YYYYYY" (skipped due to earlier error, trying again in 6 hour(s))
PATH/TO/FILE.bmp

Solution:

The nextcloud CLI client nextcloudcmd stores the sync SQLite database in ~/.local/share/nextcloudcmd/._sync_############.db where ############ is a hex code. If you have multiple such files in ~/.local/share/nextcloudcmd, try out this procedure for each of them:

While nextcloudcmd is not running, use the SQLite3 command line tool to open the database, for example:

sqlite3 ~/.local/share/nextcloudcmd/._sync_bf15278da518.db
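
Optionally, you can first inspect what is currently blacklisted (SELECT * avoids having to know the exact column names, which may vary between client versions):

SELECT * FROM blacklist;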

Then run this SQL command:

DELETE FROM 'blacklist';

and exit using Ctrl-D. Now try re-running nextcloudcmd; it should immediately retry syncing the file.

Posted by Uli Köhler in Nextcloud

How to install nextcloud CLI client on Ubuntu in 20 seconds

Run this:

sudo add-apt-repository ppa:nextcloud-devs/client && sudo apt update && sudo apt -y install nextcloud-client
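
After installation, you can run a one-off sync from the command line like this (server URL, username, password and folder are placeholders):

nextcloudcmd -u myuser -p mypassword /path/to/local/folder https://cloud.example.com
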
Posted by Uli Köhler in Linux

How to install bup on Ubuntu 22.04

I’ve built a PPA that currently publishes bup 0.33 for Ubuntu 22.04 on x64 computers.

This one-liner installs the PPA, updates the APT package cache and installs bup:

sudo add-apt-repository -y ppa:ulikoehler/bup && sudo apt update && sudo apt -y install bup

Want to build it yourself?

The bup package has been built using my deb-buildscripts toolchain. In order to build it yourself:

git clone https://github.com/ulikoehler/deb-buildscripts
cd deb-buildscripts
./deb-bup.py

You might need to install some build dependencies for the build process to work, but the script will tell you what is missing.

Additionally, you should install python3-xattr. The package in the Ubuntu repositories is too old, so install it via pip instead:

sudo pip3 install --upgrade pyxattr

In case you get

sudo: pip3: command not found

install pip3 using:

sudo apt -y install python3-pip
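
After the installation, you can verify that bup works by creating a first backup, for example (directory and backup name are just examples):

bup init
bup index ~/Documents
bup save -n documents ~/Documents
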
Posted by Uli Köhler in Linux

How to backup data from docker-compose MariaDB container using mysqldump

For containers with a MYSQL_ROOT_PASSWORD stored in .env

This is the recommended best practice. For this example, we will assume that .env looks like this:

MARIADB_ROOT_PASSWORD=mophur3roh6eegiL8Eeto7goneeFei

To create a dump:

source .env && docker-compose exec mariadb mysqldump -uroot -p${MARIADB_ROOT_PASSWORD} --all-databases > mariadb-dump-$(date +%F_%H-%M-%S).sql

To restore a dump from mariadb-dump.sql, ensure the container is NOT running before this command:

source .env && docker-compose run -T mariadb mariadb -uroot -p${MARIADB_ROOT_PASSWORD} < mariadb-dump.sql

Note that you have to replace mariadb by the name of your container in docker-compose.yml.
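
For larger databases, you can compress the dump on the fly by piping it through gzip (same assumptions as above):

source .env && docker-compose exec mariadb mysqldump -uroot -p${MARIADB_ROOT_PASSWORD} --all-databases | gzip > mariadb-dump-$(date +%F_%H-%M-%S).sql.gz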

For containers with a MYSQL_ROOT_PASSWORD set to some value not stored in .env

This is secure, but you typically have to copy the password multiple times: one time for the mariadb container, one time for whatever container or application uses the database, and one time for any backup script that exports an SQL dump of the entire database.

To create a dump:

docker-compose exec mariadb mysqldump -uroot -pYOUR_MARIADB_ROOT_PASSWORD --all-databases > dump-$(date +%F_%H-%M-%S).sql

To restore a dump from mariadb-dump.sql:

docker-compose exec -T mariadb mysql -uroot -pYOUR_MARIADB_ROOT_PASSWORD  < mariadb-dump.sql

Replace YOUR_MARIADB_ROOT_PASSWORD by the password of your installation.

Furthermore, you have to replace mariadb by the name of your container in docker-compose.yml

For containers with MYSQL_ALLOW_EMPTY_PASSWORD=yes

This configuration is a security risk – see The security risk of running docker mariadb/mysql with MYSQL_ALLOW_EMPTY_PASSWORD=yes.

To create a dump:

docker-compose exec mariadb mysqldump -uroot --all-databases > mariadb-dump-$(date +%F_%H-%M-%S).sql

To restore a dump from mariadb-dump.sql:

docker-compose exec -T mariadb mysql -uroot < mariadb-dump.sql

More posts on this topic

TechOverflow is currently planning a post on how to use bup in order to provide quick & efficient backups of docker-based MariaDB/MySQL installations.

Posted by Uli Köhler in Docker

The security risk of running docker mariadb/mysql with MYSQL_ALLOW_EMPTY_PASSWORD=yes

This is part of a common docker-compose.yml which is frequently seen on the internet:

version: '3'
services:
  mariadb:
    image: 'mariadb:latest'
    environment:
      - MYSQL_ALLOW_EMPTY_PASSWORD=yes
      - MYSQL_DATABASE=redmine
    volumes:
      - './mariadb_data:/var/lib/mysql'
 [...]

Simple and secure, right? A no-root-password MariaDB instance that’s running in a separate container and does not have its port 3306 exposed – so only services from the same docker-compose.yml can reach it since docker-compose puts all those services in a separate network.

Wrong.

While the MariaDB instance is not reachable from the internet since no port is published, it can still be reached by any process on the host via its internal IP address.

In order to comprehend what’s happening, we shall take a look at docker’s networks. In this case, my docker-compose config is called redmine.

$ docker network ls | grep redmine
ea7ed38f469b        redmine_default           bridge              local

This is the network that docker-compose creates without any explicit network configuration. Let’s inspect the network to show the hosts:
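
docker network inspect redmine_default

The relevant part of its output (shortened) looks like this: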

[
    // [...]
        "Containers": {
            "2578fc65b4dab9f204d0a252e421dd4ddd9f41c35642d48350f4e59370581757": {
                "Name": "redmine_mariadb_1",
                "EndpointID": "1e6d81acc096a12fc740173f4e107090333c42e8a86680ac5c9886c148d578e7",
                "MacAddress": "02:42:ac:12:00:02",
                "IPv4Address": "172.18.0.2/16",
                "IPv6Address": ""
            },
            "7867f71d2a36265c34c133b70aea487b90ea68fcf30ecb42d6e7e9a376cf8e07": {
                "Name": "redmine_redmine_1",
                "EndpointID": "f5ac7b3325aa9bde12f0c625c4881f9a6fc9957da4965767563ec9a3b76c19c3",
                "MacAddress": "02:42:ac:12:00:03",
                "IPv4Address": "172.18.0.3/16",
                "IPv6Address": ""
            }
        },
    // [...]
]

We can see that the IP address of the redmine_mariadb_1 container is 172.18.0.2.

Using the internal IP 172.18.0.2, you can access the MySQL server.

Any process on the host (even from unprivileged users) can connect to the container without any password, e.g.

$ mysqldump -uroot -h172.18.0.2 --all-databases
// This will show the dump of the entire MariaDB database

How to mitigate this security risk?

Mitigation is quite easy since we only need to set a root password for the MariaDB instance.

My recommended best practice is to avoid duplicate passwords. In order to do this, create a .env file in the directory where docker-compose.yml is located.

MARIADB_ROOT_PASSWORD=aiPaipei6ookaemue4voo0NooC0AeH

Remember to replace the password with a random password or use this shell command to automatically create it:

echo MARIADB_ROOT_PASSWORD=$(pwgen 30) > .env
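
In case pwgen is not installed on your system, you can install it using

sudo apt -y install pwgen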

Now we can use ${MARIADB_ROOT_PASSWORD} in docker-compose.yml wherever the MariaDB root password is required, for example:

version: '3'
services:
  mariadb:
    image: 'mariadb:latest'
    environment:
      - MYSQL_ROOT_PASSWORD=${MARIADB_ROOT_PASSWORD}
      - MYSQL_DATABASE=redmine
    volumes:
      - './mariadb_data:/var/lib/mysql'
  redmine:
    image: 'redmine:latest'
    environment:
      - REDMINE_USERNAME=admin
      - REDMINE_PASSWORD=redmineadmin
      - REDMINE_EMAIL=admin@example.com
      - REDMINE_DB_MYSQL=mariadb
      - REDMINE_DB_USERNAME=root
      - REDMINE_DB_PASSWORD=${MARIADB_ROOT_PASSWORD}
    ports:
      - '3718:3000'
    volumes:
      - './redmine_data/conf:/usr/src/redmine/conf'
      - './redmine_data/files:/usr/src/redmine/files'
      - './redmine_themes:/usr/src/redmine/public/themes'
    depends_on:
      - mariadb

Note that the mariadb docker image will not change the root password if the database directory already exists (mariadb_data in this example).

My recommended best practice for changing the root password is to use mysqldump --all-databases to export the entire database to a SQL file, then backup and delete the data directory, then re-start the container so the new root password will be set. After that, re-import the dump from the SQL file.

Posted by Uli Köhler in Databases, Docker, Linux

Best practice for installing & autostarting OpenVPN client/server configurations on Ubuntu

This post details my systemd-based setup for installing and activating OpenVPN client or server configs on Ubuntu. It might also work for other Linux distributions that are based on systemd.

First, place the OpenVPN config (usually a .ovpn file, but it can also be a .conf file) in /etc/openvpn. Note that you need to change the filename extension to .conf, since .ovpn won’t work. Furthermore, ensure that there are no spaces in the filename.

In this example, our original OpenVPN config will be called techoverflow.ovpn, hence it needs to be copied to /etc/openvpn/techoverflow.conf!
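
For example (adjust the source path to wherever your .ovpn file is stored):

sudo cp techoverflow.ovpn /etc/openvpn/techoverflow.conf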

Now we can enable (i.e. autostart on boot – but not start immediately) the config using

sudo systemctl enable openvpn@techoverflow

For techoverflow.conf you need to systemctl enable openvpn@techoverflow, whereas for a hypothetical foo.conf you would need to systemctl enable openvpn@foo.

Now we can start the VPN config – i.e. run it immediately using

sudo systemctl start openvpn@techoverflow

Now your VPN client or server is running – or is it? We shall check the logs using

journalctl -xfu openvpn@techoverflow

In order to manually restart the VPN client or server use

sudo systemctl restart openvpn@techoverflow

and similarly run this to stop the VPN client or server:

sudo systemctl stop openvpn@techoverflow

In order to show if the instance is running – i.e. show its status, use

sudo systemctl status openvpn@techoverflow

Example output for an OpenVPN client:

● openvpn@techoverflow.service - OpenVPN connection to techoverflow
     Loaded: loaded (/lib/systemd/system/openvpn@.service; enabled; vendor preset: enabled)
     Active: active (running) since Sun 2020-11-29 03:37:52 CET; 953ms ago
       Docs: man:openvpn(8)
             https://community.openvpn.net/openvpn/wiki/Openvpn24ManPage
             https://community.openvpn.net/openvpn/wiki/HOWTO
   Main PID: 4123809 (openvpn)
     Status: "Pre-connection initialization successful"
      Tasks: 1 (limit: 18689)
     Memory: 1.3M
     CGroup: /system.slice/system-openvpn.slice/openvpn@techoverflow.service
             └─4123809 /usr/sbin/openvpn --daemon ovpn-techoverflow --status /run/openvpn/techoverflow.status 10 --cd /etc/openvpn --script-security 2 --config /etc/ope>

Nov 29 03:37:52 localgrid systemd[1]: Starting OpenVPN connection to techoverflow...
Nov 29 03:37:52 localgrid ovpn-techoverflow[4123809]: OpenVPN 2.4.7 x86_64-pc-linux-gnu [SSL (OpenSSL)] [LZO] [LZ4] [EPOLL] [PKCS11] [MH/PKTINFO] [AEAD] built on Sep >
Nov 29 03:37:52 localgrid ovpn-techoverflow[4123809]: library versions: OpenSSL 1.1.1f  31 Mar 2020, LZO 2.10
Nov 29 03:37:52 localgrid systemd[1]: Started OpenVPN connection to techoverflow.
Nov 29 03:37:52 localgrid ovpn-techoverflow[4123809]: TCP/UDP: Preserving recently used remote address: [AF_INET]83.135.163.227:19011
Nov 29 03:37:52 localgrid ovpn-techoverflow[4123809]: UDPv4 link local (bound): [AF_INET][undef]:1194
Nov 29 03:37:52 localgrid ovpn-techoverflow[4123809]: UDPv4 link remote: [AF_INET]83.135.163.22:19011
Nov 29 03:37:53 localgrid ovpn-techoverflow[4123809]: [nas-vpn.haar.techoverflow.net] Peer Connection Initiated with [AF_INET]83.135.163.227:19011


Posted by Uli Köhler in Linux, VPN

Simple self-hosted WebWormhole.io using docker-compose

Note: This config is currently missing a TURN server, so it won’t work if the clients can’t reach each other! I am working on this.

WebWormhole.io is a new service similar to and inspired by magic-wormhole that allows easily sharing files between browsers without the need to install any software. Internally, it uses WebRTC, allowing direct transfer of files between computers even through firewalls.

While there is no official Docker image published on Docker Hub, the WebWormhole GitHub project provides an official Dockerfile. Based on this, I have published ulikoehler/webwormhole which has been built using

git clone https://github.com/saljam/webwormhole.git
cd webwormhole
docker build -t ulikoehler/webwormhole:latest .
docker push ulikoehler/webwormhole:latest

This is the docker-compose.yml that you can use to run WebWormhole behind a reverse proxy:

version: '3'
services:
  webwormhole:
    image: 'ulikoehler/webwormhole:latest'
    entrypoint: ["/bin/ww", "server", "-http=localhost:52618", "-https="]
    network_mode: host

and this is my nginx config:

server {
    server_name  webwormhole.mydomain.com;

    access_log off;
    error_log /var/log/nginx/webwormhole.mydomain.com.error.log;

    location / {
        proxy_pass http://localhost:52618/;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        proxy_redirect default;
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/webwormhole.mydomain.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/webwormhole.mydomain.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    #ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
    if ($host = webwormhole.mydomain.com) {
        return 301 https://$host$request_uri;
    }

    server_name webwormhole.mydomain.com;

    listen 80;
    return 404; # managed by Certbot
}

I store docker-compose.yml in /var/lib/webwormhole.mydomain.com and I used the script from our previous post Create a systemd service for your docker-compose project in 10 seconds in order to create this systemd config file in /etc/systemd/system/webwormhole.mydomain.com.service:

[Unit]
Description=webwormhole.mydomain.com
Requires=docker.service
After=docker.service

[Service]
Restart=always
User=root
Group=docker
WorkingDirectory=/var/lib/webwormhole.mydomain.com
# Shutdown container (if running) when unit is started
ExecStartPre=/usr/local/bin/docker-compose -f docker-compose.yml down
# Start container when unit is started
ExecStart=/usr/local/bin/docker-compose -f docker-compose.yml up
# Stop container when unit is stopped
ExecStop=/usr/local/bin/docker-compose -f docker-compose.yml down

[Install]
WantedBy=multi-user.target

which you can start and enable using

sudo systemctl enable webwormhole.mydomain.com
sudo systemctl start webwormhole.mydomain.com


Posted by Uli Köhler in Docker, Linux

How to make PowerShell output error messages in English

If you want to see PowerShell output (e.g. an error message) in English instead of your local language, prefix your command with

[Threading.Thread]::CurrentThread.CurrentUICulture = 'en-US';

For example, in order to run My-Cmdlet -Arg 1 with output in English instead of your local language, use

[Threading.Thread]::CurrentThread.CurrentUICulture = 'en-US'; My-Cmdlet -Arg 1

[Threading.Thread]::CurrentThread.CurrentUICulture only affects the current command and does not have any effect on other commands. Hence you need to prepend it to each and every command for which you want to see the output in English.
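
If you need this frequently, you can wrap it in a small helper function (Invoke-InEnglish is just a made-up name for this sketch):

function Invoke-InEnglish {
    param([scriptblock]$Command)
    [Threading.Thread]::CurrentThread.CurrentUICulture = 'en-US'
    & $Command
}

Invoke-InEnglish { My-Cmdlet -Arg 1 }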

Possibly you also need to install the English help files in order to see more messages in English. In order to do that, run this command in PowerShell as an administrator:

Update-Help -UICulture en-US


Posted by Uli Köhler in PowerShell, Windows

How to restore MySQL database dump in docker-compose mariadb container

Use this snippet to restore a SQL file in your MariaDB container:

docker-compose exec -T [container name] mysql -uroot < mydump.sql

This assumes you have not set a root password. In order to use a root password, use

docker-compose exec -T mariadb mysql -uroot -pmysecretrootpassword < mydump.sql

-T means don’t use a TTY, in other words, don’t expect interactive input. This avoids the

the input device is not a TTY

error message.
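
If your dump is gzip-compressed, you can pipe it in through zcat instead (same container name assumptions as above):

zcat mydump.sql.gz | docker-compose exec -T mariadb mysql -uroot -pmysecretrootpassword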

Posted by Uli Köhler in Container, Docker

How to use child_process.exec in Koa (async/await)

First, install the child-process-promise library:

npm i --save child-process-promise

Then you can use it like this:

const router = require('koa-router')();
const {exec} = require('child-process-promise');

router.get('/test', async ctx => {
  // exec() resolves to an object with stdout and stderr properties
  const {stdout} = await exec('python myscript.py');
  const output = stdout.toString();
  ctx.body = output;
});
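
Alternatively, recent Node.js versions let you achieve the same without an extra dependency by promisifying the built-in child_process.exec. A sketch:

const util = require('util');
const exec = util.promisify(require('child_process').exec);

router.get('/test', async ctx => {
  // the promisified exec() also resolves to an object with stdout and stderr
  const {stdout} = await exec('python myscript.py');
  ctx.body = stdout.toString();
});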


Posted by Uli Köhler in Javascript