Embedded

How to fix ModuleNotFoundError: No module named ‘picamera’

Problem:

You want to run a Python script using the Raspberry Pi camera but you see an error message like

Traceback (most recent call last):
  File "mycamera.py", line 2, in <module>
    import picamera
ModuleNotFoundError: No module named 'picamera'

Solution:

You need to install the picamera Python module using pip:

sudo pip3 install picamera

or, if you are still using Python 2.x:

sudo pip install picamera

In case you see

sudo: pip3: command not found

install pip3 using

sudo apt install -y python3-pip
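
To verify that the installation worked, you can run a minimal capture sketch like the following (test.jpg is just an example filename; this assumes the camera is connected and enabled in raspi-config):

#!/usr/bin/env python3
# Minimal sketch: capture a single still image to test.jpg
import picamera

camera = picamera.PiCamera()
try:
    camera.capture('test.jpg')
    print("Captured test.jpg")
finally:
    camera.close()  # Always release the camera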

 

Posted by Uli Köhler in Python, Raspberry Pi

How to install picamraw using pip

First try installing it normally:

sudo pip3 install picamraw

In case that fails with this error message (like it did for me):

Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Collecting picamraw
Could not install packages due to an EnvironmentError: 404 Client Error: Not Found for url: https://www.piwheels.org/simple/picamraw/

download and install it manually: copy the link of the most recent .whl file from https://pypi.org/project/picamraw/#files, download it using wget and install it using pip3, e.g.:

wget https://files.pythonhosted.org/packages/1e/47/4efb0d0ab5d40142424e7f3db545e276733a45bd7f7f9095919ef30c96b3/picamraw-1.2.64-py3-none-any.whl
sudo pip3 install picamraw-1.2.64-py3-none-any.whl

 

Posted by Uli Köhler in Python, Raspberry Pi

How to capture Raspi Camera image using OpenCV & Python

First, install OpenCV for Python 3:

sudo apt install python3-opencv

Here’s the code to acquire the image and store it in image.png:

#!/usr/bin/env python3
import cv2
video_capture = cv2.VideoCapture(0)
# Check success
if not video_capture.isOpened():
    raise Exception("Could not open video device")
# Read picture. ret is True on success
ret, frame = video_capture.read()

cv2.imwrite('image.png', frame)
# Close device
video_capture.release()

Run it using

python3 cv-raspicapture.py
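
In case you need a specific resolution, you can request it from the driver before grabbing the frame. Here's a variant of the script above (the 1280x720 values are just an example; whether they are honored depends on your camera and driver):

#!/usr/bin/env python3
import cv2
video_capture = cv2.VideoCapture(0)
if not video_capture.isOpened():
    raise Exception("Could not open video device")
# Request a specific resolution (example values; the driver may ignore them)
video_capture.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
video_capture.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
ret, frame = video_capture.read()
if not ret:
    raise Exception("Could not read frame from camera")
cv2.imwrite('image.png', frame)
video_capture.release()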

 

Posted by Uli Köhler in OpenCV, Python, Raspberry Pi

How to fix Raspi camera ‘mmal_vc_component_create: failed to create component ‘vc.ril.camera’ (1:ENOMEM)’

Problem:

You are trying to access the Raspberry Pi camera using raspistill or raspivid, but you see an error message like this:

mmal: Cannot read camera info, keeping the defaults for OV5647
mmal: mmal_vc_component_create: failed to create component 'vc.ril.camera' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.camera' (1)
mmal: Failed to create camera component
mmal: main: Failed to create camera component
mmal: Camera is not enabled in this build. Try running "sudo raspi-config" and ensure that "camera" has been enabled

Solution:

The most common issue here is that you don’t have the camera interface enabled. Read How to enable Raspberry Pi camera using raspi-config for instructions on how to do that.

If the camera interface has already been enabled (remember that you need to reboot for the changes to take effect), the most likely reason for error messages like this is that the camera is not connected correctly to the Raspberry Pi:

  • Is the CSI cable inserted the right way? The silvery contacts need to face away from the Ethernet connector!
  • Is the CSI cable fully seated?
  • Did you insert the CSI cable into the Display connector? It needs to be inserted into the CSI connector, which is the one closer to the Ethernet connector.
  • Is the other end of the CSI cable correctly attached to the camera board?
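
In addition to checking the cabling, you can ask the VideoCore firmware directly whether it detects the camera by running vcgencmd get_camera. Here is a small Python sketch wrapping that command (vcgencmd ships with Raspbian):

#!/usr/bin/env python3
import subprocess
# Ask the VideoCore firmware whether a camera is supported and detected
output = subprocess.check_output(["vcgencmd", "get_camera"]).decode()
print(output.strip())  # typically prints something like: supported=1 detected=1
if "detected=1" not in output:
    print("Camera not detected - re-check the ribbon cable and raspi-config settings")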
Posted by Uli Köhler in Embedded, Raspberry Pi

How to enable Raspberry Pi camera using raspi-config

You can enable the camera interface by running

sudo raspi-config

Select 5 Interfacing Options and press Return.

Now select P1 Camera and press Return.

Select Yes and press Return.

Now that the camera interface has been enabled, press Return.

Press Tab twice to select Finish and press Return. You are now asked if you want to reboot:

Select Yes and wait for your Raspberry Pi to reboot. Your camera interface will be enabled after the reboot.

In case you are not asked to reboot, your camera interface was already enabled. In this case, select Finish on the main raspi-config screen by pressing Tab twice and pressing Return:

After pressing Return, you will be back at the shell. Reboot (sudo reboot) so that the settings take effect and then try

sudo raspistill -o myimg.jpg

in order to test your camera.

Posted by Uli Köhler in Embedded, Raspberry Pi

How to fix Raspi camera ‘failed to open vchiq instance’

Problem:

You are trying to access the Raspberry Pi camera using raspistill or raspivid. Although you already enabled the camera interface using raspi-config, you see this error message:

* failed to open vchiq instance

Solution:

Your user does not have the permissions to access the camera interface. As a quick fix, you can run raspistill or raspivid as root using sudo, or add your user to the video group to acquire the required permissions:

sudo usermod -a -G video $USER

After doing this, log out and log back in again for the changes to take effect – or close your SSH session and connect again. If in doubt, try to reboot.

Posted by Uli Köhler in Embedded, Raspberry Pi

How I fixed my Arduino ISP ‘Device signature = …’ errors

Problem:

I was trying to program my Arduino using an AVRISP MK II programmer, but every time I tried to program it (using avrdude or the Arduino software), I encountered errors like this:

avrdude: stk500v2_command(): command failed
avrdude: stk500v2_program_enable(): bad AVRISPmkII connection status: MOSI fail, RST fail, SCK fail, Target reverse inserted
avrdude: initialization failed, rc=-1
avrdude: AVR device initialized and ready to accept instructions
avrdude: Device signature = 0xb058e2
avrdude: Expected signature for ATmega2560 is 1E 98 01
avrdude: NOTE: "flash" memory has been specified, an erase cycle will be performed
         To disable this feature, specify the -D option.

Solution:

In my case, the reason for this issue was that there was a shield connected to the SPI bus.

Since the SPI bus on the Arduino is shared with the ICSP header for In-System Programming (ISP), which is the protocol used by external programmers for programming AVR chips, connecting anything else to the SPI bus may cause issues.

Try to disconnect anything from the SPI bus. If the CS (chip select) signal of the extra device on the SPI bus is low, it might interfere with the communication with the AVR chip!

Posted by Uli Köhler in Arduino

How to fix Arduino ‘U8glib.h: No such file or directory’

Problem:

You want to compile an Arduino sketch, but you see an error message like

In file included from sketch/ultralcd.cpp:96:0:
ultralcd_impl_DOGM.h:46:20: error: U8glib.h: No such file or directory
compilation terminated.
exit status 1
U8glib.h: No such file or directory

Solution:

In Arduino, click on the Tools menu, then click on Manage libraries. Now enter u8glib into the search bar and press Enter.

Scroll down to the U8glib library, hover over it with your mouse and click Install on the bottom right. Now the error should be gone.

Posted by Uli Köhler in Arduino

How to fix SDCC ‘at 1: warning 119: don’t know what to do with file ‘main.o’. file extension unsupported’

If you use SDCC to compile your C code for your microcontroller and you encounter an error message like this:

at 1: warning 119: don't know what to do with file 'main.o'. file extension unsupported

you have to configure your build system to use the .rel suffix for object files instead of the standard .o. SDCC expects built object files to have the .rel extension! See How to change CMake object file suffix from default “.o” for details on how to do that in CMake.

Posted by Uli Köhler in Build systems, Embedded

A working SDCC STM8 CMake configuration

If you have been looking desperately for a working CMake example for the SDCC compiler for STM8 microcontrollers here’s my take on it:

cmake_minimum_required(VERSION 3.2)

set(CMAKE_C_OUTPUT_EXTENSION ".rel")
set(CMAKE_C_COMPILER sdcc)
set(CMAKE_SYSTEM_NAME Generic) # No linux target etc

# Prevent default configuration
set(CMAKE_C_FLAGS_INIT "")
set(CMAKE_EXE_LINKER_FLAGS_INIT "")

project(STM8Blink C)
SET(CMAKE_C_FLAGS "-mstm8 --std-c99")
add_executable(main.ihx main.c)

# Flash targets
add_custom_target(flash ALL COMMAND stm8flash -c stlink -p stm8s105c6 -w main.ihx)

This will build main.ihx from main.c. main.ihx is an Intel Hex file which can be directly flashed using stm8flash.

The last line sets up make flash; you might need to use the correct microcontroller (stm8s105c6 in this example, run stm8flash -l to show supported devices) and the correct flash adapter (stlink, stlinkv2, stlinkv21, stlinkv3 or espstlink).

The setup example shown here is for the STM8S eval board.

I suppose it can be easily modified for other microcontrollers, but I haven’t tried that so far.

Posted by Uli Köhler in CMake, Embedded, Hardware

How to fix raspivid ‘* failed to open vchiq instance’

Problem:

You want to capture a video using raspivid, e.g. using a command like

raspivid -o vid.h264

but you only see this error message:

* failed to open vchiq instance

with no video being captured.

Quick solution:

Run raspivid using sudo:

sudo raspivid -o vid.h264

Now either the video will be captured or you will see another error message if there is another issue with your camera setup.

Better solution:

Add your user to the video group so you don’t have to run raspivid as root:

sudo usermod -a -G video $USER

You only need to do this once. After that, log out and log back in (or close your SSH session and connect again). If in doubt, restart the Raspberry Pi. See What does 'sudo usermod -a -G group $USER' do on Linux? for more details.

Now you will be able to capture video as normal user:

raspivid -o vid.h264

 

Posted by Uli Köhler in Embedded, Linux

How to enable SSH on Raspbian without a screen

You can open the boot partition on the SD card (the FAT32 partition) and create an empty file named ssh in the root directory of that partition. Ensure that the file is named ssh and not ssh.txt!

If you are in the correct working directory in the command line, use

touch ssh

On recent Ubuntu versions, the following command will switch to the correct directory and create the file (but you need to mount the boot partition manually first, e.g. using your file explorer):

cd /media/$USER/boot && touch ssh

Don't forget to unmount the boot drive before removing the SD card. Once you restart the Raspberry Pi with the modified SD card, SSH will be enabled without you having to attach a keyboard or screen to the Pi.

This approach was tested with the 2018-11-13 version of Raspbian and works with Raspberry Pi 1, Raspberry Pi 2 and Raspberry Pi 3.

Credits to Yahor for the original solution on StackOverflow!

Posted by Uli Köhler in Embedded, Linux

What is the default SSH username/password for OctoPrint/OctoPi?

OctoPrint/OctoPi uses the standard Raspbian credentials, that is:

Username: pi
Password: raspberry

Tested with OctoPrint 0.16.0

Posted by Uli Köhler in Embedded, Linux

How to configure OctoPrint/OctoPi with Ethernet using a static IP

In order to configure OctoPrint/OctoPi to use the Raspberry Pi Ethernet interface with a static IP, first open the rootfs partition on the SD card. After that, open etc/network/interfaces in your preferred text editor (you might need to open it as root, e.g. sudo nano etc/network/interfaces – ensure that you don’t edit your local computer’s /etc/network/interfaces but the one on the SD card).

Now copy the following text to the end of etc/network/interfaces:

auto eth0
allow-hotplug eth0
iface eth0 inet static
    address 192.168.1.234
    netmask 255.255.255.0
    gateway 192.168.1.1
    network 192.168.1.0
    broadcast 192.168.1.255
    dns-nameservers 8.8.8.8 8.8.4.4

You might need to adjust the IP addresses so they match your router.
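
If you are unsure which network and broadcast addresses belong to your subnet, Python's ipaddress module can compute them for you. A quick sketch using the example address from above:

#!/usr/bin/env python3
import ipaddress
# The static address from above with a /24 netmask (= 255.255.255.0)
iface = ipaddress.ip_interface('192.168.1.234/24')
print("network:  ", iface.network.network_address)   # 192.168.1.0
print("broadcast:", iface.network.broadcast_address) # 192.168.1.255
print("netmask:  ", iface.network.netmask)           # 255.255.255.0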

Save the file and insert the SD card into your Raspberry Pi. You should be able to ping it – in our example, ping 192.168.1.234.

Tested with OctoPrint 0.16.0

Original source: OctoPrint forum

Posted by Uli Köhler in Embedded, Linux

How to fix MicroPython I2C no data

Problem:

You’ve configured MicroPython’s I2C similar to this (in my case on the ESP8266 but this applies to many MCUs):

i2c = machine.I2C(-1, machine.Pin(5), machine.Pin(4))

but you can’t find any devices on the bus:

>>> i2c.scan()
[]

Solution:

Likely you forgot to configure the pins as pullups. I2C needs pullups to work, and many MCUs (like the ESP8266) provide support for integrated (weak) pull-ups.

p4 = machine.Pin(4, mode=machine.Pin.OUT, pull=machine.Pin.PULL_UP)
p5 = machine.Pin(5, mode=machine.Pin.OUT, pull=machine.Pin.PULL_UP)
i2c = machine.I2C(-1, p5, p4)

i2c.scan() # [47]

You can also verify this with a multimeter or an oscilloscope: while no communication is going on, the voltage on the I2C bus should be equal to the supply voltage of your MCU (usually 3.3V or 5V – 0V indicates a missing pullup or some other error).
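
Once i2c.scan() reports your device (47 in the example above, i.e. 0x2F), you can talk to it using the i2c object from the snippet above. The register layout is entirely device-specific, so treat this as a sketch and consult your device's datasheet:

# Read two raw bytes from the device at address 0x2F (47 decimal)
data = i2c.readfrom(0x2F, 2)
print(data)
# Many devices use register-based access instead, e.g. read 2 bytes from register 0x00:
data = i2c.readfrom_mem(0x2F, 0x00, 2)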

Posted by Uli Köhler in Electronics, MicroPython, Python

How to fix MicroPython ‘ValueError: invalid I2C peripheral’

If you see the error message

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: invalid I2C peripheral

you are likely running code like this:

import machine

i2c = machine.I2C(machine.Pin(5), machine.Pin(4))

Solution

The MicroPython API has changed (source: forum). You need to use this syntax instead:

import machine

i2c = machine.I2C(-1, machine.Pin(5), machine.Pin(4))

The first argument is the I2C ID that selects a specific peripheral; -1 selects a software I2C implementation, which can work on most pins. See the MicroPython I2C class documentation for more details.
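
Note that recent MicroPython versions also provide a dedicated machine.SoftI2C class that takes keyword arguments; as far as I can tell it is the preferred way to create a software I2C bus on newer firmware (treat this as a sketch and check the documentation of your MicroPython version):

import machine

# Software I2C on newer MicroPython firmware (keyword arguments)
i2c = machine.SoftI2C(scl=machine.Pin(5), sda=machine.Pin(4))
print(i2c.scan())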

Posted by Uli Köhler in Electronics, MicroPython, Python

How to calculate STM32 bxCAN bit timings using Python

I recently needed to configure some STM32 microcontrollers with nonstandard clock speeds to use the correct CAN bit timings.

While this script has been written for the STM32F413 series, it should be applicable to pretty much all STM32 MCUs with bxCAN – however, you need to check the details, especially from which clock the CAN is derived.

You need to know:

  • The frequency of the clock from which the CAN clock is derived (PCLK1 in my example: 48 MHz)
  • Your desired Baudrate: 1 MBaud in my example

Using an iterative trial-and-error approach, we will determine the appropriate values for BRP, TS1 and TS2 (these are specific bit fields in the CAN bit timing register):

#!/usr/bin/env python3
# Configuration. Change here
master_clock = 48e6 # PCLK1, or whatever CLK CAN is derived from
desired_baudrate = 1e6 # 1e6 = 1 MBaud, 500e3 = 500 kBaud
# Register values
brp = 5 # Baud rate prescaler
ts1 = 4 # (Length of time segment 1) - 1
ts2 = 1 # (Length of time segment 2) - 1

# Calculation. You usually don't need to change this.
bs1_segments = ts1 + 1
bs2_segments = ts2 + 1
total_segments = 1 + bs1_segments + bs2_segments
bittime = 1. / desired_baudrate
master_time = 1. / master_clock
tq = (brp + 1) * master_time
total_time = total_segments * tq
effective_rate = 1. / total_time
sample_point = 1 - (bs2_segments / total_segments)

# Print results
print(f"Effective Baudrate: {effective_rate:.2f} Baud = {100. * effective_rate / desired_baudrate:.2f} % of desired baudrate")
print(f"Sample point: {100. * sample_point:.2f}  %")

When you run this script, you will see an output like this:

Effective Baudrate: 1000000.00 Baud = 100.00 % of desired baudrate
Sample point: 75.00  %

Of course you need to fill in the master clock frequency and the desired baudrate at the top. You can start with the pre-configured values for BRP, TS1 and TS2.

In the end, you should achieve 100% of the desired baudrate and a sample point close to 87.5% (80% to 90% is usually OK; 75% might not work with long cables; there might also be a specific value dictated by your comms standard, e.g. CANopen or J1939).

Change the values for BRP, TS1 and TS2, re-running the script and observing the output each time. I recommend first changing BRP until you get close and then fine-tuning using this algorithm:

  • If your actual baudrate is too slow (i.e. < 100%), decrease BRP (larger step) and/or decrease TS1 + TS2
  • If your actual baudrate is too fast (i.e. > 100%), increase BRP (larger step) and/or increase TS1 + TS2
  • If your sample point percentage is too small, increase the ratio of TS1 to TS2
  • If your sample point percentage is too large, decrease the ratio of TS1 to TS2

Note that there might be master clock speeds and baudrates for which there is no ideal setting. Be sure to check whether a slightly different baudrate is still OK in your application (usually it’s not). If it is not, you need to find a way to use a different PCLK1 speed.
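
If you'd rather not iterate by hand, you can also brute-force all register combinations using the same formulas as above and pick the one that matches the baudrate exactly with the sample point closest to your target. This is only a sketch; the field ranges used here (BRP up to 1024 steps, TS1 up to 16, TS2 up to 8) match the bxCAN bit timing register, but double-check them against the reference manual of your particular MCU:

#!/usr/bin/env python3
# Brute-force search for bxCAN bit timing register values
master_clock = 48e6          # PCLK1, or whatever clock CAN is derived from
desired_baudrate = 1e6       # 1 MBaud
target_sample_point = 0.875  # 87.5 %

best = None
for brp in range(1024):       # BRP field: 0..1023 (prescaler = BRP + 1)
    for ts1 in range(16):     # TS1 field: 0..15 (segment length = TS1 + 1)
        for ts2 in range(8):  # TS2 field: 0..7 (segment length = TS2 + 1)
            total_segments = 1 + (ts1 + 1) + (ts2 + 1)
            tq = (brp + 1) / master_clock
            effective_rate = 1. / (total_segments * tq)
            if abs(effective_rate - desired_baudrate) > 1e-3:
                continue  # only accept (near-)exact baudrate matches
            sample_point = 1 - ((ts2 + 1) / total_segments)
            error = abs(sample_point - target_sample_point)
            if best is None or error < best[0]:
                best = (error, brp, ts1, ts2, sample_point)

if best is None:
    print("No exact match found, consider a different PCLK1 or baudrate")
else:
    _, brp, ts1, ts2, sample_point = best
    print(f"BRP={brp} TS1={ts1} TS2={ts2} -> sample point {100. * sample_point:.2f} %")

For the example values (48 MHz, 1 MBaud) this finds BRP=2, TS1=12, TS2=1 with a sample point of exactly 87.5%.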

Posted by Uli Köhler in Electronics, Embedded

How to fix Arduino SigFox.h: No such file or directory

Problem:

You are trying to run the Arduino SigFox first configuration tutorial but when trying to upload the sketch, you get this error message:

SigFox.h: No such file or directory

Solution:

While you have installed the board libraries and compiler for the SigFox boards, you have not installed the SigFox library itself.

Press Ctrl+Shift+I to open the library manager (you can also open it from the Tools menu).

Look for Arduino SigFox for MKRFox1200 by Arduino and click the Install button.

Since the SigFox sketches also require the Arduino low power libraries (i.e. you would get the error message ArduinoLowPower.h: No such file or directory), also click the Install button for the Arduino Low Power by Arduino library.

After that, enter RTCZero in the search bar at the top of the library manager and click the Install button for the RTCZero by Arduino library.

After that, retry uploading the sketch.

Posted by Uli Köhler in Arduino, Electronics, Embedded

Identifying the frame length for an unknown serial protocol

Let’s suppose you’re reverse-engineering a serial protocol. You already know the correct configuration for the serial port (baudrate etc.) and you assume the protocol is built from frames of equal length.

For simplicity we will also assume that the device sends the data without the necessity to request data from it first. If this is not the case, you can use a different and much simpler approach (just send the request character and see how many bytes the device sends in response).

The next step is to determine the frame length of the protocol. This post not only details two variants of one of the algorithms you can use in order to do this, but also provides a ready-to-use Python script you can use for your own protocols.

Approach 1: Autocorrelation with argmax

We will use a simple mathematical approach in order to find out what the most likely frame length will be. This is based on the assumption that frames will have a high degree of self similarity, i.e. many of the bytes in a single frame will match the corresponding bytes in the next frame.

It is not required that all bytes are the same in every frame, but if you have entirely different bytes in every frame, the approach will likely not deduce the correct frame length.

This approach is based on autocorrelation. Although it sounds complicated, it means nothing more than comparing a sequence with a delayed/shifted version of itself.

This means we will perform the following steps:

  • Read a set of characters from the serial device
  • Correlate the set of characters with shifted versions of itself
  • The frame length is the shift at which the maximum similarity occurs (found using np.argmax)

As similarity score, we'll use 1 if the bytes are equal and 0 otherwise. For specific protocols, it might be a more viable approach to introduce individual bit matching, but this will also introduce noise into the process.

For most simple protocols I’ve seen, this approach works very well for both ASCII and binary.

Plotting the correlation scores shows clear peaks at shifts that are integer multiples of the frame length.
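
To illustrate the idea before diving into the full script below, here's a tiny self-contained sketch with made-up data: four 8-byte frames that share most of their bytes. We compute the equality-based autocorrelation and take the argmax:

#!/usr/bin/env python3
import numpy as np

# Synthetic example: four 8-byte frames sharing most bytes (made-up data)
data = bytes.fromhex("AA010203040506FF"
                     "AA010203045006FF"
                     "AA010203040526FF"
                     "AA010207040506FF")

def autocorrelation_scores(chars):
    # Normalized equality score for every shift; shift 0 is omitted intentionally
    scores = np.zeros(len(chars) // 2)
    for shift in range(1, scores.size):
        matches = sum(1 for a, b in zip(chars, chars[shift:]) if a == b)
        scores[shift] = matches / (len(chars) - shift)
    return scores

scores = autocorrelation_scores(data)
print("Most likely frame length:", np.argmax(scores))  # prints 8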

Approach 2: Multiple-shift aware Autocorrelation

This modified algorithm works well for protocols where there is insignificant similarity between any two frames or where there is a lot of noise. For such protocols, the maximum-score approach does not yield the correct result.

However, we can use the property of constant-frame-length protocols that shifting by any integer multiple of the (unknown) frame length yields a high matching score. Instead of just taking the single maximum peak, we multiply the scores for all integer multiples of each candidate length.

While this approach doesn’t sound too complicated compared to the first one, it has more caveats and pitfalls, e.g. that there are no integer multiples within the data array for the second half of the correlation result array, and the second quarter is not very significant as there are not many multiples to multiply.

The script (see below) works around these issues by only computing the first quarter of the possible result space. Use the -n parameter in order to increase the number of characters read by the script.

After computing the multiple-shift aware correlation, we can use argmax just like in the first approach to find the best correlation. Sometimes this identifies a multiple of the frame length due to noise. In that case, you can look at the plot (use -p) and determine the correct frame length manually.

As you can see from the result, the “noise” (in between the frame-length shift matches, caused by random matches between characters) is mostly gone.

In many real use cases, this algorithm will produce a more distinct signal in the plot, but the automatically calculated frame size will not be correct as several effects tend to increase the lobe height for multiples of the frame length. Therefore, it is advisable to have a look at the plot (-p in the script) before taking the result for granted.

Automating the algorithm

Here's the Python 3 script for this article. It works well without modification, but for some protocols you might need to adjust it to fit your needs:

#!/usr/bin/env python3
"""
ProtocolFrameLength.py

Determine the frame length of an unknown serial protocol
with constant-length frames containing similar bytes in every frame.

For an explanation, see
Identifying the frame length for an unknown serial protocol
Example usage:
$ python3 ProtocolFrameLength.py -b 115200 /dev/ttyACM0
"""
import serial
import numpy as np
import math
from functools import reduce
import operator

__author__ = "Uli Köhler"
__license__ = "Apache License v2.0"
__version__ = "1.0"
__email__ = "[email protected]"

def match_score(c1, c2):
    """
    Correlation score for two characters, c1 and c2.
    Uses simple binary score
    """
    if c1 is None or c2 is None: # Fill chars
        return 0
    return 1 if c1 == c2 else 0

def string_match_score(s1, s2):
    assert len(s1) == len(s2)
    ln = len(s1)
    return sum(match_score(s1[i], s2[i]) for i in range(ln))

def compute_correlation_scores(chars, nomit=-1):
    # Omit the last nomit characters as single-char matches would be over-valued
    if nomit == -1: # Auto-value
        nomit = len(chars) // 10
    corr = np.zeros(len(chars) - nomit)
    # Note: autocorrelation for zero shift is always 1, left out intentionally!
    for i in range(1, corr.size):
        # build prefix by Nones
        prefix = [None] * i
        s2 = prefix + list(chars[:-i])
        # Normalize by max score attainable due to Nones and the sequence length (there are i Nones)
        corr[i] = string_match_score(chars, s2) / (len(chars) - i)
    return corr

def print_most_likely_frame_length(correlations):
    # Find the largest correlation coefficient. This model does not require a threshold
    idx = np.argmax(correlations)
    print("Frame length is likely {} bytes".format(idx))

def plot_correlations(correlations):
    from matplotlib import pyplot as plt
    plt.style.use("ggplot")
    plt.title("Correlation scores for a protocol with 16-byte frames")
    plt.gcf().set_size_inches(20, 10)
    plt.plot(correlations)
    plt.title("Correlation scores")
    plt.ylabel("Normalized correlation score")
    plt.xlabel("Shift")
    plt.show()

def multishift_adjust(correlations):
    """
    Multi-shift aware algorithm
    """
    corr_multishift = np.zeros(correlations.size // 4)
    for i in range(1, corr_multishift.size):
        # Iterate multiples of i (including i itself)
        corr_multishift[i] = reduce(operator.mul, (correlations[j] for j in range(i, correlations.size, i)), 1)
    return corr_multishift

if __name__ == "__main__":
    import argparse
    parser = argparse.ArgumentParser()
    parser.add_argument('port', help='The serial port to use')
    parser.add_argument('-b', '--baudrate', type=int, default=9600, help='The baudrate to use')
    parser.add_argument('-n', '--num-bytes', type=int, default=200, help='The number of characters to read')
    parser.add_argument('-m', '--multishift', action="store_true", help='Use the multi-shift aware autocorrelation algorithm')
    parser.add_argument('-p', '--plot', action="store_true", help='Plot the resulting correlation matrix')
    args = parser.parse_args()

    ser = serial.Serial(args.port, args.baudrate)
    ser.reset_input_buffer()
    chars = ser.read(args.num_bytes)

    corr = compute_correlation_scores(chars)
    if args.multishift:
        corr = multishift_adjust(corr)
    print_most_likely_frame_length(corr)
    if args.plot:
        plot_correlations(corr)

 

Usage example:

python3 ProtocolFrameLength.py -b 115200 /dev/ttyACM0

You can use -m to use the multiple-shift aware approach.

Use -p to plot the results as shown above. If one of the approaches does not work, it is advisable to plot the result in order to see if there is something visible in the plot which has not been detected by the automatic evaluation.

As the results depend on the actual data read, it is advisable to perform multiple runs and see if the results vary.

Posted by Uli Köhler in Electronics, Embedded