NezaRobot
Saturday, August 5, 2017
TCP/IP Remote Controller for Light Switch and Battery powered by Solar Panels
12V battery, board (including Raspberry Pi, relay module, and converter), and solar charge controller (from right to left)
Intro:
Last year, we bought a bunch of solar cells for me and my younger brother, Jeffery, to install in our backyard. We then got a set of LED lights to light up our work area, which wasn't properly lit; the lights were powered by a 12V battery that was in turn charged by the installed solar panel. However, whenever the light was needed, I had to physically unplug the wires connecting the solar panel to the transformer and attach the light's wires to the battery, every single time.
After a while, switching it over every time we needed the light got tiring. Since I had a spare Raspberry Pi 2 (a popular single-board computer) lying around, I decided to write a program so that I could control both the light switch and the charging of the battery from the solar cells at the same time. With it, anyone with the program/app would be able to control the lights and the battery charging without laboriously detaching and reattaching the cables. And so, my new project was born!
Materials:
- Raspberry Pi (with Wifi)
- Installed Solar Panels
- Solar Charge Controller
- 12V Battery
- LED Lights
- Many Wires
- Relay Module
- Soldering supplies (iron, solder, etc.)
- Solderable Breadboard
- Solderable Signal Converter
(Keep in mind, you can customize/change/swap out these items for your own projects)
*This guide assumes that a functioning solar panel has already been installed.*
Here is the solar panel we installed in our backyard about a year ago: https://solarpanelsang.blogspot.ca/
Installing the Board
Technically, you don't even need a circuit board, but it makes life a lot easier, since the Raspberry Pi, the relay module, and the converter are all connected together on a single board.
Once everything is attached firmly to the board, the next step is to connect the Raspberry Pi's GPIO pins to the converter and to the relay module. For this I needed the Raspberry Pi's GPIO pinout (look up the pinout for your specific board).
Basically, I connected the Pi to the signal converter through the 3.3V and 5V DC pins and ground, and ran some GPIO pins from the Pi to the relay module. This is essentially how the Pi communicates with the switch: a GPIO pin (GPIO 17 in my case) outputs either a HIGH or a LOW voltage, and that signal gets sent to the relay module, with the converter simply level-shifting the signal for the module. Depending on the received voltage, the relay then switches on or off (connecting either the Normally Open [NO] pin or the Normally Closed [NC] pin).
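To make this concrete, here is a minimal C sketch (an illustration, not the project's actual code) of driving the relay with the wiringPi library, using GPIO 17 as above; whether HIGH or LOW energizes the relay depends on your module:

/* relay.c: toggle the relay from the command line, e.g. ./relay 1 */
#include <stdio.h>
#include <stdlib.h>
#include <wiringPi.h>

#define RELAY_PIN 17 /* BCM GPIO 17, the pin used in this project */

int main(int argc, char *argv[])
{
    if (wiringPiSetupGpio() == -1) { /* use BCM pin numbering */
        fprintf(stderr, "wiringPi setup failed\n");
        return 1;
    }
    pinMode(RELAY_PIN, OUTPUT);

    /* Drive the pin HIGH or LOW; the relay then connects its NO or NC contact. */
    digitalWrite(RELAY_PIN, (argc > 1 && atoi(argv[1]) == 1) ? HIGH : LOW);
    return 0;
}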
Programming
The magic that allows the remotely controlled switch to work is client-server TCP/IP communication. But to build the app we need a platform: Android Studio (https://developer.android.com/studio/index.html). For my project, I used Java (on Android) as well as C to control the Raspberry Pi's GPIO pins, which in turn turn the switch on or off.
The Client:
Here is the source code for the client side of the TCP/IP communication over Wi-Fi (written in Java 1.8 with Android Studio 2.3.3), found on my GitHub: https://github.com/dave2000sang/remote_switch_solar_panel/blob/3091131fa40eac9628c33a6d1f26d411dc89d07b/server
Using Java, I created the Client-side TCP/IP on Android Studio. In essence, the program was quite simple, and only took a few hours to understand.
First, I created a ToggleButton with ID toggleBtn1, which is a push button in the app; its setOnCheckedChangeListener callback (inherited from the abstract class CompoundButton) supplies a boolean isChecked whenever the button is toggled.
The actual client requires a Socket (mSocket in my case) to communicate with the server. To send output whenever the button is clicked, based on the boolean value of isChecked, I used a PrintWriter, which writes characters to the socket's output stream obtained with getOutputStream(). I also added two optional TextView outputs to display the current status of the light and the battery.
Whenever the button is toggled, the client program writes a string, chosen by the boolean value of isChecked, into the Socket connecting it to the server. On the server side, the program checks whether this string equals "LIGHTON" or "LIGHTOFF"; that is how the server knows the button was clicked, allowing it to turn the switch on or off accordingly.
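The real server source is in the repository linked above. As a rough illustration only, here is a minimal POSIX C sketch of such a server, with an assumed port number and a hypothetical set_relay() helper standing in for the GPIO code:

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netinet/in.h>
#include <sys/socket.h>

#define PORT 8080 /* assumed port; the real project may use a different one */

/* Hypothetical stand-in for the GPIO code (see the wiringPi sketch above). */
static void set_relay(int on)
{
    printf("relay %s\n", on ? "ON" : "OFF");
}

int main(void)
{
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr;

    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(PORT);

    if (srv < 0 || bind(srv, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("socket/bind");
        return 1;
    }
    listen(srv, 1);

    for (;;) {
        int cli = accept(srv, NULL, NULL);
        char buf[64];
        ssize_t n;

        if (cli < 0)
            continue;
        memset(buf, 0, sizeof(buf));
        n = read(cli, buf, sizeof(buf) - 1);
        if (n > 0) {
            if (strstr(buf, "LIGHTOFF"))      /* light off */
                set_relay(0);
            else if (strstr(buf, "LIGHTON")) /* light on */
                set_relay(1);
        }
        close(cli);
    }
    return 0;
}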
Thursday, June 9, 2016
Auto login after reboot for Beaglebone black (BBB)
By default, the BBB board presents a root login prompt on the serial console. My board has a USB Wi-Fi adapter attached, and I don't want to have to log in as root at the console after every reboot.
1) Auto Login
Edit file:
/lib/systemd/system/serial-getty@.service
[Service]
Environment=TERM=vt100
ExecStart=-/sbin/agetty -a root -s %I 115200
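(The -a root option makes agetty log root in automatically; %I is expanded by systemd to the serial device name, e.g. ttyO0 on the BBB.)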
2) Start Service
Create file:
/lib/systemd/system/robot.service
[Unit]
Description=Robot
After=syslog.target network.target network-online.target
[Service]
Type=simple
ExecStart=/usr/bin/run.sh
[Install]
WantedBy=multi-user.target
3) Script file
Create file:
/usr/bin/run.sh
#!/bin/sh
cd /home/root
./run_service.sh
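Note that the script must be executable (chmod +x /usr/bin/run.sh), and on most systemd images the unit still has to be enabled once with systemctl enable robot.service before it will start at boot.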
4) Service file
/home/root/run_service.sh
#!/bin/sh
cd /sang/driver
insmod rfkill.ko
insmod cfg80211.ko
insmod mac80211.ko
insmod lib80211.ko
insmod rt2x00lib.ko
insmod rt2800lib.ko
insmod rt2x00usb.ko
insmod rt2800usb.ko
#can bus
echo "can bus set up ..."
echo BB-DCAN1 > /sys/devices/bone_capemgr.9/slots
cd /sang
./canconfig can0 bitrate 300000
ifconfig can0 up
echo "PWM setup ..."
cd /sang/driver
insmod pwm_test.ko
echo am33xx_pwm > /sys/devices/bone_capemgr.9/slots
echo bone_pwm_P9_31 > /sys/devices/bone_capemgr.9/slots
echo bone_pwm_P9_29 > /sys/devices/bone_capemgr.9/slots
echo bone_pwm_P8_36 > /sys/devices/bone_capemgr.9/slots
echo bone_pwm_P8_34 > /sys/devices/bone_capemgr.9/slots
echo bone_pwm_P8_45 > /sys/devices/bone_capemgr.9/slots
echo bone_pwm_P8_46 > /sys/devices/bone_capemgr.9/slots
5) Run the Wi-Fi script after reboot (via /etc/profile)
Because root now logs in automatically, commands added to /etc/profile run at every boot. Edit /etc/profile and add:
cd /home/root
./run_wifi.sh
6) Wi-Fi start script
/home/root/run_wifi.sh
#!/bin/sh
killall wpa_supplicant
ip link set wlan0 up
iw dev wlan0 set power_save off
wpa_supplicant -B -D wext -i wlan0 -c /etc/wpa_supplicant.conf
iw dev wlan0 link
ifconfig wlan0 192.168.2.77 up
mount -o nolock 192.168.2.99:/home/bsang /nfs &
mount -o nolock 192.168.2.96:/home/bsang /nfs2 &
Monday, April 25, 2016
Auto Drive
Robot Moves forward
Detect the left and right lanes, and calculate the angle and distance of each; comparing them decides whether the robot needs to turn or can keep moving forward.
Robot Turns Left
Detect the top lane on the right side and its distance. If the distance is near enough, turn left: since the top lane is on the right side, the robot needs to turn left (in the mirrored case, it turns right).
Robot Moves forward after turning
Keep turning left for several seconds (depending on speed), then check the right lane again; if the right lane's angle is bigger than 70 degrees, resume moving forward.
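A rough C sketch of this decision logic (the Lane fields, the near-distance threshold, and the function itself are illustrative placeholders, not the robot's actual code):

#include <stdbool.h>

typedef struct {
    bool   found;     /* lane detected in this frame? */
    double angle_deg; /* lane angle in the image */
    double distance;  /* distance from robot to the lane */
} Lane;

typedef enum { FORWARD, TURN_LEFT, TURN_RIGHT } Action;

Action decide(Lane left, Lane right, double near_dist)
{
    /* Both lanes visible: compare their angles and distances and go straight. */
    if (left.found && right.found)
        return FORWARD;

    /* Top lane seen on the right side and close enough: turn left.
     * The mirrored case (lane on the left) turns right. */
    if (right.found && right.distance < near_dist)
        return TURN_LEFT;
    if (left.found && left.distance < near_dist)
        return TURN_RIGHT;

    /* After a turn, resume forward once the right lane angle exceeds 70 degrees. */
    if (right.found && right.angle_deg > 70.0)
        return FORWARD;

    return FORWARD;
}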
Sunday, March 20, 2016
Android + SurfaceTexture + Camera2 + OpenCV + NDK
With Android Studio 1.5.1, OpenCV's own camera class is too slow for video processing. I searched the web a lot but didn't find a good, complete solution, so I decided to program it myself. With Android's camera2 class it is fast (about 30 fps) and has more camera controls, like auto focus, resolution, and quality.
Full source code can be found:
https://github.com/webjb/myrobot
The code is based on the Android Studio sample Camera2Raw.
The whole idea: use the camera2 classes to capture video, an ImageReader to obtain each captured frame, send each frame to NDK C++ for processing (recognition and line drawing with OpenCV functions), then draw the processed frame to the surface using a TextureView.
1) Class Camera2: set up and start capture
private void createCameraPreviewSession() {
    CaptureRequest.Builder mPreviewRequestBuilder;
    try {
        SurfaceTexture texture = mSurfaceView.getSurfaceTexture();

        // We configure the size of default buffer to be the size of camera preview we want.
        texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

        // This is the output Surface we need to start preview.
        mSurface = new Surface(texture);

        // We set up a CaptureRequest.Builder with the output Surface.
        mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        //mPreviewRequestBuilder.addTarget(mSurface);
        mPreviewRequestBuilder.addTarget(mImageReader.get().getSurface());

        BlockingSessionCallback sessionCallback = new BlockingSessionCallback();

        List<Surface> outputSurfaces = new ArrayList<>();
        outputSurfaces.add(mImageReader.get().getSurface());
        //outputSurfaces.add(mSurface);

        mCameraDevice.createCaptureSession(outputSurfaces, sessionCallback, mBackgroundHandler);

        try {
            Log.d(TAG, "waiting on session.");
            mCaptureSession = sessionCallback.waitAndGetSession(SESSION_WAIT_TIMEOUT_MS);
            try {
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                        CaptureRequest.CONTROL_AF_MODE_AUTO);
                // mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                //         CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                // Comment out the above and uncomment this to disable continuous autofocus and
                // instead set it to a fixed value of 20 diopters. This should make the picture
                // nice and blurry for denoised edge detection.
                // mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                //         CaptureRequest.CONTROL_AF_MODE_OFF);
                // mPreviewRequestBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, 20.0f);

                // Finally, we start displaying the camera preview.
                Log.d(TAG, "setting repeating request");
                mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(),
                        mCaptureCallback, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        } catch (TimeoutRuntimeException e) {
            showToast("Failed to configure capture session.");
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
2) Class ImageReader: obtain each captured frame
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image;
        String result;
        try {
            image = reader.acquireLatestImage();
            if (image == null) {
                return;
            }
            int fmt = reader.getImageFormat();
            Log.d(TAG, "bob image fmt:" + fmt);
            if (mTakePicture == 1) {
                result = JNIUtils.detectLane(image, mSurface, mFileName, mTakePicture);
                mTakePicture = 0;
            } else {
                result = JNIUtils.detectLane(image, mSurface, mFileName, mTakePicture);
            }
            Log.d(TAG, "bob Lane Detect result: " + result);
            comm.send_lane(result);
        } catch (IllegalStateException e) {
            Log.e(TAG, "Too many images queued for saving, dropping image for request: ");
            return;
        }
        image.close();
    }
};
Whenever a frame is captured, this callback is invoked, and the latest frame is fetched with:
image = reader.acquireLatestImage();
Then JNI is called to send the image to NDK C++ for image processing:
result = JNIUtils.detectLane(image, mSurface, mFileName, mTakePicture);
After the image has been processed, close it to release its buffer.
3) JNI interface
public static String detectLane(Image src, Surface dst, String path, int savefile) {
    if (src.getFormat() != ImageFormat.YUV_420_888) {
        throw new IllegalArgumentException("src must have format YUV_420_888.");
    }
    Plane[] planes = src.getPlanes();
    // Spec guarantees that planes[0] is luma and has pixel stride of 1.
    // It also guarantees that planes[1] and planes[2] have the same row and
    // pixel stride.
    if (planes[1].getPixelStride() != 1 && planes[1].getPixelStride() != 2) {
        throw new IllegalArgumentException(
                "src chroma plane must have a pixel stride of 1 or 2: got "
                        + planes[1].getPixelStride());
    }
    return detectLane(src.getWidth(), src.getHeight(), planes[0].getBuffer(),
            dst, path, savefile);
}
4) OpenCV for image processing
JNIEXPORT jstring JNICALL Java_com_neza_myrobot_JNIUtils_detectLane(
        JNIEnv *env, jobject obj, jint srcWidth, jint srcHeight, jobject srcBuffer,
        jobject dstSurface, jstring path, jint saveFile)
{
    char outStr[2000];
    const char *str = env->GetStringUTFChars(path, NULL);

    LOGE("bob path:%s saveFile=%d", str, saveFile);

    uint8_t *srcLumaPtr = reinterpret_cast<uint8_t *>(env->GetDirectBufferAddress(srcBuffer));
    if (srcLumaPtr == nullptr) {
        LOGE("blit NULL pointer ERROR");
        return NULL;
    }

    int dstWidth;
    int dstHeight;

    // Wrap the YUV buffer (luma plus interleaved chroma) in a single-channel Mat.
    cv::Mat mYuv(srcHeight + srcHeight / 2, srcWidth, CV_8UC1, srcLumaPtr);

    ANativeWindow *win = ANativeWindow_fromSurface(env, dstSurface);
    ANativeWindow_acquire(win);

    ANativeWindow_Buffer buf;

    // The output is rotated 90 degrees, so width and height swap.
    dstWidth = srcHeight;
    dstHeight = srcWidth;

    ANativeWindow_setBuffersGeometry(win, dstWidth, dstHeight, 0 /* format unchanged */);

    if (int32_t err = ANativeWindow_lock(win, &buf, NULL)) {
        LOGE("ANativeWindow_lock failed with error code %d\n", err);
        ANativeWindow_release(win);
        return NULL;
    }

    uint8_t *dstLumaPtr = reinterpret_cast<uint8_t *>(buf.bits);
    Mat dstRgba(dstHeight, buf.stride, CV_8UC4, dstLumaPtr); // TextureView buffer, use stride as width
    Mat srcRgba(srcHeight, srcWidth, CV_8UC4);
    Mat flipRgba(dstHeight, dstWidth, CV_8UC4);

    // convert YUV -> RGBA
    cv::cvtColor(mYuv, srcRgba, CV_YUV2RGBA_NV21);

    // Rotate 90 degrees (transpose + horizontal flip)
    cv::transpose(srcRgba, flipRgba);
    cv::flip(flipRgba, flipRgba, 1);

#if 0
    int ball_x;
    int ball_y;
    int ball_r;

    ball_r = 0;
    BallDetect(flipRgba, ball_x, ball_y, ball_r);
    if (ball_r > 0)
        LOGE("ball x:%d y:%d r:%d", ball_x, ball_y, ball_r);
    else
        LOGE("ball not detected");
#endif

    LaneDetect(flipRgba, str, saveFile, outStr);

    // copy to TextureView surface, row by row because of the stride
    uchar *dbuf;
    uchar *sbuf;
    dbuf = dstRgba.data;
    sbuf = flipRgba.data;
    int i;
    for (i = 0; i < flipRgba.rows; i++) {
        dbuf = dstRgba.data + i * buf.stride * 4;
        memcpy(dbuf, sbuf, flipRgba.cols * 4);
        sbuf += flipRgba.cols * 4;
    }

    // Draw center and bottom reference lines
    cv::line(dstRgba, Point(dstWidth / 2, 0), Point(dstWidth / 2, dstHeight - 1), Scalar(255, 255, 255));
    cv::line(dstRgba, Point(0, dstHeight - 1), Point(dstWidth - 1, dstHeight - 1), Scalar(255, 255, 255));

    LOGE("bob dstWidth=%d height=%d", dstWidth, dstHeight);

    ANativeWindow_unlockAndPost(win);
    ANativeWindow_release(win);

    env->ReleaseStringUTFChars(path, str); // release the UTF string obtained above

    return env->NewStringUTF(outStr);
}
The frame arrives as YUV data; its buffer is obtained with:
uint8_t *srcLumaPtr = reinterpret_cast<uint8_t *>(env->GetDirectBufferAddress(srcBuffer));
Create a YUV Mat wrapping that buffer:
cv::Mat mYuv(srcHeight + srcHeight / 2, srcWidth, CV_8UC1, srcLumaPtr);
Convert YUV to RGBA:
cv::cvtColor(mYuv, srcRgba, CV_YUV2RGBA_NV21);
The ImageReader video arrives rotated by 90 degrees, so rotate it back:
cv::transpose(srcRgba, flipRgba);
cv::flip(flipRgba, flipRgba, 1);
Then you can do whatever you want for image processing with powerful OpenCV functions.
After processing, copy the buffer into the TextureView surface for display:
// copy to TextureView surface
uchar *dbuf;
uchar *sbuf;
dbuf = dstRgba.data;
sbuf = flipRgba.data;
int i;
for (i = 0; i < flipRgba.rows; i++) {
    dbuf = dstRgba.data + i * buf.stride * 4;
    memcpy(dbuf, sbuf, flipRgba.cols * 4);
    sbuf += flipRgba.cols * 4;
}
Note the stride value: the destination buffer's rows are buf.stride pixels long, not dstWidth, so each row must be copied at the stride offset.
Finally, unlock and post the buffer so it is displayed, then release the window:
ANativeWindow_unlockAndPost(win);
ANativeWindow_release(win);
Again, the source code is here: https://github.com/webjb/myrobot
Enjoy!
Monday, March 7, 2016
Moving to Android Studio 1.5.1
My previous environment was ADT, but Google released Android Studio 1.5.1, which is better to use. The NDK is integrated into Android Studio, so it is easy to use: no need to go to the console to compile the C++ and then run it; just modify the C++ code and Android Studio will compile it automatically for you. Unfortunately, the NDK is not yet fully supported, but it is enough for my robot project.
Android Studio requires a more powerful PC with more RAM, and it doesn't support my 10-plus-year-old Windows XP machine, so I upgraded to a new PC with Windows 10.
I struggled for a long time to make everything (NDK + OpenCV + the camera2 class) work together under Android Studio, but it was worth it, since Android Studio is now the main tool for Android development.
Thursday, October 15, 2015
BBB: use tftp for remote boot
If the kernel is built wrong and the board can't boot, or you accidentally delete files from /bin so that nothing runs (not even the ls command), don't panic: you can boot the BBB over TFTP from another PC.
First, you need a Linux PC with a TFTP server installed and enabled.
Then put the compiled kernel and root filesystem on that PC; the BBB board can boot from it using both.
After the BBB board is powered on, press the Enter key quickly to drop into the u-boot prompt. There you need to set the BBB board's local IP address, the Linux PC's (server) IP address, and the TFTP boot commands. Something like:
Set the Linux PC's IP address:
setenv serverip 192.168.2.99
Set the BBB board's IP address:
setenv ipaddr 192.168.2.97
Fetch the Linux kernel (you need to put the kernel uImage file in /tftpboot on the PC first):
tftp 0x80200000 uImage-dtb.am335x-boneblack
u-boot will download the kernel image to the BBB board.
Set the root filesystem (an NFS server must be installed and enabled on the Linux PC):
setenv bootargs console=ttyO0,115200n8 root=/dev/nfs rw nfsroot=192.168.2.99:/home/bsang/sang/robot/bbb/rootfs ip=192.168.2.97:::::eth0
Boot the BBB board:
bootm 80200000
Congratulations! Your BBB board is working again. Now you can copy a new kernel to the BBB or upgrade the root filesystem. Your board is saved.
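Optionally, if the u-boot environment on your board is stored persistently, the saveenv command will keep the serverip/ipaddr settings across resets so you don't have to retype them.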