Samples: Update zivid python samples
This PR adds the following samples:
- Hand-eye calibration samples based on RoboDK
- Firmware updater
- ROI box via checkerboard

It also formats and standardizes all samples.
chrisasc committed Aug 10, 2022
1 parent d83c95d commit 508f22b
Showing 24 changed files with 704 additions and 98 deletions.
4 changes: 3 additions & 1 deletion .pylintrc
@@ -3,7 +3,9 @@ disable=bad-continuation, ######## Disabled because of conflicts with bl
consider-using-enumerate, ######## Temporary disabled ########
import-error,
missing-docstring,
duplicate-code
no-member,
duplicate-code,

[TYPE CHECK]
generated-members=cv2.*

30 changes: 18 additions & 12 deletions README.md
@@ -60,6 +60,7 @@ from the camera can be used.
- [camera\_user\_data](https://github.com/zivid/zivid-python-samples/tree/master//source/camera/info_util_other/camera_user_data.py) - Store user data on the Zivid camera.
- [capture\_with\_diagnostics](https://github.com/zivid/zivid-python-samples/tree/master//source/camera/info_util_other/capture_with_diagnostics.py) - Capture point clouds, with color, from the Zivid camera,
with settings from YML file and diagnostics enabled.
- [firmware\_updater](https://github.com/zivid/zivid-python-samples/tree/master//source/camera/info_util_other/firmware_updater.py) - Update firmware on the Zivid camera.
- [get\_camera\_intrinsics](https://github.com/zivid/zivid-python-samples/tree/master//source/camera/info_util_other/get_camera_intrinsics.py) - Read intrinsic parameters from the Zivid camera (OpenCV
model).
- [print\_version\_info](https://github.com/zivid/zivid-python-samples/tree/master//source/camera/info_util_other/print_version_info.py) - Print version information for Python, zivid-python and
@@ -97,9 +98,13 @@ from the camera can be used.
- [hand\_eye\_calibration](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/hand_eye_calibration/hand_eye_calibration.py) - Perform Hand-Eye calibration.
- [mask\_point\_cloud](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/mask_point_cloud.py) - Read point cloud data from a ZDF file, apply a binary
mask, and visualize it.
- [roi\_box\_via\_checkerboard](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/roi_box_via_checkerboard.py) - Filter the point cloud based on a ROI box given relative
to the Zivid Calibration Board.
- **hand\_eye\_calibration**
- [pose\_conversions](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/hand_eye_calibration/pose_conversions.py) - Convert to/from Transformation Matrix (Rotation Matrix
+ Translation Vector).
- [robodk\_hand\_eye\_calibration](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/hand_eye_calibration/robodk_hand_eye_calibration/robodk_hand_eye_calibration.py) - Generate a dataset and perform hand-eye calibration
using the Robodk interface.
- [utilize\_hand\_eye\_calibration](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/hand_eye_calibration/utilize_hand_eye_calibration.py) - Transform single data point or entire point cloud from
camera frame to robot base frame using Hand-Eye
calibration
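
The transform step in `utilize_hand_eye_calibration` boils down to one homogeneous 4×4 multiply. A minimal numpy sketch, with a hypothetical calibration matrix (identity rotation, translation only — a real calibration yields a full rotation as well):

```python
import numpy as np

# Hypothetical hand-eye result mapping camera-frame points into the robot base frame
transform_base_camera = np.eye(4)
transform_base_camera[:3, 3] = [100.0, -50.0, 600.0]  # translation in mm

# A single camera-frame point as a homogeneous coordinate
point_camera = np.array([10.0, 20.0, 300.0, 1.0])
point_base = transform_base_camera @ point_camera
print(point_base[:3])  # -> [110. -30. 900.]

# An entire (N x 3) point cloud transforms the same way
xyz_camera = np.random.rand(1000, 3) * 100.0
xyz_base = xyz_camera @ transform_base_camera[:3, :3].T + transform_base_camera[:3, 3]
```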
@@ -111,43 +116,44 @@ from the camera can be used.
- **sample\_utils**
- [display](https://github.com/zivid/zivid-python-samples/tree/master//source/sample_utils/display.py) - Display relevant data for Zivid Samples.
- [paths](https://github.com/zivid/zivid-python-samples/tree/master//source/sample_utils/paths.py) - Get relevant paths for Zivid Samples.
- **applications**
- **advanced**
- **hand\_eye\_calibration**
- **robodk\_hand\_eye\_calibration**
- [robot\_tools](https://github.com/zivid/zivid-python-samples/tree/master//source/applications/advanced/hand_eye_calibration/robodk_hand_eye_calibration/robot_tools.py) - Robot Control Module

## Installation

1. [Install Zivid
Software](https://support.zivid.com/latest//getting-started/software-installation.html)
Software](https://support.zivid.com/latest//getting-started/software-installation.html).

2. [Install Zivid Python](https://github.com/zivid/zivid-python). Note:
The recommended Python version for these samples is 3.8.
2. [Install Zivid
Python](https://github.com/zivid/zivid-python#installation).

3. [Download Zivid Sample
Data](https://support.zivid.com/latest//api-reference/samples/sample-data.html)

4. \[Optional\] Launch the Python IDE of your choice. Read our
instructions on [setting up
Python](https://support.zivid.com/latest//api-reference/samples/python/setting-up-python.html).
Data](https://support.zivid.com/latest//api-reference/samples/sample-data.html).

5. Install the runtime requirements using IDE or command line:
4. Install the runtime requirements using IDE or command line:

```
pip install -r requirements.txt
```

6. Add the directory source to PYTHONPATH. Navigate to the root of the
5. Add the directory source to PYTHONPATH. Navigate to the root of the
repository and run:

> - PowerShell: `$env:PYTHONPATH=$env:PYTHONPATH + ";$PWD\source"`
> - cmd: `set PYTHONPATH=%PYTHONPATH%;%CD%\source`
> - bash: `export PYTHONPATH="$PYTHONPATH:$PWD/source"`
7. Open and run one of the samples.
6. Open and run one of the samples.
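
If modifying PYTHONPATH is inconvenient, a sample can instead extend `sys.path` at runtime before importing `sample_utils` (a sketch; assumes the script is run from the repository root):

```python
import sys
from pathlib import Path

# Prepend the repository's source directory so that
# `from sample_utils... import ...` resolves without touching PYTHONPATH
source_dir = Path.cwd() / "source"
if str(source_dir) not in sys.path:
    sys.path.insert(0, str(source_dir))
print(source_dir.name)  # -> source
```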

## Support

For more information about the Zivid cameras, please visit our
[Knowledge Base](https://support.zivid.com/latest). If you run into any
issues please check out
[Troubleshooting](https://support.zivid.com/latest/rst/support/troubleshooting.html).
[Troubleshooting](https://support.zivid.com/latest/support/troubleshooting.html).

## License

1 change: 1 addition & 0 deletions source/.gitattributes
@@ -0,0 +1 @@
*.rdk filter=lfs diff=lfs merge=lfs -text
23 changes: 3 additions & 20 deletions source/applications/advanced/color_balance.py
Expand Up @@ -5,9 +5,9 @@
import datetime
from dataclasses import dataclass

import matplotlib.pyplot as plt
import numpy as np
import zivid
from sample_utils.display import display_rgb


@dataclass
@@ -27,22 +27,6 @@ class MeanColor:
blue: np.float64


def _display_rgb(rgb, title):
"""Display RGB image.
Args:
rgb: RGB image (HxWx3 darray)
title: Image title
Returns None
"""
plt.figure()
plt.imshow(rgb)
plt.title(title)
plt.show(block=False)


def _compute_mean_rgb(rgb, pixels):
"""Compute mean RGB values.
@@ -221,7 +205,7 @@ def _main():
settings_2d = _auto_settings_configuration(camera)

rgba = camera.capture(settings_2d).image_rgba().copy_data()
_display_rgb(rgba[:, :, 0:3], "RGB image before color balance")
display_rgb(rgba[:, :, 0:3], title="RGB image before color balance", block=False)

[red_balance, green_balance, blue_balance] = _color_balance_calibration(camera, settings_2d)

@@ -231,8 +215,7 @@
settings_2d.processing.color.balance.blue = blue_balance
rgba_balanced = camera.capture(settings_2d).image_rgba().copy_data()

_display_rgb(rgba_balanced[:, :, 0:3], "RGB image after color balance")
input("Press Enter to close...")
display_rgb(rgba_balanced[:, :, 0:3], title="RGB image after color balance", block=True)


if __name__ == "__main__":
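
The calibration routine called above (`_color_balance_calibration`, not shown in this hunk) adjusts the red and blue gains until the channel means match the green mean. The core gain computation can be sketched as follows; the image values are hypothetical:

```python
import numpy as np

def balance_gains(rgb):
    """Compute red/green/blue gains that equalize channel means to green."""
    mean_r = rgb[:, :, 0].mean()
    mean_g = rgb[:, :, 1].mean()
    mean_b = rgb[:, :, 2].mean()
    return mean_g / mean_r, 1.0, mean_g / mean_b

# Hypothetical image: red twice as strong as green, blue half as strong
rgb = np.ones((4, 4, 3)) * [200.0, 100.0, 50.0]
red_gain, green_gain, blue_gain = balance_gains(rgb)
print(red_gain, green_gain, blue_gain)  # -> 0.5 1.0 2.0
```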
2 changes: 0 additions & 2 deletions source/applications/advanced/downsample.py
@@ -45,8 +45,6 @@ def _main():

display_pointcloud(xyz_donwsampled, rgba_downsampled[:, :, 0:3])

input("Press Enter to close...")


if __name__ == "__main__":
# If running the script from Spyder IDE, first run '%gui qt'
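
Downsampling an organized point cloud can be illustrated with plain numpy striding; a sketch with a hypothetical 2×2 factor (note that the SDK's own `PointCloud.downsample` averages neighboring points rather than simply dropping them):

```python
import numpy as np

# Organized point cloud: H x W x 3 xyz plus H x W x 4 rgba, shapes as
# returned by point_cloud.copy_data("xyz") / copy_data("rgba")
xyz = np.random.rand(120, 160, 3)
rgba = np.random.randint(0, 255, (120, 160, 4), dtype=np.uint8)

# Keep every 2nd row and column -> a quarter of the points remain
xyz_downsampled = xyz[::2, ::2, :]
rgba_downsampled = rgba[::2, ::2, :]
print(xyz_downsampled.shape)  # -> (60, 80, 3)
```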
@@ -65,7 +65,7 @@ def _visualize_checkerboard_point_cloud_with_coordinate_system(point_cloud_open3
coord_system_mesh = o3d.geometry.TriangleMesh.create_coordinate_frame(size=30)
coord_system_mesh.transform(transform)

visualizer = o3d.visualization.Visualizer() # pylint: disable=no-member
visualizer = o3d.visualization.Visualizer()
visualizer.create_window()
visualizer.add_geometry(point_cloud_open3d)
visualizer.add_geometry(coord_system_mesh)
@@ -83,9 +83,7 @@ def _main():
point_cloud = frame.point_cloud()

print("Detecting checkerboard and estimating its pose in camera frame")
transform_camera_to_checkerboard = (
zivid.calibration.detect_feature_points(point_cloud).pose().to_matrix()
) # pylint: disable=no-member
transform_camera_to_checkerboard = zivid.calibration.detect_feature_points(point_cloud).pose().to_matrix()
print(f"Camera pose in checkerboard frame:\n{transform_camera_to_checkerboard}")

transform_file = "CameraToCheckerboardTransform.yaml"
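
The pose returned by `detect_feature_points(...).pose().to_matrix()` above is a rigid 4×4 homogeneous transform, so its inverse (checkerboard-to-camera) needs no general matrix inversion — the rotation block is orthogonal. A sketch with a hypothetical transform:

```python
import numpy as np

def invert_transform(transform):
    """Invert a rigid 4x4 transform: R -> R.T, t -> -R.T @ t."""
    rotation = transform[:3, :3]
    translation = transform[:3, 3]
    inverse = np.eye(4)
    inverse[:3, :3] = rotation.T
    inverse[:3, 3] = -rotation.T @ translation
    return inverse

# Hypothetical camera-to-checkerboard pose: 90 deg about Z plus translation (mm)
transform = np.array([
    [0.0, -1.0, 0.0, 100.0],
    [1.0, 0.0, 0.0, 50.0],
    [0.0, 0.0, 1.0, 500.0],
    [0.0, 0.0, 0.0, 1.0],
])
round_trip = invert_transform(transform) @ transform
print(np.allclose(round_trip, np.eye(4)))  # -> True
```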
39 changes: 30 additions & 9 deletions source/applications/advanced/hand_eye_calibration/README.md
@@ -3,9 +3,9 @@
To fully understand Hand-Eye Calibration, please see the [tutorial][Tutorial-url] in our Knowledge Base.

-----------------
The following applications creates a **Transformation Matrix** from data provided by a user
The following applications create a **Transformation Matrix** from data provided by a user

[HandEyeCalibration][HandEyeCalibration-url]:
[**HandEyeCalibration**][HandEyeCalibration-url]

* Application which walks through the collection of calibration poses
1. Provide robot pose to application (manual entry)
@@ -14,9 +14,26 @@ The following applications creates a **Transformation Matrix** from data provide
4. Repeat i.-iii. until 10-20 pose pairs are collected
5. Enter command to perform calibration and return a **Transformation Matrix**

[ZividHandEyeCalibration][ZividHandEyeCalibration-url]
[**ZividHandEyeCalibration**][ZividHandEyeCalibration-url]

* [CLI application][CLI application-url] which takes a collection of pose pairs (e.g. output of steps i.-iii. in [HandEyeCalibration][HandEyeCalibration-url]) and returns a **Transformation Matrix**. This application comes with the Windows installer and is part of the tools deb for Ubuntu.
* [CLI application][CLI application-url] which takes a collection of pose pairs (e.g. output of steps 1-3 in [HandEyeCalibration][HandEyeCalibration-url]) and returns a **Transformation Matrix**. This application comes with the Windows installer and is part of the tools deb for Ubuntu.

-----------------

This repository contains two samples that show how to acquire a hand-eye calibration dataset.
Both samples go through the process of acquiring pose and point cloud pairs and then process them to return the resulting hand-eye **Transformation Matrix**.

[**UniversalRobotsPerformHandEyeCalibration**][URhandeyecalibration-url]

* This sample is created to work specifically with the UR5e robot.
* To follow the tutorial for this sample go to [**UR5e + Python Hand Eye Tutorial**][URHandEyeTutorial-url].

[**RoboDKHandEyeCalibration**][RobodkHandEyeCalibration-url]

This sample uses RoboDK for robot control and can be used with any robot that the software supports.
The list of supported robots can be found [**here**][robodk-robot-library-url].
Poses must be added by the user to their own rdk file.
For guidance on choosing good poses, follow the instructions in the Zivid Knowledge Base for the [hand-eye calibration process][ZividHandEyeCalibration-url].

-----------------
The following applications assume that a **Transformation Matrix** has been found
@@ -31,27 +48,31 @@ The following applications assume that a **Transformation Matrix** has been foun

[**PoseConversions**][PoseConversions-url]:

* Zivid primarily operate with a (4x4) Transformation Matrix (Rotation Matrix + Translation Vector). This example shows how to use Eigen to convert to and from:
* Zivid primarily operates with a (4x4) **Transformation Matrix** (Rotation Matrix + Translation Vector). This example shows how to use Eigen to convert to and from:
* AxisAngle, Rotation Vector, Roll-Pitch-Yaw, Quaternion
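
These conversions use standard rotation representations; in Python they are commonly done with `scipy.spatial.transform.Rotation` rather than Eigen. A sketch building a 4×4 transformation matrix from a hypothetical quaternion and translation:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical pose: quaternion (x, y, z, w) and translation vector in mm
quaternion = [0.0, 0.0, 0.7071068, 0.7071068]  # 90 deg about Z
translation = [100.0, 0.0, 500.0]

rotation = Rotation.from_quat(quaternion)
transform = np.eye(4)
transform[:3, :3] = rotation.as_matrix()
transform[:3, 3] = translation

# The same rotation in other representations
print(rotation.as_euler("xyz", degrees=True))  # roll-pitch-yaw in degrees
print(rotation.as_rotvec())                    # rotation vector (axis * angle)
```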

[**VerifyHandEyeWithVisualization**][VerifyHandEyeWithVisualization-url]:

Visually demonstrates the hand-eye calibration accuracy by overlapping transformed point clouds.

* The application asks the user for the hand-eye calibration type (manual entry).
* After loading the hand-eye dataset (point clouds and robot poses) and the hand-eye output (transformation matrix), the application repeats the following process for all data pairs:
* After loading the hand-eye dataset (point clouds and robot poses) and the hand-eye output (**transformation matrix**), the application repeats the following process for all data pairs:
1. Transforms the point cloud
2. Finds cartesian coordinates of the checkerboard centroid
3. Creates a region of interest around the checkerboard and filters out points outside the region of interest
4. Saves the point cloud to a PLY file
5. Appends the point cloud to a list (overlapped point clouds)
This application ends by displaying all point clouds from the list.


This application ends by displaying all point clouds from the list.
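
Step 3 above (the region of interest around the checkerboard) amounts to a box mask around the centroid; a numpy sketch with hypothetical box extents:

```python
import numpy as np

def filter_roi_box(xyz, center, half_extents):
    """Keep only points inside an axis-aligned box around `center`."""
    lower = np.asarray(center) - np.asarray(half_extents)
    upper = np.asarray(center) + np.asarray(half_extents)
    mask = np.all((xyz >= lower) & (xyz <= upper), axis=-1)
    return xyz[mask]

# Hypothetical cloud: 1000 points in a 1 m cube, centroid assumed at the middle
xyz = np.random.rand(1000, 3) * 1000.0
inside = filter_roi_box(xyz, center=[500.0, 500.0, 500.0], half_extents=[100.0, 100.0, 100.0])
print(inside.shape[1])  # -> 3
```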

[HandEyeCalibration-url]: hand_eye_calibration.py
[UtilizeHandEyeCalibration-url]: utilize_hand_eye_calibration.py
[VerifyHandEyeWithVisualization-url]: verify_hand_eye_with_visualization.py
[ZividHandEyeCalibration-url]: https://support.zivid.com/latest/academy/applications/hand-eye/hand-eye-calibration-process.html
[Tutorial-url]: https://support.zivid.com/latest/academy/applications/hand-eye.html
[PoseConversions-url]: pose_conversions.py
[CLI application-url]: https://support.zivid.com/latest/academy/applications/hand-eye/zivid_CLI_tool_for_hand_eye_calibration.html
[CLI application-url]: https://support.zivid.com/latest/academy/applications/hand-eye/zivid_CLI_tool_for_hand_eye_calibration.html
[URhandeyecalibration-url]: ur_hand_eye_calibration/universal_robots_perform_hand_eye_calibration.py
[URHandEyeTutorial-url]: https://support.zivid.com/en/latest/academy/applications/hand-eye/ur5-robot-+-python-generate-dataset-and-perform-hand-eye-calibration.html
[RobodkHandEyeCalibration-url]: robodk_hand_eye_calibration/robodk_hand_eye_calibration.py
[robodk-robot-library-url]: https://robodk.com/supported-robots
@@ -0,0 +1,27 @@
# Hand Eye Calibration with RoboDK

Hand-eye calibration is a necessity for any picking scenario that involves a camera and a robot.
This sample offers an easy and adaptable method to perform hand-eye calibration with a variety of robots supported in RoboDK.

For more on Hand-Eye Calibration, please see the [tutorial](https://support.zivid.com/latest/academy/applications/hand-eye.html) in our Knowledge Base.

If you need help with the sample, visit our Knowledge Base article [here](help.zivid.com) (To be updated)

This sample was made and modeled with a Universal Robots UR5e robot.
You must create your own poses to suit your environment.
If you have a robot other than a UR5e, you will need to load the corresponding robot into your rdk file.

## Installation

1. [Install Zivid Software](https://support.zivid.com/latest//getting-started/software-installation.html)

2. [Install Zivid Python](https://github.com/zivid/zivid-python).
Note: The recommended Python version for these samples is 3.8.

3. [Install RoboDK](https://robodk.com/download)

4. [Install RoboDK Python](https://pypi.org/project/robodk/)

Other requirements can be installed using the following command:

pip install -r requirements.txt
Git LFS file not shown
@@ -0,0 +1,2 @@
robodk
typing
