fix(autoware.repos): update transport drivers branch #3742

Closed

fix(docs): better tutorials (#7)
* docs: fix 02_installation

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: update 01_hardware_setup to show display port usage

Signed-off-by: Manato HIRABAYASHI <[email protected]>

* docs: update 02_installation to show how to remote access

Signed-off-by: Manato HIRABAYASHI <[email protected]>

* docs: change build commands from raw one to using scripts

Signed-off-by: Manato HIRABAYASHI <[email protected]>

* docs: update 03_sensor_calibration to show how to access via ssh and starting camera stream

Signed-off-by: Manato HIRABAYASHI <[email protected]>

* docs: update 03_sensor_calibration and add some screen shots

Signed-off-by: Manato HIRABAYASHI <[email protected]>

* docs: update 04_launch_application

Signed-off-by: Manato HIRABAYASHI <[email protected]>

* docs: move figures

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: using NOTE blockquote

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: fix upper case

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: fix note/warning blockquote style

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: fix note/warning blockquote style

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: fix netplan warning

Signed-off-by: Akihito OHSATO <[email protected]>

* docs: add hardware setup photo

Signed-off-by: Akihito OHSATO <[email protected]>

---------

Signed-off-by: Akihito OHSATO <[email protected]>
Signed-off-by: Manato HIRABAYASHI <[email protected]>
Co-authored-by: Manato HIRABAYASHI <[email protected]>
aohsato and manato committed Jun 29, 2023
commit 18b70a4bfe5bdb5766b8e1365ba598796981c390
2 changes: 1 addition & 1 deletion README.md
@@ -16,7 +16,7 @@ contents of this repository. Various perception applications will be added in th

This repository is based on a natively built ROS2 environment. The sample system overview is shown below.

![system overview](docs/tutorials/connection.drawio.svg "edge-auto overview")
![system overview](docs/tutorials/figures/connection.drawio.svg "edge-auto overview")

### System Requirement

2 changes: 1 addition & 1 deletion docs/tutorials.md
@@ -4,6 +4,6 @@ This tutorial will show you step-by-step instructions on how to use samples incl

1. [Hardware setup](tutorials/01_hardware_setup.md)
2. [Installation](tutorials/02_installation.md)
3. [Sensor Calibration](tutorials/03_sensor_calibration.md)
3. [Sensor calibration](tutorials/03_sensor_calibration.md)
4. [Launch application](tutorials/04_launch_application.md)
5. [Troubleshooting](tutorials/05_trouble_shooting.md)
33 changes: 22 additions & 11 deletions docs/tutorials/01_hardware_setup.md
@@ -19,15 +19,23 @@ This following hardware configuration is used throughout this tutorial.

### Connection diagram

The figure below depicts the connection diagram between sensors and ECUs for this tutorial.
The figures below depict the connections between sensors and ECUs for this tutorial.
This network configuration, including assigning IP addresses to the specific interfaces, is applied automatically during the steps on the [2.Installation](./02_installation.md) page.

**NOTE: Internet connection is required for 2.Installation step.**
> **Note**
>
> An internet connection is required for the 2.Installation step.
>
> The next [2.Installation](./02_installation.md) step requires an internet connection for cloning the repositories and downloading ML models.
> Please connect an Ethernet cable to the port indicated in the figure below to provide internet access.

The next [2.Installation](./02_installation.md) step requires the internet connection for git clone and ML model download.
Please connect an Ethernet cable to the port indicated in the figure below for the internet connection.
> **Note**
>
> Try another display port if nothing is shown on the display.

![connection diagram of sample system](connection.drawio.svg "connection diagram of sample system")
![connection diagram of sample system](figures/connection.drawio.svg "connection diagram of sample system")

![hardware setup of sample system](figures/hardware_setup.png "hardware setup of sample system")

### Sensor driver

@@ -49,7 +57,7 @@ The figure below depicts the synchronization design between sensors and ECUs in

For more details, please refer to the [tier4/sensor_trigger](https://github.com/tier4/sensor_trigger) repository.

![synchronization design of sample system](synchronization.drawio.svg "synchronization design of sample system")
![synchronization design of sample system](figures/synchronization.drawio.svg "synchronization design of sample system")

## 1-1. x86-based ECU

@@ -59,10 +67,13 @@ Before proceeding with [2.Installation](./02_installation.md) step, install Ubun

Before proceeding with the [2.Installation](./02_installation.md) step, install NVIDIA L4T R32.6.1 (including Ubuntu 18.04) on your Jetson-based ECU.

**NOTE: BSP installation for ADLINK RQX-58G**

RQX-58G need to be properly configured according to the official quick start guide from ADLINK Technology, Inc.
Please see the [official document](https://www.adlinktech.com/Products/Download.ashx?type=MDownload&isQuickStart=yes&file=1783%5croscube-x-bsp-qsg-l4t-32.5.0-kernel-1.0.8.pdf) in detail.
To download the BSP image, please visit the ADLINK official page [here](https://www.adlinktech.com/Products/DownloadSoftware.aspx?lang=en&pdNo=1783&MainCategory=ROS2-Solution.aspx&kind=BS). (If you are accessing the site for the first time, you will be prompted to create an account.)
> **Note**
>
> BSP installation for ADLINK RQX-58G
>
> The RQX-58G needs to be properly configured according to the official quick start guide from ADLINK Technology, Inc.
> Please see the [official document](https://www.adlinktech.com/Products/Download.ashx?type=MDownload&isQuickStart=yes&file=1783%5croscube-x-bsp-qsg-l4t-32.5.0-kernel-1.0.8.pdf) for details.
>
> To download the BSP image, please visit the official ADLINK page [here](https://www.adlinktech.com/Products/DownloadSoftware.aspx?lang=en&pdNo=1783&MainCategory=ROS2-Solution.aspx&kind=BS). (If you are accessing the site for the first time, you will be prompted to create an account.)

While the TIER IV camera driver ([tier4/tier4_automotive_hdr_camera](https://github.com/tier4/tier4_automotive_hdr_camera)) is included in the official RQX-58G BSP image, you can also update it during the following setup process.
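
If you are unsure which L4T release is already installed on the Jetson-based ECU, you can check the release file before reflashing (a quick check that assumes a standard L4T image):

```sh
# Print the installed L4T release (expected to report R32, REVISION: 6.1 for this tutorial)
cat /etc/nv_tegra_release
```
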
60 changes: 41 additions & 19 deletions docs/tutorials/02_installation.md
@@ -2,16 +2,20 @@

Setup for both x86-based and Jetson-based ECUs.

**NOTE: Internet connection is required in this step.**
> **Note**
>
> An internet connection is required for this step.

## 2-1. x86-based ECU

**NOTE: Network settings are automatically updated.**

During this procedure, IP addresses are assigned to some network interfaces (refer to the connection diagram on [1.Hardware setup](01_hardware_setup.md) for more detail) using `netplan`.
This behavior may cause unexpected disconnection, if you are accessing the ECU remotely via those interfaces.

If you would like to change network interfaces or IP addresses to be assigned, edit `edge-auto/ansible/playbooks/vars/edge_auto.yaml` before running `setup-dev-env.sh`
> **Warning**
>
> Network settings are automatically updated.
>
> During this procedure, IP addresses are assigned to some network interfaces using `netplan` (refer to the connection diagram in [1.Hardware setup](01_hardware_setup.md) for more detail).
> This may cause an unexpected disconnection if you are accessing the ECU remotely via those interfaces.
>
> If you would like to change the network interfaces or the IP addresses to be assigned, edit `edge-auto/ansible/playbooks/vars/edge_auto.yaml` before running `setup-dev-env.sh`.
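
If you need to check afterwards which addresses were actually applied, standard Linux tooling is enough (these commands are generic and not part of the repository's scripts):

```sh
# List the IP addresses currently assigned to each network interface
ip -brief addr show

# Inspect the netplan configuration generated by the setup (file names may differ)
ls /etc/netplan/
```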

### Download the repository and set up your environment

@@ -47,16 +51,25 @@ vcs import src < autoware.repos
Install the ROS package dependencies and build your ROS workspace.

```sh
rosdep update
rosdep install -y -r --from-paths src --ignore-src --rosdistro $ROS_DISTRO

colcon build \
--symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release \
--packages-up-to edge_auto_launch
./build.sh

...
Finished <<< image_projection_based_fusion [0.25s]
Starting >>> edge_auto_launch
Finished <<< edge_auto_launch [0.22s]

Summary: 66 packages finished [6.85s]
2 packages had stderr output: extrinsic_interactive_calibrator intrinsic_camera_calibrator
```
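
After the build finishes, the workspace needs to be sourced in any new shell before launching the samples. This assumes `build.sh` wraps a standard `colcon build` with an `install` directory, as the replaced commands above suggest:

```sh
# Make the built packages available in the current shell
source install/setup.bash
```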

## 2-2. Jetson-based ECU

**This following steps can be performed from your x86-based ECU via ssh**
> **Note**
>
> The following steps can be performed from your x86-based ECU via SSH.

### Download the repository and set up the environment

@@ -71,12 +84,15 @@ You can install the dependencies using the provided ansible script.
During the installation process, you will be asked if you want to install the TIER IV camera driver.
If you already have the driver installed and want to skip this step, please type `N` to continue.

**NOTE: `setup-dev-env.sh` script may take several hours.**
> **Note**
>
> The `setup-dev-env.sh` script may take several hours.

```sh
./setup-dev-env.sh

[Warning] Do you want to install/update the TIER IV camera driver? [y/N]:
[Warning] Do you want to configure the network? This configuration may overwrite the IP address of the specific network interface [y/N]:
```

Finally, please reboot the system to make the installed dependencies and permission settings effective.
@@ -85,6 +101,13 @@ Finally, please reboot the system to make the installed dependencies and permiss

```sh
sudo reboot
```

If you chose `y` at the network configuration prompt,
the IP address of the specified network interface is fixed (default: `192.168.2.2` for `eth1`).
In that case, you can access the Jetson-based ECU and perform the following steps remotely, for example:

```sh
ssh <IP_address> -l <username>
```
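
If you will be logging in to the Jetson-based ECU repeatedly, key-based SSH login saves typing the password each time (standard OpenSSH tooling, unrelated to this repository; the address and key path are examples):

```sh
# Copy your public key to the Jetson-based ECU for passwordless login
ssh-copy-id -i ~/.ssh/id_ed25519.pub <username>@192.168.2.2
```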

### Build edge-auto-jetson workspace

Create your ROS workspace and clone repositories using vcstool.
@@ -98,13 +121,10 @@ vcs import src < autoware.repos
Build your ROS workspace.

```sh
colcon build \
--symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release \
-DPython3_EXECUTABLE=$(which python3.6) -DCMAKE_CUDA_STANDARD=14 \
--packages-up-to edge_auto_jetson_launch
./build.sh
```

## Update your workspace
## (Optional) Update your workspace

If you want to update cloned repositories, use the following command.

@@ -113,9 +133,11 @@

```sh
vcs pull src
```
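
After pulling updated sources, rebuild the workspace so the changes take effect, using the same helper script as in the build steps above (assuming you built with the provided script):

```sh
# Rebuild the workspace after updating the sources
./build.sh
```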

## Modify camera exposure timing
## (Optional) Modify camera exposure timing

**NOTE: On the sample system introduced in [1.Hardware setup](01_hardware_setup.md) step, this does not need to be changed.**
> **Note**
>
> On the sample system introduced in the [1.Hardware setup](01_hardware_setup.md) step, this does not need to be changed.

If you want to change the exposure time of cameras for sensor synchronization, please modify the following files.

47 changes: 43 additions & 4 deletions docs/tutorials/03_sensor_calibration.md
@@ -2,11 +2,25 @@

Estimate the intrinsic/extrinsic parameters of your sensor system using [tier4/calibration_tools](https://github.com/tier4/CalibrationTools).

**NOTE: Perform the following tasks on the x86-based ECU**.
> **Note**
>
> Perform the following tasks on the x86-based ECU.

## 3-1. Calculate intrinsic parameters for cameras

First, launch `calibration_intrinsic` to estimate your intrinsic parameters of your cameras.
Before starting the intrinsic parameter estimation, start the camera stream:

```sh
# optional: access Jetson-based ECU remotely
ssh <IP_address(default: 192.168.2.2)> -l <username>

# start camera streaming
cd edge-auto-jetson
source install/setup.bash

ros2 launch edge_auto_jetson_launch edge_auto_jetson.launch.xml perception:=false
```
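
Before launching the calibration tool on the x86-based ECU, it can help to confirm that the image topics from the Jetson-based ECU are visible and being published (generic ROS 2 CLI commands; the topic name below is only an example and depends on your configuration):

```sh
# List camera-related topics visible on the network
ros2 topic list | grep -i camera

# Check the publishing rate of one of the image topics (topic name is an example)
ros2 topic hz /sensing/camera/camera0/image_raw
```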

Then, launch `calibration_intrinsic` to estimate the intrinsic parameters of your cameras.
See [this document](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) for detailed instructions on how to operate the tool.


@@ -17,6 +31,30 @@ source install/setup.bash

```sh
ros2 launch edge_auto_launch calibration_intrinsic_sample.launch.xml
```

On the first popup window, select `ROS topic` to perform calibration on live camera data
(if you are using data recorded in a ROS bag, select `ROS bag` instead).
Then select the appropriate board options and board parameters and click `Start`:

![Example: the first popup of the intrinsics calibration tool](figures/intrinsic_first_popup.png "Example: the first popup of the intrinsics calibration tool")

On the second popup window, select the topic name of the camera you would like to calibrate.
The tool also accepts a `compressed` topic as the target, and using the compressed data is preferable to reduce
the network load of the system.
Note that every ROS 2 topic has a quality-of-service (QoS) profile.
If the QoS offered by a publisher and the QoS requested by a subscriber are not a compatible combination,
communication between them will not be established. By default, edge-auto-jetson publishes
sensor data with `BEST_EFFORT` QoS reliability, so choose that option in the drop-down list.
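
If you are unsure which QoS reliability a given publisher is using, it can be inspected from the command line (generic ROS 2 tooling; the topic name below is only an example):

```sh
# Show publisher/subscriber counts and QoS settings, including Reliability (RELIABLE or BEST_EFFORT)
ros2 topic info --verbose /sensing/camera/camera0/image_raw/compressed
```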

![Example: the second popup of the intrinsics calibration tool](figures/intrinsic_second_popup.png "Example: the second popup of the intrinsics calibration tool")

After collecting data, click `calibrate` and wait until "Calibration status" changes to `idle`.
If the intrinsic parameters are estimated correctly, estimation reports are displayed in
additional popup windows:

![Example: after calculating intrinsics](figures/intrinsic_after_calibration.png "Example: after calculating intrinsics")

Repeat this step for all cameras in your system.

Intrinsic parameters for all cameras composing your system (e.g., `camera0` and `camera1` in this tutorial)
should be stored in `individual_params`.
After acquiring the files, put them in the appropriate folder on your **Jetson-based ECU** so that they are loaded and published as `camera_info` topics:
@@ -35,9 +73,10 @@ edge-auto-jetson/src/individual_params/individual_params/config/
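
Once the parameter files are in place and the Jetson-side launch has been restarted, you can check that the calibrated intrinsics are actually being published (the topic name below is only an example and depends on your camera configuration):

```sh
# Print the camera_info message to confirm the calibrated intrinsics are loaded (Ctrl-C to stop)
ros2 topic echo /sensing/camera/camera0/camera_info
```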

## 3-2. (HESAI AT128 only) Get a correction file from LiDAR

HESAI AT128 have the capability to access their own correction file stored inside them.
## 3-2. (Optional) Get a correction file from LiDAR

Some LiDAR product series, including the HESAI AT128, can provide access to a correction file stored inside the device.
To get better results, you are encouraged to download the correction file from each individual LiDAR and store it in the appropriate folder on the x86-based ECU:

13 changes: 12 additions & 1 deletion docs/tutorials/04_launch_application.md
@@ -4,7 +4,9 @@ Launch the perception application implemented in [autoware.universe](https://git

## 4-1. Jetson-based ECU

**NOTE: This following steps can be performed from your x86-based ECU via ssh.**
> **Note**
>
> The following steps can be performed from your x86-based ECU via SSH.

The following sample launches image-based object detection performed on two cameras individually.

@@ -42,3 +44,12 @@ in [autoware.universe](https://github.com/autowarefoundation/autoware.universe.g
See the READMEs of these packages for more detail.

In addition to the perception stack, this sample also launches viewers so that users can check perception results visually.
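
One quick way to confirm that the perception pipeline is up, besides the viewers, is to inspect the running graph from the x86-based ECU (generic ROS 2 tools; no repository-specific names are assumed):

```sh
# List the topics currently being published by the perception stack
ros2 topic list

# Visualize the node/topic graph to confirm the detection and fusion nodes are connected
rqt_graph
```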

As an example, the following picture shows the perception results for a system configuration consisting of one AT128 LiDAR and one C1 camera.

![Example: perception result](../sample.png "Example: perception result")

Note: The default models used in this tutorial are tuned for outdoor environments
(especially for autonomous driving contexts).
If you try this tutorial in an indoor environment, for example where the room ceiling is within the sensors' field of view,
additional preprocessing, such as cropping the range to be processed, may be required to get good results.