Setting Up AI Computer
1. Resources
Development machine requirements
Ubuntu 18.04 or Windows 10 Machine
Internet Connection
SSH service
Power Supply
As an example of a good power supply, NVIDIA has validated Adafruit’s 5V 2.5A Switching Power Supply with 20AWG MicroUSB Cable (GEO151UB-6025). It was specifically designed to overcome common problems with USB power supplies; see the linked product page for details.
Items Required
Monitor
Keyboard
Mouse
USB power supply
USB-A to micro-USB cable to connect it to the Jetson Nano
HDMI cable to connect the Jetson Nano to the display monitor
microSD card, with a minimum size of 32 GB
Internet connection (wired LAN)
2. Installation
Prepare the microSD card for the Jetson Nano
Write Image to the microSD Card
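The official guide uses Etcher; on Linux the same write can be done with dd. A minimal sketch, where the image filename and the device node are assumptions you must verify for your own download and SD card reader (check the device with lsblk before writing):

```shell
# CAUTION: dd overwrites the target device. IMAGE and SD_DEV below are
# example names only; verify the real device node with `lsblk -p` first.
IMAGE=sd-blob-b01.img.zip   # assumed name of the JetPack SD card image from NVIDIA
SD_DEV=/dev/sdX             # replace with your card's device node

# Build the write command and print it for review before running it by hand.
CMD="unzip -p $IMAGE | sudo dd of=$SD_DEV bs=1M status=progress conv=fsync"
echo "$CMD"
```

Printing the command first is deliberate: it gives you one last chance to confirm the target device before anything is overwritten.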
3. First Boot
Unfold the paper stand and place inside the developer kit box
Set the developer kit on top of the paper stand.
Insert the microSD card (with system image already written to it) into the slot on the underside of the Jetson Nano module.
Power on your computer display and connect it.
Connect the USB keyboard and mouse.
Connect your Micro-USB power supply (5V⎓2A). The Jetson Nano Developer Kit will power on and boot automatically.
A green LED next to the Micro-USB connector will light as soon as the developer kit powers on. When you boot the first time, the Jetson Nano Developer Kit will take you through some initial setup, including:
Review and accept NVIDIA Jetson software EULA
Select system language, keyboard layout, and time zone
Create username, password, and computer name
Log in
4. JetPack
Installing JetPack
First, let’s install the NVIDIA JetPack. NVIDIA JetPack SDK is the most comprehensive solution for building AI applications. Use the JetPack installer to flash your Jetson Developer Kit with the latest OS image, install developer tools for both host PC and the Developer Kit, and install libraries and APIs, samples, and documentation needed to jumpstart your development environment. The most recent version at the time of writing this article is JetPack 4.1.1.
For the most part, installation is easy. From an Ubuntu 16.04 or Ubuntu 18.04 host PC, you simply download JetPack from the NVIDIA JetPack web page (you'll have to sign in with your developer account to download JetPack) and follow the instructions in the setup guide.
jk@amma:~/util$ git clone https://github.com/jetsonhacks/jetsonUtilities.git
Cloning into 'jetsonUtilities'...
remote: Enumerating objects: 84, done.
remote: Total 84 (delta 0), reused 0 (delta 0), pack-reused 84
Unpacking objects: 100% (84/84), done.
jk@amma:~/util$ cd jetsonUtilities
jk@amma:~/util/jetsonUtilities$ python jetsonInfo.py
NVIDIA Jetson TX1
L4T 32.1.0 [ JetPack 4.2 ]
Ubuntu 18.04.2 LTS
Kernel Version: 4.9.140-tegra
CUDA 10.0.166
Note: JetPack 4.3 is the release that adds the Jetson-IO tool for configuring input/output, but the utility above detects JetPack 4.2 on this board.
5. I/O: GPIO, SPI, I2C, I2S, PCIe, etc.
The introduction of JetPack 4.3 (L4T 32.3.1) brings with it a new tool, Jetson-IO. All of the Jetson developer kits include a 40-pin GPIO expansion header. Many of the pins can be used either as General Purpose I/O (GPIO) or Special Function I/O (SFIO); SFIO covers functions such as I2C, I2S, SPI, and so on.
https://www.jetsonhacks.com/2020/05/04/spi-on-jetson-using-jetson-io/
https://github.com/JetsonHacksNano/SPI-Playground
cat /etc/nv_tegra_release
dmesg |grep SPI
"32.4.3") JETSON_JETPACK="4.4" ;;
"32.4.2") JETSON_JETPACK="4.4 DP" ;;
"32.3.1") JETSON_JETPACK="4.3" ;;
"32.2.3") JETSON_JETPACK="4.2.3" ;;
"32.2.1") JETSON_JETPACK="4.2.2" ;;
"32.2.0" | "32.2") JETSON_JETPACK="4.2.1" ;;
"32.1.0" | "32.1") JETSON_JETPACK="4.2" ;;
"31.1.0" | "31.1") JETSON_JETPACK="4.1.1" ;;
"31.0.2") JETSON_JETPACK="4.1" ;;
"31.0.1") JETSON_JETPACK="4.0" ;;
"28.2.1") JETSON_JETPACK="3.3 | 3.2.1" ;;
"28.2.0" | "28.2") JETSON_JETPACK="3.2" ;;
"28.1.0" | "28.1") JETSON_JETPACK="3.1" ;;
"27.1.0" | "27.1") JETSON_JETPACK="3.0" ;;
"24.2.1") JETSON_JETPACK="3.0 | 2.3.1" ;;
"24.2.0" | "24.2") JETSON_JETPACK="2.3" ;;
"24.1.0" | "24.1") JETSON_JETPACK="2.2.1 | 2.2" ;;
"23.2.0" | "23.2") JETSON_JETPACK="2.1" ;;
"23.1.0" | "23.1") JETSON_JETPACK="2.0" ;;
"21.5.0" | "21.5") JETSON_JETPACK="2.3.1 | 2.3" ;;
"21.4.0" | "21.4") JETSON_JETPACK="2.2 | 2.1 | 2.0 | 1.2 DP" ;;
"21.3.0" | "21.3") JETSON_JETPACK="1.1 DP" ;;
"21.2.0" | "21.2") JETSON_JETPACK="1.0 DP" ;;
*) JETSON_JETPACK="UNKNOWN" ;;
Newer L4T releases map to JetPack 4.4 and above.
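The case fragment above can be wrapped in a small function so a script can report the JetPack release for whatever L4T version it finds. A sketch (only a few of the mappings are repeated here; the L4T string normally comes from /etc/nv_tegra_release, which of course only exists on the Jetson itself):

```shell
# Map an L4T release string to its JetPack version, using the table above.
jetpack_from_l4t() {
  case "$1" in
    "32.4.3")          echo "4.4" ;;
    "32.3.1")          echo "4.3" ;;
    "32.2.1")          echo "4.2.2" ;;
    "32.1.0" | "32.1") echo "4.2" ;;
    "31.1.0" | "31.1") echo "4.1.1" ;;
    *)                 echo "UNKNOWN" ;;
  esac
}

# On a Jetson, /etc/nv_tegra_release starts with a line like:
#   # R32 (release), REVISION: 3.1, GCID: ...
l4t=$(sed -n 's/^# R\([0-9]*\) (release), REVISION: \([0-9.]*\),.*/\1.\2/p' \
      /etc/nv_tegra_release 2>/dev/null)
echo "JetPack $(jetpack_from_l4t "${l4t:-unknown}")"
```

Run on a machine without the file, the function simply reports UNKNOWN rather than failing.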
6. Internet Access
Wired LAN and Wireless LAN access
7. RDP/SSH
SSH is the usual way to log in and work on the Jetson Nano, and since the console is available directly, SSH is generally sufficient. Users who want a full remote desktop can try RDP instead.
The easiest way is probably to run an RDP server called xrdp; installation is a lot simpler than setting up VNC.
$ sudo apt-get install xrdp
After installation has completed, reboot the Jetson Nano board. Once the reboot has completed, you can check that xrdp installed successfully by running nmap from your laptop.
$ nmap jetson
The RDP server is running even though we are still at the login screen on the physical machine. While RDP is a proprietary protocol, Microsoft provides free viewers for most platforms, including the Mac (available in the Mac App Store).
8. PuTTY
PuTTY is a GUI tool for Windows that lets you access a remote machine over SSH using the remote machine's IP address.
9. File Transfer between host and remote Machine
scp <filename> jk@<ip address>:/home/jk/vector
Copies a file from the host machine to the remote machine.
scp jk@<ip address>:/home/jk/<filename> .
Copies a file from the remote machine to the host machine.
10. Git on Jetson Nano
sudo apt-get install git
11. nvcc
The microSD image of 64-bit Ubuntu Linux that NVIDIA provides for this computer has all the NVIDIA libraries and utilities you need pre-installed. The programming guide makes it clear that you need the NVIDIA C compiler, nvcc, to compile your work. But if you type nvcc at a command prompt, you just get an error that the command wasn't found. A bit of Googling reveals that everything is installed, but it was installed before your user was created, so you need to add the locations to some paths. Adding:
export PATH=${PATH}:/usr/local/cuda/bin
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/usr/local/cuda/lib64
to my .bashrc file got everything working. It also shows where CUDA is installed, which is handy since the installation includes a large collection of samples.
Compiling the deviceQuery sample
12. cmake
jk@amma:~/tmp/oct16/build$ sudo apt-get install cmake
Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?
jk@amma:~/tmp/oct16/build$ sudo killall apt apt-get
sudo rm /var/lib/dpkg/lock-frontend
sudo dpkg --configure -a
sudo poweroff
ssh jk@192.168.1.4
sudo apt-get install cmake
13. Python Installation
Option 1:
sudo apt-get install lsof
sudo apt-get install -y python3-pip
sudo apt-get install -y python3-venv
sudo pip3 install virtualenv
virtualenv jkDL2
source jkDL2/bin/activate
Option 2:
sudo rm /var/lib/dpkg/lock-frontend
sudo dpkg --configure -a
sudo poweroff
sudo apt-get install lsof
sudo apt-get install -y python3-pip
sudo apt-get install -y python3-venv
Option 3:
sudo pip3 install virtualenv
virtualenv WorkDL2
source WorkDL2/bin/activate
sudo apt-get install cmake
sudo apt-get install git
pip3 install numpy    # install NumPy
There are a few more packages and development tools to install to ensure that we have a robust set-up for our programming environment:
sudo apt-get install build-essential libssl-dev libffi-dev python-dev
Once Python is set up, and pip and other tools are installed, we can set up a virtual environment for our development projects.
sudo apt-get install -y python3-venv
venv module, part of the standard Python 3 library, so that we can create virtual environments
mkdir environments
cd environments
python3 -m venv jk_env    # creates jk_env; you can choose any name
ls    # jk_env should be listed
source environments/jk_env/bin/activate
Use virtual environments for your Python programming needs. You might be familiar with conda, but unfortunately it can't be installed on ARM. Instead you can use the python3-venv package, installed as shown above.
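Putting the venv steps together, a minimal session looks like this (the environment name and /tmp location are just examples; the --without-pip flag keeps the sketch fast and offline, so drop it for real work where you want pip inside the env):

```shell
# Create and activate a throwaway virtual environment, then confirm that
# python3 now resolves inside it before deactivating.
python3 -m venv --without-pip /tmp/demo_env
. /tmp/demo_env/bin/activate
python3 -c 'import sys; print(sys.prefix)'   # prints a path ending in demo_env
deactivate
```

The sys.prefix check is a quick way to prove the activation actually took effect, which is easy to get wrong when juggling several environments.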
///DL SDK doc from NVIDIA
https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#import_model_python
// hello world in TensorRT (end-to-end TensorFlow MNIST sample):
https://docs.nvidia.com/deeplearning/sdk/tensorrt-sample-support-guide/index.html#end_to_end_tensorflow_mnist
Python Matrices and NumPy Arrays
https://www.programiz.com/python-programming/matrix
https://stackoverflow.com/questions/28831854/how-do-i-add-python3-kernel-to-jupyter-ipython
sudo pip3 install 'ipython>=5.0.0'    # quote the spec so the shell doesn't treat >= as redirection
~/tmp/pub$ pip3 --version
pip 19.1.1 from /usr/local/lib/python3.4/dist-packages/pip (python 3.4)
/tmp/pub$ pip2 --version
pip 19.1.1 from /usr/local/lib/python2.7/dist-packages/pip (python 2.7)
// The following worked well; it installed Python 3.5 with ease.
sudo apt-get install libssl-dev openssl
wget https://www.python.org/ftp/python/3.5.0/Python-3.5.0.tgz
sudo tar -xzvf Python-3.5.0.tgz
cd Python-3.5.0
./configure
sudo make
sudo make install
//following worked well
jupyter-notebook
// Issues on TensorRT (forum threads):
https://devtalk.nvidia.com/default/board/360/container-tensorrt/
https://devtalk.nvidia.com/default/board/304/
// Download TensorRT:
https://developer.nvidia.com/tensorrt
TensorRT 5.0 Usage Survey
https://developer.nvidia.com/embedded/downloads#?search=Jetson%20Nano
TensorRT 5.1 GA (GA = general availability; RC = release candidate)
Tar file install packages for Linux (POWER)
TensorRT-5.1.3.6 for Ubuntu
// Installation of TensorRT:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-install-guide/index.html
// Template from the install guide (replace the example path with yours):
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<eg:TensorRT-5.1.x.x/lib>
// Wrong (the angle brackets must not be copied literally):
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:</home/tmp/jetson/TensorRT-5.1.3.6/lib>
// Correct:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/tmp/jetson/TensorRT-5.1.3.6/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/tmp/jetson/TensorRT-5.1.2.2/lib
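A quick way to confirm the export actually took effect, since the angle-bracket mistake above silently produces a useless path entry (sketch; the TensorRT directory is this article's example extract location, adjust to yours):

```shell
# Append the TensorRT lib directory and verify it is now one of the
# colon-separated entries in LD_LIBRARY_PATH.
TRT_LIB=/home/tmp/jetson/TensorRT-5.1.3.6/lib
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$TRT_LIB
echo "$LD_LIBRARY_PATH" | tr ':' '\n' | grep -qx "$TRT_LIB" && echo "TensorRT lib on path"
```

Splitting on colons with tr and matching the whole line (grep -x) catches partial or bracket-mangled entries that a plain substring search would miss.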
sudo pip3 install tensorrt-5.1.3.6-cp35-none-linux_ppc64le.whl
sudo pip3 install tensorrt-5.1.2.2-cp35-none-linux_x86_64.whl
sudo pip3 install uff-0.6.3-py2.py3-none-any.whl
// This version failed to install:
sudo pip3 install graphsurgeon-0.4.1-py2.py3-none-any.whl
// This one worked:
sudo pip3 install graphsurgeon-0.4.0-py2.py3-none-any.whl
///CUDA
// Upgrading Python from 3.4 to 3.5:
sudo apt-get install python3.5
python3 --version
I'm getting:
Python 3.4.3
You didn't do anything wrong; this is just how it works. Even after you have installed Python 3.6 from a PPA, the /usr/bin/python3 symlink on your Ubuntu 14.04 system still points to /usr/bin/python3.4, not /usr/bin/python3.6. Therefore, to invoke the Python 3.6 interpreter, you explicitly run python3.6.
How to install Jupyter Notebook on Ubuntu 14.04
Python Prerequisites
sudo apt install python3-pip
sudo apt install ipython3
pip3 install jupyter
Downloading ipywidgets-7.4.2-py2.py3-none-any.whl (111kB): 111kB downloaded
Cleaning up...
Exception:
Traceback (most recent call last):
pip --version
pip 1.5.4 from /usr/lib/python2.7/dist-packages (python 2.7)
sudo pip install --upgrade pip
Not uninstalling pip at /usr/lib/python2.7/dist-packages, owned by OS
// The following worked:
sudo -H pip install --upgrade pip
sudo pip3 install --upgrade pip
// This appeared to work only partly:
sudo pip3 install --upgrade setuptools
// More issues with:
pip3 install jupyter
Setting up Jupyter with Python 3 on Ubuntu
https://datawookie.netlify.com/blog/2017/06/setting-up-jupyter-with-python-3-on-ubuntu/
// Tried with sudo, but still not OK.
Installing TensorFlow
sudo pip3 install jupyter
...You are using pip version 10.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command
sudo -H pip install --upgrade pip --user
How can I uninstall python 2.7 and reinstall 3.5 in Ubuntu 14.04?
sudo apt-get install python3-notebook jupyter-core python-ipykernel
https://askubuntu.com/questions/847263/install-jupyter-notebook-for-python-2-7
sudo apt-get update
sudo apt-get autoremove
sudo apt-get -y install python3-pip python3-dev
sudo -H pip3 install --upgrade pip
sudo apt-get -y install ipython3 ipython3-notebook
sudo -H pip3 install jupyter
sudo -H pip3 install jupyter --user
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
https://github.com/jupyter/notebook/issues/2786
command not found: 'jupyter'
/usr/local/bin/pip3
/usr/local/bin/jupyter
export PATH=$PATH:~/.local/bin
File "/tmp/pip-install-Ej0KVF/tornado/setup.py", line 146, in <module>
raise ImportError("Tornado requires an up-to-date SSL module. This means "
sudo pip install 'Tornado>=4.0.0,<5.0.0'
matplotlib 1.3.1 requires nose, which is not installed.
https://github.com/googlesamples/assistant-sdk-python/issues/264
Try using
sudo easy_install nose
sudo easy_install tornado
sudo pip install 'Tornado>=4.0.0,<5.0.0'
sudo -H pip install jupyter
https://github.com/Tony607/tf_jetson_nano
Run Keras/Tensorflow model on Jetson Nano
https://ehmatthes.github.io/pcc/chapter_01/osx_setup.html
To be able to run jupyter notebook from the terminal, you need to make sure that ~/.local/bin is in your PATH.
Do this by running export PATH=$PATH:~/.local/bin for your current session, or add that line to the end of ~/.bashrc to make the change last for future sessions (e.g. by editing with nano ~/.bashrc). If you edit ~/.bashrc you will need to log out and log back in for your changes to take effect.
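Combining the two steps, you can update PATH and immediately check that the change stuck (a sketch; ~/.local/bin is where pip's --user installs place scripts such as jupyter):

```shell
# Make per-user pip script installs findable in the current session;
# append the same export line to ~/.bashrc to persist it across logins.
export PATH=$PATH:~/.local/bin
echo "$PATH" | grep -q "$HOME/.local/bin" && echo "PATH updated"
```

If "PATH updated" prints but jupyter is still not found, the install most likely went somewhere else entirely, so check with pip3 show jupyter.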
jupyter notebook    # now it worked
git clone https://github.com/Tony607/tf_jetson_nano.git
cd tf_jetson_nano
pip3 install -r requirements.txt
pip install numpy --upgrade
sudo pip install numpy --upgrade --ignore-installed
Kernel 2 vs. kernel 3 issue in Jupyter Notebook
sudo apt-get install python-dev python3-dev python-pip python3-pip
sudo python -m pip install virtualenv --user
// The following worked:
sudo apt install python-pip
sudo pip install absl-py
sudo pip install gast
sudo pip install grpcio
sudo pip install mock
sudo pip install 'tensorboard>=1.8.0'    # quote the spec so the shell doesn't treat >= as redirection
pip install numpy --upgrade
$ which jupyter
/usr/local/bin/jupyter
TensorRT, OpenCV
TensorFlow is one of the most popular deep learning frameworks today. NVIDIA® TensorRT™ is a deep learning platform that optimizes neural network models and speeds up inference across all kinds of GPU-accelerated platforms running in data centers, embedded and automotive devices. TensorFlow integrates nicely with TensorRT, which seems a natural fit, particularly as NVIDIA provides platforms well-suited to accelerate TensorFlow. This enables TensorFlow users to have extremely high inference performance and a near transparent workflow when using TensorRT.
Adding TensorRT to the TensorFlow inference workflow involves an additional step, as shown in Figure 3. In this step (highlighted in green), TensorRT builds an optimized inference graph from a frozen TensorFlow graph.
Throughout this article, we will use Python 3. Let's install TensorFlow and TensorRT on the device. You can find good instructions in the NVIDIA TensorFlow/TensorRT Models on Jetson repository. But first, you should install the python3-dev and libfreetype6-dev packages; they may prevent some problems with the matplotlib installation:
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install libfreetype6-dev python3-dev
Also, we recommend installing the latest version of TensorFlow; currently it is 1.10.1.
After installing TensorRT we had a problem with the Jupyter example. Since the example uses an ssd_inception_v2 model, which tries to allocate a lot of GPU memory, the session run process gets killed by the system. To resolve this problem we changed the model to SSD Lite MobileNet v2 from the TensorFlow model zoo, Google's collection of pre-trained object detection models with various levels of processing speed and accuracy.
14. Kernel Update
Quick Demo
With the demo applications built, we can run our first model. If you change directory to the build directory, you can run the included detectnet-console demo application. It accepts an image as input and outputs a list of coordinates of the detected bounding boxes. You'll need to specify a pre-trained model as the third argument.
The first time you run a model you may assume that the code has hung and it's not working. I know I did, because the first run takes more than "a few minutes".
git clone https://github.com/dusty-nv/jetson-inference
cd jetson-inference
git submodule update --init
mkdir build
cd build
cmake ../
make
sudo make install
cd ~/jetson-inference/build/aarch64/bin
./detectnet-console ~/dog.jpg out.jpg coco-dog
The inference portion of Hello AI World, which includes coding your own image classification application for C++ or Python, object detection, and live camera demos, can be run on your Jetson in roughly two hours or less.
https://github.com/dusty-nv/jetson-inference/blob/master/README.md
Took dog pic from https://blog.hackster.io/getting-started-with-the-nvidia-jetson-nano-developer-kit-43aa7c298797 and placed in /home/siri/dog.jpg
$ cd ~/jetson-inference/build/aarch64/bin
$ ./detectnet-console ~/dog.jpg out.jpg coco-dog