PassionDrivesSucces02.py

I'm starting to notice a theme here. Let's see how many snippets I can pull from animated movies! Okay...let's get on with it!

Project Update

If you missed the first phase of this project, you can easily hop over and check it out here. Today, I will cover Phase 2. Something to note at this point: if you are interested in working on a similar project, you will use Remote Desktop, VNC, and PuTTY. Links to these technologies are included in the Freenove tutorial. As of now, the end goal is to connect this little bot to Azure IoT Edge + Custom Vision. Now let's get to it!

Phase 2 - Functional Testing

This portion of the project provides an opportunity to warm up to Linux and its bash command line. Testing each component's functionality requires accessing and running various methods within the Python program. At this point, I consider myself a Junior Software Engineer. I mention this in case there are other Software Engineers out there who are, like me, wet behind the ears and excited at the idea of taking on a project like this one! A project of this type is a great way to introduce yourself to Linux and Python, both essential tools for any blossoming Software Engineer. You've got this! Just dive right in!
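To give a flavor of what this functional testing looks like in practice, here is a minimal sketch of a component test dispatcher in the spirit of the kit's test script. The function names `test_led` and `test_motor` are illustrative placeholders, not the actual Freenove API:

```python
import sys

# Illustrative stand-ins for the kit's component routines (not the real API).
def test_led():
    return "LED blink sequence complete"

def test_motor():
    return "Motor pulse sequence complete"

# Map a command-line argument to the matching component test, so one
# component can be exercised at a time from the bash prompt.
TESTS = {"Led": test_led, "Motor": test_motor}

def run_component_test(name):
    if name not in TESTS:
        raise SystemExit(f"Unknown component: {name}")
    print(TESTS[name]())

if __name__ == "__main__":
    run_component_test(sys.argv[1] if len(sys.argv) > 1 else "Led")
```

Running one component at a time like this makes it easy to isolate a misbehaving part before wiring everything together.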

With that said, let's jump into the lessons learned.

Lessons Learned

1. Be aware of the expected output

When conducting the test, I almost sent our little bot friend into a nosedive off my desk because I wasn't aware of what the output of this test was going to be. The title "Line_Tracking" did not exactly equate to "Race Car Mode" in my head. As you move through this series of tests, research your expected output.
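One way to avoid surprise nosedives is to dry-run a mode with the hardware calls stubbed out, so you can see what the code *would* do before the wheels actually turn. A minimal sketch, assuming the motor driver exposes a single set-speeds call (`DryRunMotor` and its method name are hypothetical, not the Freenove API):

```python
class DryRunMotor:
    """Records motor commands instead of driving real hardware."""

    def __init__(self):
        self.history = []

    def set_motor_model(self, fl, bl, fr, br):
        # Log the four wheel duty values rather than spinning the wheels.
        self.history.append((fl, bl, fr, br))
        print(f"motors -> FL={fl} BL={bl} FR={fr} BR={br}")

# Swap this in for the real driver while reading through a new test mode:
motor = DryRunMotor()
motor.set_motor_model(800, 800, 800, 800)   # would have been "full speed ahead"
motor.set_motor_model(0, 0, 0, 0)           # stop
```

Reviewing the logged commands first tells you whether "Line_Tracking" is about to turn into "Race Car Mode".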

2. Ensure you're using the correct machine to execute the test

This is huge. Once you begin using VNC on your host machine, it is imperative to stay aware of which machine you are executing the lines of code from. This small mistake cost me multiple hours of unnecessary troubleshooting. While attempting to execute the program needed to begin the video feed, I was using my host machine and not the Raspberry Pi. You must use the Raspberry Pi and VNC to execute this code.
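A cheap guard against exactly this mistake is to check which machine the script is running on before touching any hardware. On Raspberry Pi OS the file `/proc/device-tree/model` identifies the board; on a Windows or macOS host the check simply fails. A sketch:

```python
import pathlib

def on_raspberry_pi():
    """Return True only when running on a Raspberry Pi."""
    try:
        model = pathlib.Path("/proc/device-tree/model").read_text()
    except OSError:
        return False  # file is absent on desktop hosts
    return model.startswith("Raspberry Pi")

if not on_raspberry_pi():
    print("Refusing to start: run this on the Pi over VNC, not the host machine.")
```

A guard like this at the top of the program would have saved me those hours of troubleshooting.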

def face_detect(self, img):
    # Face detection runs on the client (Windows/macOS) side of the video feed.
    if sys.platform.startswith('win') or sys.platform.startswith('darwin'):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = self.face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) > 0:
            # Mark each detected face with a circle centered on its bounding box.
            for (x, y, w, h) in faces:
                self.face_x = float(x + w / 2.0)
                self.face_y = float(y + h / 2.0)
                img = cv2.circle(img, (int(self.face_x), int(self.face_y)),
                                 int((w + h) / 4), (0, 255, 0), 2)
        else:
            self.face_x = 0
            self.face_y = 0

    cv2.imwrite('video.jpg', img)

3. Enjoy the journey!

I understand this isn't a true "lesson learned". However, for me, it is. As you may know, I am in the final few months of my military transition. For those of you who have done this, I don't have to explain the complexities and stress associated with it. For those who don't know, it can best be described as being placed into a country where your language, lingo, communication style, and culture are relatively misunderstood. In recent months, I have become more comfortable with the journey into a new and exciting career as a software engineer: facing down that ever-present imposter syndrome and creating interesting projects like this one that provide an opportunity to learn and grow, both as a new engineer and as a human.

Summary

With Phase 2 complete, we will move on to Phase 3, where we enter the realm of AI, facial detection, and other cool stuff. Over the next couple of weeks, I will be researching and learning about Azure IoT Edge and how we can truly put this bot to work. Below, I will share a few of the test videos. I hope you have enjoyed this article, and I would like to thank you for reading. See you next time!


Dana Garcia

Principal Site Reliability Developer at Oracle

4y

Good to see your project coming along. You should think about buffering the movements to make the end result smoother. While it would add a few milliseconds of response latency, it would make your car react much more smoothly. I'm assuming your current code has something like:

```
lines = detectLines
if !lines.empty then
    processedlines = array
    for each line in lines
        lineAxis = check if line is vertical or horizontal
        lineXPosition = check if line is left, right, or center
        processedline = object(lineId, lineAxis, lineXPosition)
    end for each
    if processedlines contains two vertical axis lines
        if the vertical lines have one left and one right
            pulse all motors forward
        end if
    else if processedlines contain two horizontal axis lines
        pulse left motors forward and right motors backward
    else
        move camera left and recurse function
    end if
end if
```

Instead of just pulsing the motors using your logic, I recommend you buffer a few cycles using an array containing the actions to take. Then you would have a thread function on a timer of every x milliseconds. It would copy the buffer, then clear it, then start the motor actions stored in the buffer. This would result in a few milliseconds' delay at first; however, after that it would give the effect of a smooth operation.
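Dana's buffering idea could be sketched roughly like this. The `MotorBuffer` class and its names are hypothetical; it assumes the drive layer is just a callable that accepts one action:

```python
import threading
import time
from collections import deque

class MotorBuffer:
    """Queue motor actions, then flush them on a fixed timer."""

    def __init__(self, drive, interval=0.05):
        self.drive = drive          # callable that performs one motor action
        self.interval = interval    # flush period in seconds
        self._buffer = deque()
        self._lock = threading.Lock()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def push(self, action):
        with self._lock:
            self._buffer.append(action)

    def _run(self):
        while not self._stop.is_set():
            with self._lock:        # copy the buffer, then clear it
                batch = list(self._buffer)
                self._buffer.clear()
            for action in batch:    # replay the buffered actions in order
                self.drive(action)
            time.sleep(self.interval)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()
```

As described in the comment, the first flush adds a few milliseconds of latency, after which the motors receive commands at a steady cadence instead of being pulsed on every detection cycle.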

Johnny Jones Jr

CEO F7 Studios | Senior Security Technical Program Manager at Microsoft | Marine Corps Reserve Association

4y

Justin, THIS is amazing, man! I've been following this project and you've made so much progress. Great job!

Vasilis Kiakotos

Security Analyst at Microsoft

4y

This is amazing, man! Good job, can't wait to see the end product!

More articles by Dominick Blue
