Fizyr Interview with Herbert ten Have

https://www.youtube.com/watch?v=QxPZKVDz65c

Fizyr Interview with Herbert ten Have - Robot Optimised Podcast

Fizyr: https://fizyr.com/

Philip English: https://philipenglish.com/

Sponsor: Robot Center: https://www.robotcenter.co.uk


Philip English: 

Hi guys, my name is Philip English, and I'm a robotics enthusiast reporting on the latest business and applications of robotics and automation. My mission is to get you robot optimised, to support industry, infrastructure and innovation for the next era. Today we have Herbert ten Have from Fizyr, who's going to give us an overview of Fizyr, and then we're going to fire a few questions at him to see how the business runs and what solutions he has. So hi there, Herbert.

Herbert ten Have: 

Afternoon.

Philip English: 

Afternoon. So I suppose, to start with, could you give us an overview of yourself and your company?

Herbert ten Have: 

Sure. I'm 57, still alive, can breathe and can do all kinds of active stuff. I run a company called Fizyr, which is a spin-off of the university, founded by a professor. He had an academic view of the world, of having robots do all kinds of jobs, and he put a brilliant team together. So now we also have a business plan and we are in quite good operation. A few years ago we won the Amazon Picking Challenge, both stowing and picking. And that's the moment we stopped doing robotics. No more robots.

Philip English: 

Okay. Okay. So you’ve moved from the hardware to the software side.

Herbert ten Have: 

The brain. Actually, what we do is like self-driving cars: we translate the image, what the camera sees, into where the robot should move. And that's also why our new name, Fizyr, means scope. We look, so we look for where it should grasp, as well as looking at future new applications and new technology.

Philip English: 

Wow. Okay. And Fizyr means scope?

Herbert ten Have: 

In Dutch. In Dutch, German and Danish it's telescope.

Philip English: 

It's sort of the same. And I saw the bit about you winning the Amazon award, and that was because of the deep learning capability of the software?

Herbert ten Have: 

Correct.

Philip English: 

How big is your team then, size-wise?

Herbert ten Have: 

24 currently.

Philip English: 

I see. So you've come from a university background, so I guess you've got quite an academic source of inspiration there, which is something that's really interesting. So what's the main problem that Fizyr is solving?

Herbert ten Have: 

Okay, good question. So let's take parcel fulfilment as an example. Every time you drop it, the robot will see a different envelope; it will always be different. Or take a towel, it will always be different. So there is no way you can program a robot for how it will look, because it will always be different. So the only way to recognise it and classify it is to generalise, to teach it to generalise, like we humans can. A one-year-old can see that this is a box and can grasp it. So what we've done is train a neural network with a lot of images, so supervised learning. We teach it: this is an envelope, this is a box, et cetera, with millions of images. And at some point it became good at doing the same thing as humans: understanding that this is a box, a bottle or a cylinder, et cetera, and picking it from a bulk of unknown parcels.
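
For readers who like to see the idea in code, here is a minimal, purely illustrative sketch of supervised image classification in that spirit: a tiny network trained to label images as box, envelope and so on. The label set, network and data are invented stand-ins, not Fizyr's actual models or training pipeline.

```python
# Illustrative sketch only: a tiny supervised classifier trained on dummy RGB images,
# in the spirit of "teach it: this is an envelope, this is a box" with labelled examples.
# Class names and shapes here are hypothetical, not Fizyr's actual pipeline.
import torch
import torch.nn as nn

CLASSES = ["box", "envelope", "bag", "cylinder"]  # hypothetical label set

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, len(CLASSES)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for a real labelled dataset of parcel images.
images = torch.rand(64, 3, 128, 128)            # 64 fake RGB crops
labels = torch.randint(0, len(CLASSES), (64,))  # 64 fake class labels

for epoch in range(5):
    optimizer.zero_grad()
    logits = model(images)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.3f}")
```

In practice the same loop would run over millions of real, labelled images rather than a single random batch.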

Philip English: 

Right, I see. So obviously the main problem is that, for a normal vision system, it's hard to distinguish where one item ends and another one starts. So within your software there are obviously a lot of deep learning tools that actually make that distinction. And I suppose the main problem for the customers is that if they can't find where the product is, they can't get a robot to pick it up, and then they can't move to an automated process to speed things up. Okay, that makes sense. And then, in regards to the actual camera system, is it one type of camera, or is it any camera that you can use and you put your software on top?

Herbert ten Have: 

Yeah, we are hardware agnostic, both on the sensor, the camera side, as well as on the robot and the end effector, the gripper. But having said that, we mostly use RGB plus depth data. RGB being what we see right now, and the depth image is to do triangulation, so we need to know the distance. For that you need a depth camera, which is a stereo camera with structured light in the middle, like a flashlight with different ways of structuring it. Based on that, the camera will create a point cloud from which you can derive the distance and see how it's positioned. So for instance, take this box: you would use suction to pick it up. In this case you see two sides, but we will see a maximum of three sides of each parcel. So we will find those places where to grasp with the robot to singulate the item.
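
As a toy illustration of going from a depth image to somewhere to place a suction cup, the sketch below scans an overhead depth map for the highest reasonably flat patch and returns its centre. The function, thresholds and scene are all invented for illustration and have nothing to do with how Fizyr's neural network actually chooses grasps.

```python
# Illustrative sketch only (not Fizyr's algorithm): given a depth image from an
# overhead camera, pick a simple top-down suction grasp point by finding the
# highest roughly-flat patch and taking its centre. All values are hypothetical.
import numpy as np

def naive_grasp_point(depth_m: np.ndarray, patch: int = 5, flat_tol: float = 0.005):
    """depth_m: HxW array of distances from the camera in metres (smaller = closer)."""
    h, w = depth_m.shape
    best = None  # (score, row, col)
    for r in range(0, h - patch, patch):
        for c in range(0, w - patch, patch):
            window = depth_m[r:r + patch, c:c + patch]
            if window.max() - window.min() > flat_tol:
                continue  # not flat enough for a suction cup
            score = -window.mean()  # closer to camera = higher object = better
            if best is None or score > best[0]:
                best = (score, r + patch // 2, c + patch // 2)
    return None if best is None else (best[1], best[2])

# Fake scene: flat floor at 1.0 m with a 0.2 m tall box somewhere in view.
depth = np.full((120, 160), 1.0)
depth[40:70, 60:100] = 0.8
print(naive_grasp_point(depth))  # pixel (row, col) on top of the box
```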

Philip English: 

Right, I see. So that's the mechanics of how it works. And I suppose, if you had two, three or four cameras, does that add on and increase how much of the products you can see, or is it just one camera and that's all you need?

Herbert ten Have: 

As always, it depends. The key thing is the camera should see it. If the camera cannot see it, then the algorithm, the neural network, cannot see it either. So the good exercise is: normally, when we want to pick something, we look into the bin and we can find what we want, but we are flexible with our eyes and we can move around. So assume you have the camera above, then you just look at the screen: can you see it on the screen? Then we can train a neural network to pick it as well. We've had many applications where the neural network was much better than human beings: for us it was too hard, but the neural network was faster, more accurate and more robust.

Philip English: 

And would it make sense, just as an example, if a camera only has one position, would it help to have a system where the camera was moving to some degree, to help pick things up?

Herbert ten Have: 

Not necessarily. In most cases we pick, let's say, from a pallet or from a bin or from a conveyor, and as long as it's visible for that camera, the robot can move there as well, because what you can see with the camera, the robot can reach from the same direction, because it's free; there are no other objects in between.

Philip English: 

Perfect. And then, to give us some examples, where have you installed this type of technology?

Herbert ten Have: 

Okay. One of the nice things, and I'll show you an image, is picking the corners of towels, and that's really hard. Towels are in bulk, white on white for instance, and you have soft corners, like when it's folded over, but you also have the hard corner of a towel. What the algorithm does is find the corners of the towel, pick it, and feed it into a folding machine, and then the towel is folded. That's something that has been in operation for a few years. So we started doing that, and now we are mostly in logistics. In e-commerce it's item picking, so picking unknown items which you order online, as well as, once the item has been packaged into a bag or a box, picking the parcels from bulk for, let's say, DHL, UPS, FedEx, et cetera. And we also do truck unloading and depalletising, which is mostly boxes as well.

Philip English: 

And for the truck unloading then, the idea is obviously the back of the truck opens up. Then, I suppose, a bit of a challenging question: where would you put the camera? Does the camera sort of drop down to the back of the truck so it sees in, and then pops up again?

Herbert ten Have: 

First of all, we don't build the machine; we are only the brains that translate the image into where to grasp. What they do is put a camera on the device that goes into the truck. So there is a robot picking each of the items and putting it on a retractable conveyor. The items, the boxes or bags, are singulated from bulk, from unknown, and I will show you the image as well, then put on a conveyor, and then it's handled in the warehouse.

Philip English: 

That makes sense. Yeah, because I understand in big logistics houses you would have a device that goes into the actual lorry with a conveyor belt on it, and then obviously the items can be picked. In regards to the towel folding, I was having a look at that FoldiMate robot, I'm not sure if you've seen it. It's like a house appliance; I think it's on Kickstarter at the moment.

Herbert ten Have: 

Yeah. In our case it's a really professional robot. It's been there for a while. It's really for the professional laundry industry, so hotels and conference centres, where they have a lot of laundry and three shifts per day dealing with it.

Philip English: 

Yeah, I've been into some of those sites myself, and it's a 24-hour operation: washing and washing and washing. So yeah.

Herbert ten Have: 

Humid and warm. It's something where, at some point, we realised we should not do this as humans. We should just sleep at night and have robots doing the work for us.

Philip English: 

Yeah, definitely. In regards to robustness, because I saw that's one of your key features, I suppose the first question is: can you use it outside? I know you wouldn't normally have a robot set up outside, but can it handle cold and snow and wet?

Herbert ten Have: 

Yeah, for the software it doesn't matter. Of course, it's all about whether you have a camera that's IP rated, waterproof and all that. I would say 95% of our applications are indoor, but some of them, like truck unloading, could be outdoors as well.

Philip English: 

And then I saw, I was reading up, that the scanning can handle, is it like a hundred items a...

Herbert ten Have: 

Second?

Philip English: 

Yeah, yeah. So if the robot was fast enough, then you could really...

Herbert ten Have: 

Yeah, there is no robot on earth that fast. The neural network is extremely fast. We use a GPU, like an Nvidia card, and we use that to do the processing. So it's extremely fast in providing all the information, including the grasp poses of the parcels.
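
A quick back-of-the-envelope comparison shows why no single robot keeps up with that output rate; note that the robot cycle time below is an assumed figure, not one mentioned in the interview.

```python
# Back-of-the-envelope only; the cycle time below is an assumed figure, not from the interview.
poses_per_second = 100           # vision output rate mentioned in the conversation
robot_cycle_time_s = 3.0         # hypothetical: one physical pick every ~3 seconds
picks_per_second_per_robot = 1 / robot_cycle_time_s
robots_one_gpu_could_feed = poses_per_second / picks_per_second_per_robot
print(robots_one_gpu_could_feed)  # ~300 robots, so vision is not the bottleneck
```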

Philip English: 

Yeah, because I was reading that and thinking that's really fast; you would need a lot of robots all attacking it. So I suppose, as you're saying, the bigger picture is really those dull, dirty and dangerous jobs: we can have robots with the Fizyr system that pick the items and do the job for us. What's your future plan? I did see from your website that you very successfully bootstrapped up to 2020, and I think you recently got some investment. So is seed expansion on your mind?

Herbert ten Have: 

Yeah, we're quite unique, I think, in Europe, where bootstrapping first and getting investment later is more common. So we validated our product with our clients. We have clients like Toyota, for four years already. We really go into a long-term relationship: we build things, go into production, and then we build the next one. The Americans would go faster, et cetera, but we would like to get everything in order and then go to the next one. That's how we build up. So now we have the product ready and we can scale more easily. We are in logistics, which is, like I said, fulfilment and parcel handling, and we do something in airports as well, but that hasn't been disclosed. So it's always logistics, and in nine out of ten cases it's either a box or a bag.

Philip English: 

Yeah, that's right. I've been in a lot of airports as well, and I've seen them deploy some robotic systems there. So I suppose that'd be a perfect target for you, really, just making sure of the different sizes of luggage and bags, because that's key: if you go on holiday, you want to make sure that your luggage is there.

Herbert ten Have: 

Yeah, but like I said, we only deal with the computer vision part. There are two more elements to it. Secondly, do you have an end effector, a gripper, that can cope with the variation? If I have to pick up a pen with suction, I need a very small suction cup, whereas if I have a bigger box, which could be heavy, I need multiple suction cups to grasp it. So: do you have a gripper, an end effector, that can cope with the variation that's to be expected? That's the second challenge. And the third one is all about the integration: how fast can you accelerate or decelerate without throwing it away? How well do you know it's attached? Is it safe for the environment, for people? Do you have a cobot or an industrial robot? And integration with the warehouse system. So there are a lot of things around it. There are three phases, and we take only the first phase: can you see it, do you have perception, do you know where to grasp?
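
To picture that "first phase" hand-off, here is a hypothetical sketch of the kind of information a perception module could pass to the gripper and integration layers. The field names and values are invented for illustration and are not Fizyr's actual interface.

```python
# Illustrative sketch only: a hypothetical message a vision module could hand to the
# robot/integration layer ("phase one: can you see it, where to grasp"). Field names
# and values are invented for illustration, not Fizyr's actual interface.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class GraspProposal:
    xyz_m: Tuple[float, float, float]   # grasp point in the camera/robot frame, metres
    normal: Tuple[float, float, float]  # surface normal to approach along
    label: str                          # e.g. "box", "bag", "envelope"
    confidence: float                   # 0..1 detection confidence
    suggested_cups: int                 # how many suction cups to engage

proposal = GraspProposal(
    xyz_m=(0.42, -0.10, 0.63),
    normal=(0.0, 0.0, 1.0),
    label="box",
    confidence=0.97,
    suggested_cups=4,
)
print(proposal)
```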

Philip English: 

I see. Yeah, that makes sense. I did see your gripper as well, and I think you made that open source, so anyone can sort of build their own. Is that the idea?

Herbert ten Have: 

Yeah, we do a lot of open source. If you go to GitHub and Fizyr, you will see a lot of RetinaNet and other things we've made open source. We have a lot of followers; we're very proud of that. It also brings in new developers: we get a lot of developers through the open source community because they know us. So the gripper is something where we give away the designs as well, because we only do software, we don't do hardware, we don't want to; we just want to stay digital. And it's a really nice market, it's so big, and there are so many challenges still to go. It's not as easy as it looks, because in warehousing, if you go to a sorting centre, it looks like a war zone. You'll see everything: car tyres, all kinds of stuff being shipped. So it's not easy, and they're working hard, and there is a lot to be done still.

Philip English: 

Yes, that's really true. I've been into a lot of those sites as well, and I can definitely see that you need a good vision system to make sure you pick up the right items. Just going back to the gripper, though: it's obviously open source, so I was going to ask a payload question, but I suppose it depends on how they would build the gripper?

Herbert ten Have: 

Well, the payload is very simple. You have a vacuum and you have a surface, so you can just calculate the maximum force you can apply in order to lift something with a certain amount of vacuum and surface. And if it's well attached, then you can grasp it. But let's say it's something like this, like a towel: air will go through, so you need to take that into account and apply more flow, like a vacuum cleaner. You can still pick it up, as long as it's not going to be sucked in, so you need a filter. Our neural network knows what it is, can classify it, and knows that in some cases you have to apply more air, more flow, in order to grasp it. And you can also measure how much air goes through, how well it's attached, in order to know how fast you can move without throwing it away.
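
As a worked example of that "vacuum times surface" reasoning, here is a small sketch in which every number is a made-up illustration rather than a Fizyr figure: the lifting force is roughly the pressure difference times the sealed cup area, and it has to exceed the parcel's weight plus the force needed to accelerate it, with some margin for leaks.

```python
# Back-of-the-envelope only; all numbers are made-up examples, not Fizyr figures.
# Vacuum lifting force is roughly pressure difference times sealed cup area,
# and it must exceed the parcel's weight plus the force needed to accelerate it.
import math

delta_p = 60_000.0        # Pa of vacuum relative to atmosphere (hypothetical)
cup_diameter = 0.04       # m, a single 40 mm suction cup (hypothetical)
safety_factor = 2.0       # margin for leaks and off-centre grasps

area = math.pi * (cup_diameter / 2) ** 2
lift_force = delta_p * area                       # N the cup can hold at best

mass = 1.5                # kg parcel (hypothetical)
accel = 5.0               # m/s^2 robot acceleration (hypothetical)
required = mass * (9.81 + accel) * safety_factor  # N needed, with margin

print(f"cup holds ~{lift_force:.0f} N, need ~{required:.0f} N "
      f"-> {'one cup is enough' if lift_force >= required else 'use more cups'}")
```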

Philip English: 

Right. And I've seen recently there's a lot more of that soft type of robotics, with all sorts of arms and grippers, which makes it even more useful for those types of operations.

Herbert ten Have: 

The key thing is the combination between what the robot sees, the information, and the hand: the eye-hand coordination. Like us humans, we have flexible hands, we can do a lot. The same applies to a robot: you can have a smart gripper with multiple suction cups, so we can apply different suction cups, different sizes and shapes, based on the material we are going to grasp. And what we also do is stacking; it's like playing Tetris. We pick something unknown, we look at what the dimensions are, and we look at the place where we want to put it, like a pallet or a bag, like groceries in micro-fulfilment. And then we place the item in that unknown environment.
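
For a flavour of the "playing Tetris" placement step, here is a toy shelf-packing heuristic that lays boxes of known footprint onto a pallet row by row. The heuristic, class and dimensions are invented for illustration and are not Fizyr's placement logic.

```python
# Illustrative sketch only (not Fizyr's placement logic): a toy "Tetris-like"
# shelf heuristic that places boxes of known width/depth onto a pallet,
# left to right, starting a new row when the current one is full.
from typing import Optional, Tuple

PALLET_W, PALLET_D = 1.2, 0.8  # metres (standard EUR pallet footprint)

class ShelfPacker:
    def __init__(self) -> None:
        self.x = 0.0        # cursor along the pallet width
        self.y = 0.0        # start of the current row
        self.row_depth = 0.0

    def place(self, w: float, d: float) -> Optional[Tuple[float, float]]:
        """Return the (x, y) corner for a w x d box, or None if the pallet is full."""
        if self.x + w > PALLET_W:               # row full -> start a new row
            self.x, self.y = 0.0, self.y + self.row_depth
            self.row_depth = 0.0
        if self.y + d > PALLET_D:               # no room for another row
            return None
        pos = (self.x, self.y)
        self.x += w
        self.row_depth = max(self.row_depth, d)
        return pos

packer = ShelfPacker()
for box in [(0.4, 0.3), (0.4, 0.3), (0.5, 0.2), (0.6, 0.4)]:
    print(box, "->", packer.place(*box))
```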

Philip English: 

Right, so you've got the ability to do that as well. I was speaking to a friend about you guys; he's got a project to do with waste recycling. As you can imagine, a massive plant, lots of all sorts of rubbish coming along a conveyor belt. I know he's looking into a technique to do it, and I think they're actually saying, look, we only need it to work 30% of the time, and then we can work on it. Have you guys had a chance to have a play with that industry?

Herbert ten Have: 

Yes, we looked at that years ago, and then we decided we want to focus on logistics, because logistics is like a blue ocean, it's so big. We claim, and we still think, we are the best in the world, although we are small, and we want to stay the best. So you need to focus, focus, focus, and just stick with that one and stay the best. You can do a lot of other stuff, and it's really interesting, it's nice to do, but in the end you need to stay the best and just focus.

Philip English: 

Focus on the main area, yeah. Okay, that's great. So what's the latest news? What's the next thing you guys are looking into?

Herbert ten Have: 

It's helping our integrators, robotics integrators, worldwide. Our software is now in production in the US, North America, in Europe of course, and in Asia, China, and soon Japan. But what we see a lot in fulfilment is that they will have micro-fulfilment centres. One of our clients, for instance, is Fabric; they have micro-fulfilment centres to bring the groceries really close to the homes, and they are in cities. That's really robotised; they call it a lights-out factory, where everything is done with robots. I would say we're still ahead of that, it will still take maybe one or two years, but that's definitely what the industry is going for: lights-out factories where a truck just comes in, it's loaded, and then the robots take over the rest.

Philip English: 

Right. And your software, either lights on or lights off, can still do the same job?

Herbert ten Have: 

We need some light, but lights-out means, let's say, no people, or just remote. But again, we only do a small part. Like self-driving cars decide where to drive, we tell the robots where to pick. That's the key thing we do.

Philip English: 

Yeah. And I suppose, in a lights-out factory, for certain items that need light, it would flash on a light so it can do its job most efficiently?

Herbert ten Have: 

Yeah, in our case we always need light; we need RGB, so we need light to see. But lights-out as a term means that they can have a factory without humans around, so they call it a lights-out factory. In sorting centres, everybody buys a lot online and the number of parcels only increases every year. So that's a big challenge in the industry to automate, because there's a shortage of humans doing this work, and we don't like working in the middle of the night or at weekends on depalletising or picking parcels. So we should have robots doing that.

Philip English: 

Yes, I totally agree. Have you had any work in hospitals or medical care? There seems to be a lot of talk about robots coming into the hospital world. I suppose you could use your software to organise certain items around the hospital. Have you guys had any traction with that?

Herbert ten Have: 

Focus, focus, focus: logistics. The only thing we still do is picking towels, and sometimes that's in a hospital as well, but we really want to focus. We do pick medicine, by the way, in e-fulfilment: picking small boxes of medicine and blisters and such. That's part of it.

Philip English: 

Right, fantastic. That's great; I think we've got a great overview of Fizyr. What's the best way to get hold of you then?

Herbert ten Have: 

Well, you can follow us on LinkedIn; we post frequently, let's say a few times a week. Of course, follow our open source if you're into development. And I'm online, so you can reach me; if I can help you, you're more than welcome.

Philip English: 

Sure, appreciate that. And I suppose that could be a mixture of end users, integrators, and anyone who needs the vision to basically move items around. So there's a big industry there.

Herbert ten Have: 

Sure. But I also meant personally, if a student has a question or whatever. We have 11 nationalities, a lot of people from abroad, and we have consent to hire people from abroad. So we're always open to new, brilliant talent joining us.

Philip English: 

Thank you, Herbert, that's great. Many thanks for your time today; it's very much appreciated. Thank you.

Herbert ten Have: 

My pleasure. Take care.

Robot Optimised Podcast #1 – Herbert Ten Have of Fizyr

