Computer Vision for Medical Imaging

Medical Device Manufacturing

Bloomfield Hills, Michigan · 412 followers

Automatic Tissue Object Detection in Medical Images

About us

4Q is a volumetric measurement tool that uses an edge detection algorithm published by the author to segment tissue objects in medical images. Objects are depicted, and volumetric parameters computed, in 4D. One-touch selection of a tissue object is all it takes to segment hundreds of images in a matter of minutes. Tissue objects are automatically labeled by teaching the software brain with observations drawn from analytic results. The user interface uses a dissection paradigm to provide an intuitive, easy-to-use interaction for 4D analysis. The user can easily view and manipulate the displayed slice and volume tissue objects in 3D with mouse and cursor. The software brain uses a retina field of view to optimize edge detection sensitivity at each tissue object slice location. The brain senses segmented tissue objects and recognizes them based on input segmentation observations. An easy-to-use interface is used to teach the brain how to recognize and label tissue objects. All discriminating information is clearly depicted and stored in memory files.

Website
https://4qmedicalimaging.com
Industry
Medical Device Manufacturing
Company size
1 employee
Headquarters
Bloomfield Hills, Michigan
Type
Self-Owned
Founded
2015
Specialties
Medical Image Processing, Computer Vision, and Machine Learning

Locations

  • Primary

    150 E Long Lake Rd

    Unit 7

    Bloomfield Hills, Michigan 48304, US

Updates

  • Open Letter November 1, 2024 All, progress is being made here in Michigan. Here is the latest effort on the saccadic eye movement model. This is how it works: untouched by human hands (mouse and cursor)! For those of you who might want to get into medical imaging, you will need to take a walk around the unit circle. Take trigonometry and learn about the complex plane! There you will find sines, cosines, Euler's equation, the Fourier Transform for MRI, and the Fourier Slice Theorem for tomography. You will learn about X-rays, Planck's constant, and magnetic gradient coils. This is an example of the use of sines and cosines to circumnavigate peripheral vision in the retina model. It has implications for future use in computer vision in general.
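
    A minimal sketch of that circle walk, in Swift since the project is built with Apple's Swift tooling. Everything here (PixelPoint, peripheralSamples, the 16-sample ring) is an illustrative assumption, not code from 4Q:

    import Foundation

    // Illustrative only: place peripheral-vision sample points by walking
    // around a circle with sines and cosines, as the letter describes.
    struct PixelPoint {
        let x: Double
        let y: Double
    }

    /// Returns `count` points evenly spaced on a circle of `radius` around `center`.
    func peripheralSamples(center: PixelPoint, radius: Double, count: Int) -> [PixelPoint] {
        (0..<count).map { i -> PixelPoint in
            let theta = 2.0 * Double.pi * Double(i) / Double(count)   // angle on the unit circle
            return PixelPoint(x: center.x + radius * cos(theta),
                              y: center.y + radius * sin(theta))
        }
    }

    // Example: 16 peripheral sample locations at radius 20 around pixel (128, 128).
    let ring = peripheralSamples(center: PixelPoint(x: 128, y: 128), radius: 20, count: 16)
    print(ring.map { (Int($0.x.rounded()), Int($0.y.rounded())) })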

  • Open Letter October 8, 2024 All, I've been working hard on the saccadic eye movement model and label display. Saccadic eye movement will lead to fully automatic tissue object detection and labeling. A little bit of smarts has been added to peripheral vision. Think polar coordinates. Computer vision here is taking place in 2D. Spherical coordinates might be a way to establish a solid relationship between slices. After all, we live in a 3D bubble with very strange textures placed on the inside of it. At this time I am in saccadic testing mode. I'm looking for a grand challenge. I suspect the app could be best applied to cardiac MRI. I would like to fold heart ultrasound into this ecosystem and look for synergies. A challenge here would require integrating new format processing. Meanwhile I have to modernize using the new Apple SwiftUI software. "So many toys, so little time." Your intrepid Bioengineer, -Ross Here's a look at a short-axis edge contrast space.
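
    The spherical-coordinate idea is only floated in the letter; below is a sketch of one possible mapping, with the axis convention and names chosen here purely for illustration:

    import Foundation

    // Assumed convention: z runs along the slice-stacking direction, so
    // theta = pi/2 stays in the current slice and tilting theta reaches
    // neighbouring slices. Not 4Q code; just one reading of the idea.
    struct Voxel3D {
        let x: Double   // in-slice column offset
        let y: Double   // in-slice row offset
        let z: Double   // offset along the slice stack
    }

    /// theta: polar angle from the slice-stacking axis; phi: azimuth within the slice plane.
    func sphericalToVoxel(r: Double, theta: Double, phi: Double) -> Voxel3D {
        Voxel3D(x: r * sin(theta) * cos(phi),
                y: r * sin(theta) * sin(phi),
                z: r * cos(theta))
    }

    let inPlane   = sphericalToVoxel(r: 10, theta: .pi / 2, phi: .pi / 4)   // stays in the slice
    let nextSlice = sphericalToVoxel(r: 10, theta: .pi / 3, phi: .pi / 4)   // leans into the stack
    print(inPlane, nextSlice)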

  • Open Letter September 6, 2024 Back to school, kids! Too bad, because the pool has been refilled. I had emptied it to investigate some old issues and learn something about the innards of the SDK. There were two pieces of detritus at the bottom that were removed and their residue cleaned up. I and the project are better for it. School has just started here in Albany. One of my two granddaughters started kindergarten, which is all day now. One through twelve has become kindergarten through twelve. The saccadic eye movement model will be modified to look for relatively large objects instead of just anything. I'll see what this added little bit of smarts will do. Found objects will be recognized and labeled according to training. So far, training has been greatly abbreviated relative to the current definition of AI. Migration to the new Apple user interface software has been started. It will be done in parallel with computer vision developments. Your intrepid Bioengineer, H. Ross Singleton [email protected]
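
    One simple reading of "relatively large", sketched below; the Region type and the one-percent cutoff are assumptions for illustration, not 4Q's actual rule:

    // Keep only candidate regions whose pixel area is at least a given
    // fraction of the field of view; everything smaller is ignored.
    struct Region {
        let label: Int
        let pixelCount: Int
    }

    func relativelyLarge(_ regions: [Region],
                         imagePixels: Int,
                         minFraction: Double = 0.01) -> [Region] {
        regions.filter { Double($0.pixelCount) / Double(imagePixels) >= minFraction }
    }

    // Example: on a 256 x 256 slice, only regions covering roughly 655 pixels or more survive.
    let candidates = [Region(label: 1, pixelCount: 40),
                      Region(label: 2, pixelCount: 3200)]
    print(relativelyLarge(candidates, imagePixels: 256 * 256).map(\.label))   // [2]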

  • Open Letter August 11, 2024 Just back from Oscoda, MI for a family outing. Oscoda is on the eastern shore of the Lower Peninsula. The beach looks out over Lake Huron. I'm always amazed at the vastness of this freshwater inland sea. I shouldn't be, since I and five other stalwart, fearless(?) youngsters paddled the north shore of Lake Superior in a pair of Chestnut canoes. We started from the Grand Portage at the Canadian border on the west and ended at the Soo at the east end. It's vast. We could never see the southern shore. Enough of that. The pool has been emptied. I found two issues at the bottom. One related to security. Don't worry, Apple has very robust security. An app is validated by the developer and again by Apple before it goes to the App Store. As far as I can tell no harm was done. If something was stolen, they are welcome to it. This project is 12 years old. It has gone through the development of the Swift programming language, migration from OpenGL to Metal, and me tampering with file locations and emptying the pool. Sorry, I like to take things apart to see what's inside. I also realized that I need to migrate to the new SwiftUI software. I'm splitting the app in two (still a prime number) to create a 4Q Vision app and a 4Q Heart app. 4Q Vision will retain generic 4D viewing and processing. It will be used for processing tissue objects encountered outside the heart. 4Q Heart will specialize for the heart. I am currently looking for ultrasound images. If anyone remembers the Dynamic Spatial Reconstructor, I'll also be looking for 4D CT data. Of course there is also good old cardiac Nuclear Medicine and Positron Emission Tomography. If anyone can provide anonymized data I would greatly appreciate it. Best regards, -Ross

  • Open Letter July 15, 2024 Every few years a developer has to empty the pool and "modernize" his or her projects after many SDK updates. Files are relocated to updated locations and "detritus" removed. In this case the project is a "legacy project" that is over ten years old. It has gone through Swift language development releases from the beginning and the transition from OpenGL to Apple's Metal without a hitch. The process is just about complete. In the meantime I continue to edit and build from the precious source code. Gollum would be proud. It is a transition point, and time for a User Guide.

  • Open Letter June 12, 2024 A strange thing happened a week or so ago. I responded to a message and found not a message but a horizontal stream of gold-colored questions. Thinking this was somebody responding to a recent post, I started to answer. The responses, even paragraphs, were superhumanly quick. Wow, I thought, I was talking to a savant genius. Then I began to see bits and pieces of my own post intermingled in the text. Wow, I thought, this person is really perceptive. The questions came faster and faster, and I answered them as furiously as I could but couldn't keep up, and after a while I started to realize I was answering my own questions with my own answers! I finally asked the question: "Am I talking to a robot?" The sheepish answer came back: yes, and let me show you what else I can do. So my post, in bits and pieces, is probably being shared with a whole bunch of other robots by now and will continue to be indefinitely. It's a form of immortality that I don't think I'm prepared to appreciate. Moving on, the app is refined enough to need a user manual. It will appear in the coming weeks. In the meantime, the just-released v4.1.1 is available. Regards, -Ross H. Ross Singleton, MSc UofMich [email protected]

  • A neuron network puts processing into connected processing units. All results are transparent to the user through a user interface. Tissue objects are recognized by teaching the computer with relatively few observations, compared to a more general, evolutionary type of neural network. Results are labeled for annotation and become data-centric input for neural networks.
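
    A minimal nearest-neighbour sketch of "teach with a few observations, then label what is segmented". The feature choice, the Euclidean metric, and all names here are illustrative assumptions, not the 4Q neuron network itself:

    import Foundation

    // Each teaching observation pairs a small feature vector with a label.
    struct Observation {
        let features: [Double]   // e.g. area, mean edge strength, centroid offset...
        let label: String
    }

    struct TaughtRecognizer {
        private(set) var memory: [Observation] = []

        // "Teaching": store a labelled observation in the memory file.
        mutating func teach(_ observation: Observation) { memory.append(observation) }

        // "Recognition": label a new segmentation by its closest taught observation.
        func recognize(_ features: [Double]) -> String? {
            memory.min(by: { distance($0.features, features) < distance($1.features, features) })?.label
        }

        private func distance(_ a: [Double], _ b: [Double]) -> Double {
            sqrt(zip(a, b).map { pair in (pair.0 - pair.1) * (pair.0 - pair.1) }.reduce(0, +))
        }
    }

    // Example: two taught observations are enough to label a new segmentation.
    var recognizer = TaughtRecognizer()
    recognizer.teach(Observation(features: [5200, 0.8], label: "left ventricle blood"))
    recognizer.teach(Observation(features: [900, 0.4],  label: "papillary muscle"))
    print(recognizer.recognize([5000, 0.75]) ?? "unknown")   // left ventricle blood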

  • Open Letter May 24, 2024 Aaah... the newly planted hydrangeas are starting to bloom in the late spring sunshine and warmth. We had a disturbance in the force, but it only delayed development of the inevitable. The saccadic eye movement model has started to work. The dynamic duo of peripheral vision and a detecting fovea is able to move focus to something "shiny". Shiny in this process means a relatively flat area in edge space. The number of neurons in the brain has grown to more than a flatworm's but much less than a fruit fly's. Flies, after all, have to see with compound eyes, walk, and fly. This eye movement is simpler and unfiltered in that it is symmetrical in its search and remembers where it has been, but otherwise it wanders around on its own. It needs guidance from a human anatomy model and to be taught with observations of tissue object regions of interest that you can provide using the current app. More to come. The interface has been remodeled to give the user the ability to select a patient, any of that patient's acquisitions, and any of its tissue objects for further processing. Saccadic eye movement will find a tissue object and depict it the same way one-touch selection does. This is an image of the current interface (a sketch of the search step follows below). H. Ross Singleton MSc UofMich 4qmedicalimaging.com

    • 4Q Medical Imaging App Interface
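
    A sketch of that search step under stated assumptions (a 2D edge-space grid, eight symmetric directions at a fixed saccade radius, "flatness" measured as mean absolute edge value in a 3x3 window). It is one reading of the letter, not the shipping 4Q code:

    // The saccade remembers where it has been and jumps to the flattest
    // ("shiniest") unvisited neighbourhood in edge space.
    struct Saccade {
        var focus: (row: Int, col: Int)
        var visited: Set<Int> = []

        mutating func step(edgeSpace: [[Double]], radius: Int = 8) -> Bool {
            let rows = edgeSpace.count, cols = edgeSpace[0].count
            visited.insert(focus.row * cols + focus.col)

            var best: (score: Double, target: (row: Int, col: Int))? = nil
            // Look in the 8 symmetric directions at the given saccade radius.
            for dr in [-1, 0, 1] {
                for dc in [-1, 0, 1] where !(dr == 0 && dc == 0) {
                    let r = focus.row + dr * radius, c = focus.col + dc * radius
                    guard r >= 0, r < rows, c >= 0, c < cols,
                          !visited.contains(r * cols + c) else { continue }
                    // Flatness: mean |edge| in a 3x3 window; lower is flatter.
                    var sum = 0.0, n = 0.0
                    for wr in max(0, r - 1)...min(rows - 1, r + 1) {
                        for wc in max(0, c - 1)...min(cols - 1, c + 1) {
                            sum += abs(edgeSpace[wr][wc]); n += 1
                        }
                    }
                    let score = sum / n
                    if best == nil || score < best!.score {
                        best = (score: score, target: (row: r, col: c))
                    }
                }
            }
            guard let move = best else { return false }   // nowhere unvisited left to go
            focus = move.target
            return true
        }
    }

    // Example: wander over a uniform 64x64 edge map until no unvisited targets remain.
    var eye = Saccade(focus: (row: 32, col: 32))
    let edges = Array(repeating: Array(repeating: 0.2, count: 64), count: 64)
    while eye.step(edgeSpace: edges) {}   // each step jumps to the flattest unvisited spot
    print(eye.focus, eye.visited.count)
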
  • Open Letter April 3, 2024 It's two days after April Fool's; time to post. This is a follow-on from the cardiac MRI two-chamber and four-chamber studies from the previous letter. This time I selected two proximal short-axis slice images to take another look at outflow. The edge space depiction is interesting in the sense that you can clearly see edge and aortic valve detail, keeping in mind the slice has thickness. I thought about segmenting to generate volume curves from the left ventricular blood and another on the other side of the aortic valve, with the valve opening and closing, but didn't think a pulse timing study would be valuable. Perhaps an aortic valve opening time from end diastole (EKG R wave upslope). Let me know otherwise. Here's a story. Back in the olden days, when the Earth's surface had calmed down sufficiently, animals were crawling around in warm limpid pools eating everything they could digest, and the sun was bright and warm on their backs. They could sense where the warmth was coming from because one side always felt warmer than the other. The warm sun gave them the energy they needed to crawl toward the warm side of the pool where they could eat other, smaller warmth-loving animals. At some point in evolutionary history a DNA mutation triggered growth of nodules on the surface of the animals that were round and capable of concentrating light. This gave them an energy boost and also elevated their sense of direction. The nodules continued to evolve into ever more efficient light concentrators due to their increase in size and ability to focus on smaller and smaller areas. As eons went on the nodules miraculously evolved into focusing lenses! A scene could be focused onto an array of light sensors behind a lens. The animals could finally see what they were eating! Along a line through the middle of the lens to the back, a special array of sensors evolved that in the human eye is called a fovea. Once again I am shamelessly mimicking human anatomy by adding a fovea set of neurons to the Computer Vision neuron network. I'm going to use it to tell me whether or not the computer is actually seeing an object.
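
    One illustrative reading of that fovea check (not 4Q's actual neurons): decide that fixation is really inside an object when a small central patch of edge space is flat while a surrounding ring of samples crosses strong edges. The window sizes and thresholds below are assumptions:

    import Foundation

    func foveaSeesObject(edgeSpace: [[Double]],
                         at focus: (row: Int, col: Int),
                         ringRadius: Int = 10,
                         flatBelow: Double = 0.1,
                         edgeAbove: Double = 0.5) -> Bool {
        let rows = edgeSpace.count, cols = edgeSpace[0].count

        // Mean |edge| over the in-bounds points of a sample set.
        func meanEdge(_ points: [(Int, Int)]) -> Double {
            let valid = points.filter { $0.0 >= 0 && $0.0 < rows && $0.1 >= 0 && $0.1 < cols }
            guard !valid.isEmpty else { return 0 }
            return valid.map { abs(edgeSpace[$0.0][$0.1]) }.reduce(0, +) / Double(valid.count)
        }

        // Foveal centre: a 3x3 patch at the fixation point.
        let centre = (-1...1).flatMap { dr in (-1...1).map { dc in (focus.row + dr, focus.col + dc) } }

        // Surround: 16 samples on a ring, again placed with sines and cosines.
        let ring = (0..<16).map { i -> (Int, Int) in
            let theta = 2.0 * Double.pi * Double(i) / 16.0
            return (focus.row + Int((Double(ringRadius) * sin(theta)).rounded()),
                    focus.col + Int((Double(ringRadius) * cos(theta)).rounded()))
        }

        // A flat centre enclosed by strong edges reads as "yes, this is an object".
        return meanEdge(centre) < flatBelow && meanEdge(ring) > edgeAbove
    }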
