Computer Vision (Object Tracking)
Tessellate Imaging is a machine learning startup helping companies accelerate into the AI era. We empower better decision-making through data. Computer vision, data science, and deep learning are core elements of our AI solutions. We’re looking for passionate people capable of using AI modules to innovate, design, and implement solutions for never-before-solved real-world problems. Our team is a tight-knit, highly collaborative group on a mission to create a more curious and rational world.
Identify and track the location of moving 2D objects in videos
Investigate computer vision techniques for optimally estimating the location of a known object within videos in the presence of clutter and occlusions.
Apply predictive filtering techniques to keep tracking latency to a minimum.
Collaborate across teams, working with business, product, and engineering teams to deploy solutions to enterprise customers and see firsthand how a top-notch AI startup is built.
Guide the feature development process, from infancy to production.
Learn and work in all facets of product design, from interactions to information architecture, visual design, research, and more.
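To illustrate the predictive-filtering responsibility above, here is a minimal sketch of a constant-velocity Kalman filter for 2D tracking, written in plain NumPy. All names and noise parameters here are illustrative assumptions, not part of the role description; in practice you would tune the process and measurement covariances to your detector and frame rate.

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal 2D constant-velocity Kalman filter (illustrative sketch).
    State is [x, y, vx, vy]; measurements are noisy (x, y) positions."""

    def __init__(self, dt=1.0, process_var=1e-2, meas_var=1.0):
        self.x = np.zeros(4)                      # state estimate
        self.P = np.eye(4) * 500.0                # covariance (high initial uncertainty)
        self.F = np.eye(4)                        # transition: x += vx*dt, y += vy*dt
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))                 # measurement model: observe position only
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * process_var          # process noise (assumed value)
        self.R = np.eye(2) * meas_var             # measurement noise (assumed value)

    def predict(self):
        """Propagate the state forward; usable as a low-latency position estimate
        while waiting for the next (slower) detection."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Fuse a new (x, y) detection into the estimate."""
        y = np.asarray(z, float) - self.H @ self.x    # innovation
        S = self.H @ self.P @ self.H.T + self.R       # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

# Demo: track an object moving at (2, 1) px/frame from noisy detections.
rng = np.random.default_rng(0)
kf = ConstantVelocityKalman()
for t in range(50):
    true_pos = np.array([2.0 * t, 1.0 * t])
    kf.predict()
    kf.update(true_pos + rng.normal(0.0, 1.0, 2))
print(np.round(kf.x[:2], 1))   # filtered position near the true (98, 49)
```

The prediction step is cheap (a few matrix multiplies), which is why predictive filtering can bridge the gap between expensive detector runs and keep per-frame latency low.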
Fluent in Python and OpenCV, with an understanding of feature detection (e.g., FAST, STAR, SIFT, SURF, ORB, Harris) and motion estimation (optical flow, Lucas-Kanade tracking)
Knowledge of Kalman filters and their variants is a plus.
You're at least a rising undergraduate senior. The more experience you have, the better.
You're adept at using machine learning frameworks (such as PyTorch, TensorFlow, MXNet, etc.) to build machine learning/deep learning models and launch training experiments.
You have a proven track record of innovation in creating novel algorithms for real-world problems in deep learning and/or computer vision (through past internships or side projects), and your portfolio shows it.
Interested? Apply now!
Please include a link to your GitHub profile and portfolio of past work. Submissions without these will not be considered.