Moved the project idea for underwater mapping and exploration using spatially anchored panoramas to the second position in the ideas list to increase its discoverability, as it has become important.
`pages/development/google-summer-of-code/2026.md` (29 additions, 29 deletions)
@@ -69,6 +69,35 @@ To apply for this project idea, please complete <a target="_blank" href="https:/

<hr>

### SeaSee'r: Underwater Mapping and Exploration using Spatially Anchored Panoramas

350 Hours
{: .label .label-blue }

{: .highlight }
> **Required Skills**: Web Development (Full Stack), Web 3D Graphics, Real-Time Rendering, Python, Computer Vision, Git, REST APIs
> **Possible Mentors**: Benedikt Kantz, Tobias Schreck, Wolfgang Slany
> **Expected Outcome**: A navigable underwater map based on real multimodal data from a remotely operated vehicle (ROV), including panoramas (videos) and sonar data
Modern underwater ROVs provide a multitude of different data streams, both in real time and post-operation, for scientific applications. There is, however, a lack of systems that integrate the domain-specific requirements for navigating and exploring these data, especially for frequent site visits to study the behavior of marine creatures.
The project therefore proposes the creation of an open-source system that ingests data from the ROV and displays the panorama images spatially anchored based on the position and orientation of the vehicle at the corresponding time stamps (a minimal sketch of this anchoring step follows the list below). The resulting interface should also incorporate multimodal data, e.g. sonar data and possibly further data streams such as point clouds, directly into the visual exploration system. Once this basic visualization system is set up, specific exploration tasks can be implemented, such as:
- Mission and path planning, with
  - searching prior exploration passes for detailed mission planning at specific sites,
  - point cloud stitching / image registration to compare passes over the same regions,
  - site re-identification, even in noisy environments, through multimodal data use, and possibly
  - live registration of new images and feeds (during missions) to improve navigation accuracy.
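As a rough illustration of the anchoring step mentioned above, the sketch below interpolates the vehicle pose at a panorama frame's capture time from a log of timestamped poses; the `Pose` layout and all names are assumptions made for illustration, not an existing API of the project.

```python
# Sketch: anchor a panorama frame by interpolating the vehicle pose at the
# frame's capture time from a log of timestamped poses (all names assumed).
from bisect import bisect_left
from dataclasses import dataclass


@dataclass
class Pose:
    t: float                              # capture time in seconds
    position: tuple[float, float, float]  # x, y, z in a local map frame
    heading_deg: float                    # vehicle yaw; roll/pitch omitted


def lerp(a: float, b: float, w: float) -> float:
    return a + (b - a) * w


def pose_at(log: list[Pose], t: float) -> Pose:
    """Linearly interpolate between the two log samples bracketing time t."""
    times = [p.t for p in log]
    i = bisect_left(times, t)
    if i == 0:
        return log[0]       # before the log starts: clamp to the first sample
    if i == len(log):
        return log[-1]      # after the log ends: clamp to the last sample
    p0, p1 = log[i - 1], log[i]
    w = (t - p0.t) / (p1.t - p0.t)
    x, y, z = (lerp(a, b, w) for a, b in zip(p0.position, p1.position))
    # Naive angle lerp; a real system must handle the 360°/0° heading wrap.
    return Pose(t=t, position=(x, y, z),
                heading_deg=lerp(p0.heading_deg, p1.heading_deg, w))
```

An ingested panorama frame captured at time `t` would then be placed in the 3D scene at `pose_at(log, t)`, with its spherical image oriented by the interpolated heading.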
The project serves as an exploratory foray into bringing advances in spatial exploration and mapping techniques below the sea. It requires an extensible, longevity-focused architecture that enables downstream additions of data modalities and support for different vehicle types. The specific mission-planning aspects need to be flexible in structure as well, to allow the supported planning tooling to evolve rapidly, as the scientific goals might shift over time or new aspects of marine life might be discovered that require new paradigms.
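One possible way to keep the ingest layer open to new modalities is a small plugin-style parser registry, sketched below; the pattern and every name here are illustrative assumptions, not a design prescribed by the project.

```python
# Sketch: a plugin-style registry so new data modalities (sonar, point clouds,
# feeds from other vehicle types) can be added without touching the core
# ingest code. The pattern and all names are illustrative assumptions.
from typing import Callable, Dict

ModalityParser = Callable[[bytes], dict]   # raw payload -> normalized record
_PARSERS: Dict[str, ModalityParser] = {}


def register_modality(name: str):
    """Decorator registering a parser for one ROV data modality."""
    def decorator(fn: ModalityParser) -> ModalityParser:
        _PARSERS[name] = fn
        return fn
    return decorator


@register_modality("sonar")
def parse_sonar(payload: bytes) -> dict:
    # Placeholder: a real parser would decode the vendor-specific sonar format.
    return {"modality": "sonar", "bytes": len(payload)}


def ingest(modality: str, payload: bytes) -> dict:
    """Dispatch a raw payload to the parser registered for its modality."""
    return _PARSERS[modality](payload)
```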
<hr>
### Pocket Paint Flutter: backwards compatibility with the old Android app
350 Hours
@@ -600,35 +629,6 @@ The platform is intentionally scoped as a learning-oriented reference implementa
<hr>
*(The 29 removed lines are the identical SeaSee'r section shown above, deleted from its previous position at this point in the page.)*