# University1652-Baseline

[Paper] [Slide] [Explore Drone-view Data] [Explore Satellite-view Data] [Explore Street-view Data] [Video Sample] [中文介绍]

Download [University-1652] upon request. You may use the request template.

This repository contains the dataset link and the code for our paper [University-1652: A Multi-view Multi-source Benchmark for Drone-based Geo-localization], ACM Multimedia 2020. The official paper link is https://dl.acm.org/doi/10.1145/3394171.3413896. We collect 1,652 buildings of 72 universities around the world. Thank you for your kind attention.

Task 1: Drone-view target localization. (Drone -> Satellite) Given one drone-view image or video, the task aims to find the most similar satellite-view image to localize the target building in the satellite view.

Task 2: Drone navigation. (Satellite -> Drone) Given one satellite-view image, the drone intends to find the most relevant place (drone-view images) that it has passed by. According to its flight history, the drone could be navigated back to the target place.

## Table of contents

## About Dataset

The dataset split is as follows:
More detailed file structure:
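A sketch of the expected layout, assuming the standard University-1652 release (the folder names below are my assumption; verify them against the official `readme.txt` shipped with the dataset):

```
University-1652/
├── readme.txt
├── train/
│   ├── drone/        # drone-view training images, one folder per building ID
│   ├── street/       # street-view training images
│   ├── satellite/    # satellite-view training images
│   └── google/       # extra street-view images collected from Google
└── test/
    ├── query_drone/
    ├── query_satellite/
    ├── query_street/
    ├── gallery_drone/
    ├── gallery_satellite/
    └── gallery_street/
```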
We note that there are no overlaps between the 33 universities of the training set and the 39 universities of the test set.

## News

**10 Jul 2022** Rainy? Night? Foggy? Snow? You may check our new paper "Multiple-environment Self-adaptive Network for Aerial-view Geo-localization" at https://arxiv.org/pdf/2204.08381

**1 Dec 2021** Fixed the issue caused by the latest torchvision, which does not allow empty subfolders. Note that some buildings do not have Google images.

**3 March 2021** GeM Pooling is added; you may enable it via the pooling option (see the GeM sketch below).

**21 January 2021** The GPU-Re-Ranking, a GNN-based real-time post-processing code, is at Here.

**21 August 2020** The transfer learning code for Oxford and Paris is at Here.

**27 July 2020** The meta data of the 1,652 buildings, such as latitude and longitude, are now available at Google Drive. (You could use Google Earth Pro to open the kml file or use vim to check the values.)

**26 July 2020** The paper is accepted by ACM Multimedia 2020.

**12 July 2020** I made the baseline with triplet loss (with soft margin) on University-1652 publicly available at Here.

**12 March 2020** I added the state-of-the-art page for geo-localization and a tutorial, which will be updated soon.

## Code Features

Now we have supported:
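As a note on the GeM pooling mentioned in the News above: generalized-mean pooling interpolates between average pooling (p = 1) and max pooling (p -> inf) with a learnable exponent. A minimal PyTorch sketch, not the repository's exact implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeM(nn.Module):
    """Generalized-mean pooling: p=1 gives average pooling, large p approaches max pooling."""
    def __init__(self, p=3.0, eps=1e-6):
        super().__init__()
        self.p = nn.Parameter(torch.ones(1) * p)  # learnable exponent
        self.eps = eps

    def forward(self, x):  # x: (B, C, H, W) feature map
        x = x.clamp(min=self.eps).pow(self.p)
        x = F.adaptive_avg_pool2d(x, 1)           # mean over spatial positions
        return x.pow(1.0 / self.p).flatten(1)     # (B, C) pooled descriptor
```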
## Prerequisites
## Getting started

### Installation
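A quick sanity check of the environment, assuming a recent PyTorch/torchvision setup (the repository may pin specific versions):

```python
# Quick environment check; the repository may require specific versions.
import torch
import torchvision

print('torch:', torch.__version__)
print('torchvision:', torchvision.__version__)
print('CUDA available:', torch.cuda.is_available())
```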
## Dataset & Preparation

Download [University-1652] upon request. You may use the request template. For CVUSA, I follow the training/test split in https://github.com/Liumouliu/OriCNN.
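After extracting the dataset, a hedged sanity check with `torchvision.datasets.ImageFolder`; the local path and the 701-building training count are assumptions based on the standard split, not guaranteed by this README:

```python
from torchvision import datasets, transforms

# Hypothetical local path; adjust to wherever you extracted the dataset.
root = 'data/University-1652'
t = transforms.Compose([transforms.Resize((256, 256)), transforms.ToTensor()])

drone_train = datasets.ImageFolder(root + '/train/drone', transform=t)
satellite_train = datasets.ImageFolder(root + '/train/satellite', transform=t)

# Both views should index the same building IDs (assumed: 701 training buildings).
print(len(drone_train.classes), len(satellite_train.classes))
assert drone_train.classes == satellite_train.classes
```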
## Train & Evaluation

### University-1652

Default setting: Drone -> Satellite. If you want to try another evaluation setting, you may change these lines in test.py: https://github.com/layumi/University1652-Baseline/blob/master/test.py#L217-L225
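The gist of switching settings is to swap which view serves as the query and which as the gallery, then rank the gallery by feature similarity. A minimal sketch of that ranking step (variable names here are hypothetical, not the exact code in `test.py`):

```python
import torch

def rank(query_feat: torch.Tensor, gallery_feat: torch.Tensor) -> torch.Tensor:
    """Rank gallery images by cosine similarity to each query.

    query_feat: (Nq, D), gallery_feat: (Ng, D); both are assumed L2-normalized.
    Returns ranked gallery indices of shape (Nq, Ng).
    """
    scores = query_feat @ gallery_feat.t()          # (Nq, Ng) cosine similarities
    return scores.argsort(dim=1, descending=True)   # best match first

# Drone -> Satellite (default): query_name, gallery_name = 'query_drone', 'gallery_satellite'
# Satellite -> Drone:           query_name, gallery_name = 'query_satellite', 'gallery_drone'
```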
### Ablation Study: Only Satellite & Drone

Set three views, but set the weight of the loss on street images to zero (see the sketch below).
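In other words, the three-branch model is still trained, but the street term contributes nothing to the objective. A sketch of the weighted combination (the function and weight names are hypothetical):

```python
import torch

def three_view_loss(loss_satellite: torch.Tensor,
                    loss_drone: torch.Tensor,
                    loss_street: torch.Tensor,
                    w_street: float = 0.0) -> torch.Tensor:
    """Combine per-view losses; set w_street=0 to ablate the street branch."""
    return loss_satellite + loss_drone + w_street * loss_street
```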
### Train & Evaluation on CVUSA

### Show the Retrieved Top-10 Results

It will save an image named `show.png` containing the top-10 retrieval results in the folder.
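A rough idea of how such a figure can be assembled; this is an illustration only, not the repository's demo code, and the paths are hypothetical:

```python
import matplotlib
matplotlib.use('Agg')  # render to file without a display
import matplotlib.pyplot as plt
from PIL import Image

def save_topk(query_path, ranked_gallery_paths, k=10, out='show.png'):
    """Plot the query followed by its top-k retrieved images and save to disk."""
    fig, axes = plt.subplots(1, k + 1, figsize=(2 * (k + 1), 2))
    for ax, path in zip(axes, [query_path] + list(ranked_gallery_paths[:k])):
        ax.imshow(Image.open(path))
        ax.axis('off')
    fig.savefig(out, bbox_inches='tight')
```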
## Trained Model

You could download the trained model at GoogleDrive or OneDrive. After download, please put the model folders under

## Citation

The following paper uses and reports the result of the baseline model. You may cite it in your paper.

```bibtex
@article{zheng2020university,
title={University-1652: A Multi-view Multi-source Benchmark for Drone-based Geo-localization},
author={Zheng, Zhedong and Wei, Yunchao and Yang, Yi},
journal={ACM Multimedia},
year={2020}
}
```

Instance loss is defined in:

```bibtex
@article{zheng2017dual,
title={Dual-Path Convolutional Image-Text Embeddings with Instance Loss},
author={Zheng, Zhedong and Zheng, Liang and Garrett, Michael and Yang, Yi and Xu, Mingliang and Shen, Yi-Dong},
journal={ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM)},
doi={10.1145/3383184},
volume={16},
number={2},
pages={1--23},
year={2020},
publisher={ACM New York, NY, USA}
}
```

## Related Work