Recognizing Geographical Locations using a GAN-Based Text-To-Image Approach

Date

2025

Advisors

Journal Title

Indonesian Journal of Electrical Engineering and Computer Science

Journal ISSN

2502-4752

DOI

Volume Title

Publisher

Institute of Advanced Engineering and Science (IAES)

Type

Article

Peer reviewed

Yes

Abstract

The goal of text-to-image generation (T2I) models is to produce photo-realistic images that align with textual descriptions. Thanks to advances in machine learning algorithms, such models can help visualize these descriptions. Given text as input, Generative Adversarial Networks (GANs) can generate a series of pictures that match the descriptions. Recent GANs have allowed earlier T2I models to achieve remarkable gains; however, they still have limitations. The main aim of this study is to address these limitations and thereby improve text-to-image generation models for location-based services. To produce high-quality images using a multi-step approach, we build an attentional generative network called AttnGAN. The fine-grained image-text matching loss needed to train the AttnGAN generator is computed using our multimodal similarity model. Our AttnGAN model achieves an inception score of 4.81 and an R-precision of 70.61 percent on the PatternNet dataset. Because PatternNet contains only photographs, we added a verbal description to each image to turn it into a paired text-image dataset. Extensive experiments show that the attention mechanisms proposed in AttnGAN, which are critical for text-to-image generation in complex scenarios, are effective.
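
The following is a minimal sketch, not the authors' implementation, of the kind of word-level attention used in AttnGAN-style generators: each image sub-region attends over the word features from the text encoder, and the resulting word-context vectors condition the next generation stage. The tensor shapes, variable names, and the projection layer are illustrative assumptions.

import torch
import torch.nn.functional as F

def word_level_attention(region_feats, word_feats, proj):
    # region_feats: (B, N, D_img) image sub-region features from the generator
    # word_feats:   (B, T, D_txt) word embeddings from the text encoder
    # proj:         hypothetical linear layer mapping D_txt -> D_img
    words = proj(word_feats)                                  # (B, T, D_img)
    scores = torch.bmm(region_feats, words.transpose(1, 2))   # (B, N, T) region-word similarities
    attn = F.softmax(scores, dim=-1)                          # attention over words for each region
    return torch.bmm(attn, words)                             # (B, N, D_img) word-context vectors

# Illustrative usage with made-up sizes
B, N, T, D_img, D_txt = 2, 64, 12, 48, 256
proj = torch.nn.Linear(D_txt, D_img)
regions = torch.randn(B, N, D_img)
words = torch.randn(B, T, D_txt)
context = word_level_attention(regions, words, proj)          # shape (2, 64, 48)

In the full multi-stage model, these word-context vectors are combined with the region features before the next upsampling stage, while a separate fine-grained image-text matching loss supervises the text and image encoders.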

Description

Open access article

Keywords

Text-to-image, Text reading, Generative Adversarial Networks (GANs), AttnGAN model, Location-based services, Road infrastructure, Deep learning

Citation

Ibrahim, Dina M. and Al-Shargabi, Amal A. (2025) Recognizing Geographical Locations using a GAN-Based Text-To-Image Approach. Indonesian Journal of Electrical Engineering and Computer Science.

Rights

Attribution 4.0 International
http://creativecommons.org/licenses/by/4.0/

Research Institute

Institute of Digital Research, Communication and Responsible Innovation