Little About Me

Hey hi, Nice meeting you! Hope you are having a great time.

My name is Manikanta Loya, a second year MS in CS student at University of California Irvine. Currently, I work as Graduate Student Researcher, advised by Dr. Sameer Singh, at UCI NLP lab. The companies I have worked for in the past include Samsung Research & Amazon. I completed my bachelor's in Electronics Engineering from the Indian Institute of Technology Varanasi, India.
I am proficient in Python C++ JAVA, JavaScript, R, Julia, C#, and other 351 programming languages. Surprised ? don't be, I only need to build and make a Large Language Model do the work for me, which I can do efficiently. I research improving robustness of Code Generation Models, data poisoning attacks on LLMs and cognitive behavior of LLMs. I am equally fascinated by the scale at which current models are being built. This has drawn me towards distributed systems and parallel computing, and I have completed a couple of projects as part of course work.
In free time, I like to explore new places, hike, play chess and create art.

I am passionate about acquiring knowledge and building software that makes people's life better. The following quotes inspire me to the best version of myself, hopefully they inspire you too:

"You were born an original, don't die a copy." - John Mason.

I see myself as a lifelong learner. In fact, everyone needs to be, or else the future ChatGPTs capabilities will be limited(they learn from what we produce!). I will be graduating in June 2023. Please feel free to contact me if you have an exciting opportunity that can help me grow and achieve my dream of helping others.

Work experience

GSR @ UCI NLP Lab [Sep 2022 - Present]

  • Large Language Models have attracted lot of attention these days and have been deployed to serve wide variety of tasks. One such task is Code Completion or Generation.
  • Most of these models are trained on publicly available sources like Github, limiting their capabilities by the content present in these sources.
  • An adversary can easily introduce poisonous data to model's training by simply making it available online.
  • As part of this research project we investigate resilience of Code Generation Models like Incoder, CodeT5
  • and develop ways to prevent such attacks.

SDE Intern @ AWS, Amazon.com [Jun 2022 - Sept 2022]

  • Spent Summer of 2022 designing, developing and testing bigdata application on large scale data for EBS Snapshot team.
  • It was fun experience, there has been many first's - first time writing code in Java, working in AWS environment.
  • We optimized the Garbage Collector workflow's runtime by 50% and cost by 60%.
  • I studied partitioning, data schema for efficient storage of data and utilized Apache Spark & Hadoop to build the application.

Software Engineer @ SAMSUNG Research Institute [July 2017 - Sept 2020]

  • Designed and developed interface for 4G cellular dongles into Samsung TV and maintained 3G dongle support
  • Optimized performance and boosted reliability of wireless & cellular internet connection. Recommended network changes and reduced internet connection time for cellular networks by 83%
  • Lead the development of 5G Cellular Dongle integration to Samsung TV's
  • Studied modem protocols such as MBIM, QMI and integrated into TV stack to achieve best results. Projects were showcased in CES 2020 ('5G-8K TV', 'Callar for Sero TV' (AR video call))

Publications

Projects

Multi-Image Generation using Cycle GAN [Mar 2022 - June 2022]

  • Designed and implemented Augmented Cycle GAN to learn and generate many to many mappings of two domain using noise.
  • Injected noise acts as latent variable controlling image generation and varying it produces different versions of single image.
  • Analyzed performance of the system on Edges2Shoes, Night2Day, CelebA and FERDB datasets
  • Code is available in manikanta-72/multi-image-generation repository

Distributed Multi-Room Chat Application [Jan 2023 - Mar 2023]

  • Designed and implemented a multi-room chat web application in distributed environment and deployed it in AWS EC2 instances.
  • Publish and Subscribe architecture was implemented using Apache Kafka as the messaging middleware.
  • Java-spring boot and React were used as backend and frontend.