I embedded myself in a vibe coding team at Gemini's AI hackathon in Singapore. Building an app in 7 hours takes real work.

  • I spent seven hours with a vibe coding team at Google's Gemini 3 hackathon in Singapore.
  • Watching from the sidelines was intense.
  • From prompting and debugging to filming the demo — here's how it all unfolded.

Just after sunrise, four vibe coding enthusiasts from Malaysia crossed into Singapore with a loose idea — and a bet that AI could build most of their app.

Hours later, they were racing to prototype it at Google's Gemini 3 Hackathon in Singapore.

The four friends, all in their late 30s to 40s, came from different professional backgrounds. Chan Wei Khjan is an accountant. Chan Ler-Kuan lectures on AI at a private university. Loh Wah Kiang works in IT. Lee How Siem, who goes by Benny, is the chief technology officer of a Malaysian startup.

Their initial idea was a "feng shui" app to analyze properties in Singapore — a potentially lucrative use case in a market obsessed with housing and wealth accumulation. Feng shui is a traditional Chinese practice that evaluates how a person's surroundings, along with birth factors, influence luck and well-being.

I embedded with the team at Google's developer space in Singapore in January to observe how a vibe-coding project comes together — or nearly falls apart — in seven hours.

9:30 a.m.: The brief

The assignment: Teams of up to four people had to build a working demo, publish a public repository with code, and submit a short video explaining their project by 5:30 p.m.

Each project had to fit into one of six tracks, including generative media, deep research, and enterprise orchestration.

Organized by Google DeepMind and 65labs, Singapore's AI builder collective, the hackathon featured a 100,000-credit Gemini API prize pool, with first place getting 30,000 credits.

By the end of the day, 189 participants had built 76 projects.

10:30 a.m.: Getting started

Due to time constraints, the team had pivoted from analyzing properties to a simpler idea: a feng shui app that could analyze a user's outfit and workspace through the phone camera in real time and assess how "lucky" they were.

Wei Khjan took the lead on prompting. He typed the first instructions into Claude, asking it to generate the workflow and code. Ler-Kuan focused on whether the AI's output aligned with feng shui concepts. Wah Kiang and Benny hovered over the codebase, refining ideas and flagging issues.

"For people who don't know how to read code, it's helpful to have people who do," Wei Khjan said.

While waiting for the code to be generated, Ler-Kuan opened Google's AI Studio to design the app's logo. They called their app "Feng Shui Banana."

11:40 a.m.: The bugs arrive

After about an hour, Claude generated the initial codebase for the app. It was designed to work with the Gemini Live API, enabling real-time image and text analysis. It ran but was riddled with bugs.

An error message flashed when they tested the camera feature, so Wei Khjan pasted the error back into Claude and asked it to fix the bug. Minutes later, the feature worked.

The app still wasn't right, though. The feng shui logic was off, especially where color analysis intersected with the user's birth date and time. Ler-Kuan manually corrected the underlying dictionary and its mappings.

The team kept prompting to tighten the features: shorter explanations, clearer output, and a more streamlined user interface.

By 12 p.m., the app was rough, but it existed.

12:20 p.m.: Lunch can wait

Lunch arrived. The team stayed glued to their screens.

The app didn't respond instantly when a user changed their outfit, nor did it update its feng shui analysis in real time.

Wei Khjan explained how much the wording of a single prompt matters. Instead of issuing commands, he asked the AI to "discuss it with me." The shift changed how the model reasoned, and it worked more like a collaborator.

After some prompting, the app updated with real-time camera analysis. It was striking to watch a feature emerge from a short back-and-forth with AI.

1 p.m.: Putting the app to the test

I helped the team test the app.

The camera correctly identified what I was wearing: a dark green polo, a yellow participant tag, and a white name card hanging from my neck. According to the app, I was already wearing colours aligned with my luck for the day.

The app suggested small tweaks, such as additional accessories, that could enhance the feng shui of my outfit.

1:20 p.m.: Pizza break

They finally had lunch and joked around to ease the tension. Four hours remained before they had to submit their project.

1:40 p.m.: Back to work

Ler-Kuan shifted focus to workspace feng shui, feeding knowledge into the model and refining how the app would evaluate desks and work setups. Wah Kiang and Benny worked on the video demo.

By 2 p.m., they had a landing page that looked animated and 3D. When I asked Wei Khjan how he felt, he smiled.

The team also revisited the app's tagline. After cycling through suggestions from multiple AI models, they settled on a line that didn't come from an AI at all: "A wisdom, not a superstition."

3 p.m.: Filming the demo

By late afternoon, the restlessness was showing. The team snacked and paced, then decided to film the video explaining their project.

They used Gemini to generate a storyboard for the demo video. The model laid out several scenes and drafted the script. The team followed along, filming clips and stitching everything together as they went.

Their workspace feature was also up and running.

4 p.m.: Final touches

The app had come together nicely. With some time to spare, they decided to add audio output for users who prefer listening to reading on a screen.

The first attempt to generate a voice using AI fell flat. It sounded robotic.

After debugging and several iterations, they landed on a voice they liked, similar to how a Chinese feng shui master might speak.

5:30 p.m.: Deadline

As the deadline approached, the team was still stitching clips for their video and nitpicking the AI-generated presenter voice.

The organizers had urged teams to submit early. With about 15 minutes to spare, they made the call to lock the final cut and hit submit.

Then it was over. The hunger hit immediately, and everyone got in line for some well-deserved food.

Even watching from the sidelines was tiring. Seven hours of vibe coding turned out to be anything but effortless.

The team didn't win a prize, but agreed that the hackathon had been worth it.

"Sometimes, the best experiences come from saying 'yes' without overthinking," said Ler-Kuan. "Innovation starts with curiosity and a little bit of spontaneity."

Do you have a story to share about vibe coding? Contact this reporter at cmlee@businessinsider.com.

Read the original article on Business Insider