I never thought I’d be writing code again after so many years of abandoning — and losing — my coding skills. As a product manager, my job has always been to translate customer needs into requirements, not to open an IDE or debug code. But then came generative AI. Suddenly, I found myself not just writing specs, but feeding them into an AI that could spit out working code.
It felt like an adventure — exciting, messy, and full of rookie mistakes (both mine and the AI’s). Along the way, I’ve learned a set of principles that any PM experimenting with GenAI should know.
Skeleton First, Atomic Second
The first lesson: don’t dump your entire requirements doc into AI and expect magic. That’s a recipe for hallucinations and chaos.
Instead, follow these steps:
- Skeleton First: Define the barebones structure — data models, folder layout, key classes, and services. Think of it as the map before the journey.
- Atomic Second: Feed AI one micro-requirement at a time. “Add a form.” “Make an API call using the following fields.” “Send request via DemoRequestHandler.” Each step is small, testable, and traceable.
This approach keeps AI from wandering off into the jungle of its own assumptions. Don’t forget, GenAI will make assumptions whenever things are not clear.
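To make the skeleton-first idea concrete, here is a minimal sketch of what I define up front before issuing any atomic prompts. All names here are illustrative assumptions, not code from a real project:

```typescript
// Hypothetical skeleton: data model and service stubs defined before any
// atomic prompts are issued. Every name below is an assumption for illustration.

// Data model: the shape the whole feature revolves around.
interface DemoRequest {
  name: string;
  email: string;
  message: string;
}

// Service layer: one class per concern, with methods stubbed out.
// Atomic prompts later fill these in one at a time.
class DemoRequestHandler {
  // "Send request via DemoRequestHandler" becomes exactly one atomic step.
  send(request: DemoRequest): string {
    // Placeholder until the real API call is prompted in.
    return `queued:${request.email}`;
  }
}

const handler = new DemoRequestHandler();
console.log(handler.send({ name: "Ada", email: "ada@example.com", message: "Hi" }));
```

With the skeleton in place, each atomic prompt can reference a real anchor (“implement `send` in `DemoRequestHandler`”) instead of leaving the structure up to the AI.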
Rookie Mistakes: AI Codes Like a Junior Dev
AI is fast, but it’s not wise. Left unchecked, it will:
- Hardcode API keys or URLs
- Mix UI and business logic
- Skip error handling
- Duplicate code instead of reusing utilities
In other words, it codes like a rookie developer. The fix? Prompt with architectural intent. Tell it where logic belongs, what modules to use, and what not to do.
Example: Instead of “Build a login form”, say:
“Create a login form component in auth/views. Use AuthService for validation. Do not call the API directly.”
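Here is a minimal sketch of the architectural intent behind that prompt: the form only gathers input and delegates, while the service layer owns validation. The class and function names are assumptions for illustration:

```typescript
// Assumed shape of services/auth.ts: the only layer that would talk to the backend.
class AuthService {
  validate(email: string, password: string): boolean {
    // A real implementation would call the API; here we only check basic shape.
    return email.includes("@") && password.length >= 8;
  }
}

// Assumed shape of the form in auth/views: gather input, delegate, never call the API.
function submitLoginForm(service: AuthService, email: string, password: string): string {
  return service.validate(email, password) ? "ok" : "invalid";
}

console.log(submitLoginForm(new AuthService(), "pm@example.com", "hunter2!!"));
```

The point of the separation is that when the API changes, only AuthService changes; the form component never needs to know.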
You have to keep GenAI on a short leash. If it doesn’t know how to do something, it will hallucinate — and the results won’t be what you expect. Furthermore, if it encounters an error, it may try to “fix” it in a completely different way, often by taking shortcuts. That’s how you end up with situations like AI calling raw APIs instead of using the proper components. GenAI will guess, and sometimes even invent commands that don’t exist, which leads to errors later during use.
DevOps Skeletons as Base Camps
Even non-coding PMs can orient themselves if they lean on DevOps scaffolding. Folder structures, config files, and CI/CD pipelines are like base camps in the adventure. They give you anchor points for prompting AI.
Example:
“Use the existing config/auth.json for field labels. Place the form in auth/views. Integrate with AuthService in services/auth.ts.”
This keeps AI aligned with the real architecture instead of inventing its own.
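As a sketch of what that anchor point might look like, here is a hypothetical config/auth.json inlined as an object so the example is self-contained. The field names are assumptions:

```typescript
// Hypothetical contents of config/auth.json, inlined for a self-contained sketch.
const authConfig = {
  fieldLabels: { email: "Email address", password: "Password" },
};

// The prompt pins the AI to this anchor: labels come from config,
// never from hardcoded strings scattered through components.
function labelFor(field: keyof typeof authConfig.fieldLabels): string {
  return authConfig.fieldLabels[field];
}

console.log(labelFor("email"));
```

Because the config file already exists in the scaffold, the AI has a concrete place to read from rather than inventing its own labels.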
Testing the Treasure
Automated tests are the safety rope of this adventure. AI can generate them, but they’re often brittle. PMs can:
- Define test cases based on user journeys
- Prompt AI to generate unit and integration tests
- Validate coverage against requirements
But engineers should still own the final test code. Think of AI tests as scaffolding, not the finished bridge.
AI can definitely help create test scripts — even for uploading forms, images, and dialogs. It will happily generate both happy paths and corner cases on its own. But, as always, it needs guardrails. Without clear instructions, AI may overfit to trivial cases, miss critical edge conditions, or produce fragile tests that break under real-world use.
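As a hedged sketch of the kind of tests I prompt AI to generate, here is one happy path plus the edge cases AI tends to skip unless told explicitly. The validateEmail utility is an assumption, defined inline to keep the sketch runnable:

```typescript
// Assumed utility under test, defined here so the sketch is self-contained.
function validateEmail(email: string): boolean {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email.trim());
}

// Happy path: the user journey "sign up with a normal address".
console.assert(validateEmail("pm@example.com") === true, "happy path failed");

// Edge cases AI often misses without explicit prompting.
console.assert(validateEmail("") === false, "empty input should fail");
console.assert(validateEmail("no-at-sign.example.com") === false, "missing @ should fail");
console.assert(validateEmail("  pm@example.com  ") === true, "surrounding whitespace should be tolerated");
```

Notice that the edge cases come from the requirements (what should a user be able to type?), which is exactly the part a PM can own even before engineers harden the test code.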
Best Practices Feed
Before writing any code, I now give AI a best practices feed — a short list of rules to prevent rookie mistakes. This upfront guidance saves hours of cleanup later.
What to Do
- Use config files or environment variables
- Keep functions atomic and single-purpose
- Separate rendering from validation and data logic
- Use existing utilities and modules — don’t reinvent
- Centralize constants and enums for consistency
- Include error handling and fallback logic
- Follow the app’s skeleton and folder structure
- Write tests for both happy paths and edge cases
What Not to Do
- Don’t accept hallucinated commands or libraries without verification
- Don’t hardcode API keys, tokens, or URLs
- Don’t log sensitive data (passwords, tokens, PII)
- Don’t mix UI rendering with business logic
- Don’t call APIs directly from UI components — always use service layers
- Don’t duplicate logic across files — reuse shared utilities
- Don’t create monolithic functions that handle multiple concerns
- Don’t skip edge cases or rely only on “happy path” tests
- Don’t change schemas midstream without versioning
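As a minimal sketch of the first two rules combined, here is one way to read secrets from environment variables and fail loudly instead of falling back to a hardcoded value. The variable names are assumptions for illustration:

```typescript
// Read required configuration from the environment; never hardcode secrets.
// Passing the environment in as a parameter keeps the function testable.
function requireEnv(name: string, env: Record<string, string | undefined>): string {
  const value = env[name];
  if (!value) {
    // Fail loudly rather than silently falling back to a hardcoded secret.
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Real usage would look like: requireEnv("API_KEY", process.env)
console.log(requireEnv("DEMO", { DEMO: "configured" }));
```

Putting a rule like this in the best practices feed means the AI reaches for configuration by default, instead of pasting a token into the source the first time it needs one.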
The Reality Check
Here’s the hard truth: AI development isn’t ready for non-coding PMs to lead solo.
You still need developer instincts to:
- Spot misplaced logic
- Refactor when architecture changes
- Debug when AI reverts to old patterns
- Enforce security and modularity
For now, this is low-code with high discipline, not no-code magic.
My Advantage as a PM Who Was a Developer
One reason I’ve been able to push further is that I started my career as a developer. That background gives me an edge:
- I can read and understand the code AI generates, spotting when logic is misplaced or brittle.
- I know how to feed requirements and prompts in a way that aligns with real-world architecture.
- My PM experience in requirements gathering and customer needs means I can bridge the “what” and the “how” more effectively than either role alone.
This isn’t just exploration for me — it’s a glimpse into a mid-term future for product managers, where technical literacy and AI fluency become core skills. PMs won’t replace developers, but those who can guide AI with architectural clarity will shape the next generation of product building.
Closing: The PM as Explorer
Writing code with GenAI has been an adventure. I’ve fallen into traps (like midstream schema changes that broke everything), and I’ve found shortcuts (like skeleton-first prompting).
The biggest lesson? PMs don’t need to become engineers, but they do need to think like architects. Our role is to chart the path — define the skeleton, enforce the patterns, and feed AI atomic steps.
The adventure isn’t about replacing developers. It’s about learning how to guide AI so that both humans and machines can build better, faster, and smarter together.
The Future of PM + GenAI
This isn’t just a quirky side quest where product managers dabble in code. It’s a glimpse into the mid‑term future of our craft. As GenAI matures, PMs who can combine customer empathy, requirements clarity, and just enough technical fluency will become the architects of AI‑driven workflows. We won’t replace developers — but we’ll increasingly shape how AI and engineers collaborate, setting the skeleton, defining the guardrails, and ensuring the product vision translates into working systems. The future of product management isn’t about writing every line of code ourselves; it’s about mastering the orchestration of humans and machines to build faster, safer, and smarter than ever before.