Thanks for writing this, it clarifies a lot. As someone teaching computer science, the "garbage in, garbage out" principle you highlight resonates so much; it's fascinating to see how basic communication skills are becoming central to leveraging AI, rather than just technical prowess. I'm curious if you see these prompt systems evolving to be less dependent on human 'engineers' in the future, or if that communication bottleneck will always remain? Your ability to explain this so clearly is really impressive.
Thank you, this means a lot. I do believe that the models will get better at interpreting messy inputs, but the people who get the most out of AI will still be the ones who know how to express their thinking with clarity. That part hasn’t changed.
Prompt systems will slowly shift from “craft the perfect message” to “structure your thinking so the model can follow it”. That feels closer to teaching than engineering.