• kescusay@lemmy.world
    2 hours ago

    This is a recipe for SQL injections, race conditions, memory leaks, and keys being placed directly in code.

    Trust the output of an LLM at your peril. Literally.
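To illustrate the first failure mode: a minimal sketch (table and data are hypothetical) of the kind of SQL-injection bug LLMs often emit, next to the parameterized fix:

```python
import sqlite3

# Hypothetical users table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

def find_user_unsafe(name):
    # String interpolation: attacker-controlled `name` becomes SQL.
    return conn.execute(
        f"SELECT name FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(name):
    # Placeholder binding: the driver treats `name` as data, never as SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks every row
print(find_user_safe(payload))    # matches nothing
```

The unsafe version turns the query into `... WHERE name = '' OR '1'='1'` and dumps the whole table; the parameterized version returns no rows.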

      • kescusay@lemmy.world
        54 minutes ago

        Unless you’re checking every line and understand the codebase well enough to spot the subtle bugs it introduces that your tests don’t catch, you’re still opening yourself up to problems.