• technocrit@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    17
    arrow-down
    34
    ·
    edit-2
    2 days ago

    This is invisible on paper but readable if uploaded to chatGPT.

    This sounds fake. It seems like only the most careless students wouldn’t notice this “hidden” prompt or the quote from the dog.

    Maybe if homework can be done by statistics, then it’s not worth doing.

    Maybe if a “teacher” has to trick their students in order to enforce pointless manual labor, then it’s not worth doing.

    Schools are not about education but about privilege, filtering, indoctrination, control, etc.

    • thebestaquaman@lemmy.world
      link
      fedilink
      English
      arrow-up
      30
      ·
      2 days ago

      The whole “maybe if the homework can be done by a machine then its not worth doing” thing is such a gross misunderstanding. Students need to learn how the simple things work in order to be able to learn the more complex things later on. If you want people that are capable of solving problems the machine can’t do, you first have to teach them the things the machine can in fact do.

      In practice, compute analytical derivatives or do mildly complicated addition by hand. We have automatic differentiation and computers for those things. But I having learned how to do those things has been absolutely critical for me to build the foundation I needed in order to be able to solve complex problems that an AI is far from being able to solve.

    • ArchRecord
      link
      fedilink
      English
      arrow-up
      17
      arrow-down
      1
      ·
      2 days ago

      Schools are not about education but about privilege, filtering, indoctrination, control, etc.

      Many people attending school, primarily higher education like college, are privileged because education costs money, and those with more money are often more privileged. That does not mean school itself is about privilege, it means people with privilege can afford to attend it more easily. Of course, grants, scholarships, and savings still exist, and help many people afford education.

      “Filtering” doesn’t exactly provide enough context to make sense in this argument.

      Indoctrination, if we go by the definition that defines it as teaching someone to accept a doctrine uncritically, is the opposite of what most educational institutions teach. If you understood how much effort goes into teaching critical thought as a skill to be used within and outside of education, you’d likely see how this doesn’t make much sense. Furthermore, the heavily diverse range of beliefs, people, and viewpoints on campuses often provides a more well-rounded, diverse understanding of the world, and of the people’s views within it, than a non-educational background can.

      “Control” is just another fearmongering word. What control, exactly? How is it being applied?

      Maybe if a “teacher” has to trick their students in order to enforce pointless manual labor, then it’s not worth doing.

      They’re not tricking students, they’re tricking LLMs that students are using to get out of doing the work required of them to get a degree. The entire point of a degree is to signify that you understand the skills and topics required for a particular field. If you don’t want to actually get the knowledge signified by the degree, then you can put “I use ChatGPT and it does just as good” on your resume, and see if employers value that the same.

      Maybe if homework can be done by statistics, then it’s not worth doing.

      All math homework can be done by a calculator. All the writing courses I did throughout elementary and middle school would have likely graded me higher if I’d used a modern LLM. All the history assignment’s questions could have been answered with access to Wikipedia.

      But if I’d done that, I wouldn’t know math, I would know no history, and I wouldn’t be able to properly write any long-form content.

      Even when technology exists that can replace functions the human brain can do, we don’t just sacrifice all attempts to use the knowledge ourselves because this machine can do it better, because without that, we would be limiting our future potential.

      This sounds fake. It seems like only the most careless students wouldn’t notice this “hidden” prompt or the quote from the dog.

      The prompt is likely colored the same as the page to make it visually invisible to the human eye upon first inspection.

      And I’m sorry to say, but often times, the students who are the most careless, unwilling to even check work, and simply incapable of doing work themselves, are usually the same ones who use ChatGPT, and don’t even proofread the output.

    • jj4211@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      1 day ago

      Even if the prompt is clear, the ask is a trap in and of itself. Because it’s not possible to actually do, but it will induce an LLM to synthesize something that sounds right.

      If it was not ‘hidden’, then everyone would ask about that requirement, likely in lecture, and everyone would figure out that they need to at least edit out that part of the requirements when using it as a prompt.

      By being ‘hidden’, then most people won’t notice it at all, and the few that do will fire off a one-off question to a TA or the professor in an email and be told “disregard that, it was a mistake, didn’t notice it due to the font color” or something like that.

    • TheRealKuni@lemmy.world
      link
      fedilink
      English
      arrow-up
      11
      ·
      2 days ago

      Maybe if homework can be done by statistics, then it’s not worth doing.

      Lots of homework can be done by computers in many ways. That’s not the point. Teachers don’t have students write papers to edify the teacher or to bring new insights into the world, they do it to teach students how to research, combine concepts, organize their thoughts, weed out misinformation, and generate new ideas from other concepts.

      These are lessons worth learning regardless of whether ChatGPT can write a paper.

    • Goodman@discuss.tchncs.de
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 days ago

      It does feel like some teachers are a bit unimaginative in their method of assessment. If you have to write multiple opinion pieces, essays or portfolios every single week it becomes difficult not to reach for a chatbot. I don’t agree with your last point on indoctrination, but that is something that I would like to see changed.

    • Smith6826@sopuli.xyz
      link
      fedilink
      English
      arrow-up
      2
      ·
      2 days ago

      All it takes is a student to proofread their paper to make sure it’s not complete nonsense. The bare minimum a cheating student should do.