
There's a lot of work now on LLM watermarking, but can we extend this to autoregressive image generation?


June 23, 2025 · 1 min read

  • LLM
  • AR

Nikola Jovanović (@ni_jovanovic)

2025-06-23 | ❤️ 316 | 🔁 53


There’s a lot of work now on LLM watermarking. But can we extend this to transformers trained for autoregressive image generation?

Yes, but it’s not straightforward 🧵(1/10) https://x.com/ni_jovanovic/status/1937186101015511406/photo/1
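The excerpt doesn't show which watermarking scheme the thread builds on, but "LLM watermarking" in this context usually refers to sampling-time schemes in the style of Kirchenbauer et al.: a pseudo-random "green" subset of the vocabulary is favored at each step, and a detector later checks whether generated tokens land in that subset more often than chance. The sketch below is a minimal, generic illustration of that idea, assuming a plain next-token setting; the function names, the seeding scheme, and the z-test detector are illustrative choices, not the thread's actual method. For autoregressive image generation, the "vocabulary" would be the VQ codebook of image tokens, which already hints at one complication the thread may have in mind: at detection time you typically only have pixels, not the original token ids.

```python
# Minimal sketch of green-list watermarking at sampling time
# (Kirchenbauer et al.-style). Illustrative only: the thread's actual
# scheme, hyperparameters, and detection procedure are not shown above.
import numpy as np

def green_list(prev_token: int, vocab_size: int, gamma: float, key: int = 42) -> np.ndarray:
    # Re-derivable partition: seed a PRNG from the previous token so the
    # detector can replay the same green/red split without the model.
    rng = np.random.default_rng((key * 1_000_003 + prev_token) % 2**32)
    return rng.permutation(vocab_size)[: int(gamma * vocab_size)]

def watermarked_sample(logits: np.ndarray, prev_token: int,
                       gamma: float = 0.5, delta: float = 2.0,
                       key: int = 42, rng=None) -> int:
    # Boost the logits of green tokens by delta, then sample as usual.
    # For AR image generation, len(logits) would be the VQ codebook size.
    if rng is None:
        rng = np.random.default_rng()
    boosted = logits.astype(float).copy()
    boosted[green_list(prev_token, len(logits), gamma, key)] += delta
    p = np.exp(boosted - boosted.max())
    p /= p.sum()
    return int(rng.choice(len(logits), p=p))

def detect_zscore(tokens: list, vocab_size: int,
                  gamma: float = 0.5, key: int = 42) -> float:
    # Count tokens that fall in their green list and compare against the
    # expected fraction gamma; a large z-score suggests a watermark.
    hits = sum(int(t in set(green_list(prev, vocab_size, gamma, key).tolist()))
               for prev, t in zip(tokens[:-1], tokens[1:]))
    n = len(tokens) - 1
    return (hits - gamma * n) / np.sqrt(n * gamma * (1 - gamma))
```

As a usage sketch, one would generate a sequence of token ids with `watermarked_sample` and later feed those ids to `detect_zscore`; the "not straightforward" part the thread alludes to presumably includes recovering usable token statistics after the ids have been decoded into an image.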


🔗 Related

See similar notes in domain-llm

Tags

type-thread domain-llm



Backlinks

  • domain-LLM
