HIP-412: Standardizing NFT Metadata for Interoperability and Flexibility

Metadata is an often-overlooked component that enriches NFTs, especially collectibles and NFTs tied to real-world assets such as real estate. Among its many uses, NFT metadata stores information about a project's attributes, such as the value and rarity of an NFT, which can inform investment decisions. Metadata also proves authenticity and aids in proper licensing. Because of these benefits, standardizing NFT metadata for the future is imperative.

How does HIP-412 contribute to standardizing NFT metadata, and what future developments can we expect as a result?

On today’s episode of Gossip about Gossip, host Zenobia Godschalk speaks with Michiel Mulders, a Developer Advocate with Swirlds Labs, about the structure of HIP-412 and the importance of NFT metadata.

The two discuss:

  1. Defining the properties field of an NFT's metadata structure, and how overlooked this component of NFTs is
  2. The goal of HIP-412 as it relates to standardizing NFT metadata by providing structure and improving interoperability with ecosystem tooling
  3. The Hedera community and HIP-412 aiming for increased flexibility in NFT metadata and creating a wider range of use cases

“What the standard is solving is that you’re creating a defined structure that everyone knows about. When an NFT wants to interpret the data, they can just see that the image data will always sit under the image field. So, there is no guessing, and they can be almost 100% sure that their data is in the correct place,” Mulders said. “That’s also the reason why the community was so eager to define a new standard.”
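To make the idea of a defined structure concrete, the sketch below builds a minimal metadata object in the spirit of HIP-412. The field names (`name`, `description`, `image`, `type`, `properties`, `attributes`) follow the HIP-412 schema, but the exact field set, required fields, and version string should be checked against the published specification; the URIs and trait values are placeholders.

```python
import json

# A minimal, illustrative NFT metadata object in the spirit of HIP-412.
# Treat the field set as a sketch, not a normative schema.
metadata = {
    "name": "Example Collectible #1",
    "description": "A sample collectible used to illustrate the schema.",
    # Per the standard, image data always sits under the "image" field,
    # so wallets and explorers never have to guess where to find it.
    # The CID below is a placeholder, not a real IPFS address.
    "image": "ipfs://example-cid/example.png",
    "type": "image/png",
    # Free-form, project-specific data lives under "properties".
    "properties": {
        "edition": 1,
        "external_url": "https://example.com/collectible/1",
    },
    # Display-oriented traits (e.g. for rarity) go under "attributes".
    "attributes": [
        {"trait_type": "Rarity", "value": "Legendary"},
    ],
}

# In practice this JSON would be pinned to IPFS, and the token's
# on-chain metadata field would point at the resulting CID.
print(json.dumps(metadata, indent=2))
```

Because every tool reading this file knows the image URI lives under `image` and arbitrary project data lives under `properties`, interoperability follows from convention rather than per-project guesswork.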

Michiel Mulders is a Developer Advocate with Swirlds Labs. Mulders has been the Head of Developer Relations & Training for the Algorand Foundation, a Senior Developer Advocate for Humanitec, and a Senior Software Engineer at Lunie.
