Begin transmission…
Ethers + Polygon Amoy—and Mumbai deprecation
by Dan Buchholz
If you haven't been in the loop, Polygon deprecated the Mumbai testnet, and providers like Alchemy have also deprecated their Mumbai RPCs, so…it's time to transition to Amoy! We'll be releasing a new @tableland/evm and downstream clients later this week, which will swap Mumbai for Amoy in the set of supported chains.
Custom Amoy logic
Support for Amoy isn't quite ready out-of-the-box if you're working with ethers v6. This is due both to fluctuations in gas prices and to the fact that the default ethers `maxFeePerGas` and `maxPriorityFeePerGas` values are wildly incorrect. For example, if you use the built-in method, you'll get values like this:
```typescript
import { ethers } from "hardhat"; // using ethers within hardhat context

const { maxFeePerGas, maxPriorityFeePerGas } = await ethers.provider.getFeeData();
// maxFeePerGas: 1000000030n (note: `n` signifies `bigint`; it's wei)
// maxPriorityFeePerGas: 1000000000n
```
But, if you look at the Amoy Gas Station (here), the real values can be up to 30x larger (e.g., 35185714305 wei). If you pass the default values, your transaction will never get accepted because the fees are far too low. Instead, you must create a custom method (or use the ethers plugin) to fetch the live fees.
You can do this with a Node `fetch` call and make it generalized so that it either returns the Amoy fee data or the built-in method's fee data (for other chains):
```typescript
import { ethers, network } from "hardhat";
import { FeeData } from "ethers";

// Shape of the Amoy gas station response (only the fields used here)
interface AmoyFeeData {
  fast: {
    maxFee: number; // gwei
    maxPriorityFee: number; // gwei
  };
}

async function getFeeData(chainName: string): Promise<FeeData> {
  if (chainName === "polygon-amoy") {
    try {
      const url = "https://gasstation-testnet.polygon.technology/amoy";
      const response = await fetch(url);
      const data = (await response.json()) as AmoyFeeData; // typed with the interface above
      const feeData = new FeeData(
        null, // no gas price value needed
        BigInt(ethers.parseUnits(String(data.fast.maxFee), "gwei")),
        BigInt(ethers.parseUnits(String(data.fast.maxPriorityFee), "gwei"))
      );
      return feeData;
    } catch {
      // Fall back to empty fee data if the gas station is unreachable
      const feeData = new FeeData();
      return feeData;
    }
  } else {
    return await ethers.provider.getFeeData();
  }
}

const chainName = network.name; // "polygon-amoy" or whatever chain
const { maxFeePerGas, maxPriorityFeePerGas } = await getFeeData(chainName);
```
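From there, you can pass the fetched values as transaction overrides. Here's a minimal sketch of deploying a contract with the live fees; `MyContract` is just a placeholder name for illustration:

```typescript
// Deploy using the live fee values as overrides (sketch; `MyContract` is a placeholder)
const factory = await ethers.getContractFactory("MyContract");
const contract = await factory.deploy({ maxFeePerGas, maxPriorityFeePerGas });
await contract.waitForDeployment();
console.log("Deployed to:", await contract.getAddress());
```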
Since Amoy is pretty new, the costs to deploy a contract also seem to vary quite a bit and are more expensive than when deploying on Mumbai. For example, an upgradable contract deployed on a low gas day (Sunday) cost around 0.18 MATIC, but on a higher gas day (Monday), it was closer to 0.8 MATIC.
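If you want a rough sense of the cost before sending the transaction, you can estimate the deployment gas and multiply by the live max fee. This is a sketch under the same assumptions as above (the placeholder `MyContract` and the `getFeeData` helper):

```typescript
// Estimate the worst-case deployment cost at current Amoy fees (sketch)
const factory = await ethers.getContractFactory("MyContract"); // placeholder contract name
const deployTx = await factory.getDeployTransaction();
const gasUnits = await ethers.provider.estimateGas(deployTx);
const { maxFeePerGas } = await getFeeData("polygon-amoy"); // helper from above
const worstCase = gasUnits * (maxFeePerGas ?? 0n);
console.log(`Worst-case cost: ~${ethers.formatEther(worstCase)} MATIC`);
```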
If you need Amoy test MATIC, the Alchemy and Polygon faucets each drip 0.5 MATIC per day…which isn't great when fees are so high, but what can you do?!
Tableland Studio updates
by Dan Buchholz
We officially launched the Studio a few weeks ago and are continuing to build new features. Most of the more recent additions were changes to optimize the developer/user experience within the app. Let's review a few of them.
Dark mode
Let's be honest…everyone prefers dark mode! We changed the design to make this the default color scheme when you visit the Studio app.
Slide in/out panels
When creating projects or tables, you used to be redirected to a form to fill out the information, which led to more clicks. Now, a nice panel pops out from the right-hand side, which lets you fill out the info without the redirection.
Exposing environment IDs & improved project layout
In the Studio CLI, the concept of an environment ID was part of certain commands, but it wasn't obvious how or where these IDs came from. The UI now shows them on the project's page, along with improvements to how the project's description is rendered. When working in the CLI, you can then use the project ID to, for example, set a context that lets the CLI know you intend to work within that project.
Tweaks & more to come
Lastly, we made some small tweaks to the logic (e.g., route name guards against reserved keywords) and created a new package to help modularize forms.
Those initial changes were part of a design and user experience improvement process. Next, we're starting to dive into letting users edit certain information in the Studio. And if there's a feature you're missing or something isn't working as expected, be sure to let us know!
DePIN Corner: Wingbits
by Marla Natoli
Wingbits** is organizing a community around employing Automatic Dependent Surveillance-Broadcast (ADS-B) technology to capture, aggregate, and monetize aviation data in order to enhance aviation safety and efficiency. They offer a variety of hardware options to choose from, and participants can claim a hexagon of coverage on a first-come, first-served basis. The goal is to get enough coverage in certain areas to reach a critical mass at which they can begin monetizing the data. Participants are rewarded via a token based on the volume and quality of their coverage data, with the hope of creating a flywheel effect as more data is monetized.
ADS-B is aircraft tracking data transmitted by aircraft in real time, including the aircraft's position, velocity, altitude, and more. The main output is a set of flight tracking data feeds for both live and historic flights, which lets developers access real-time data for a variety of use cases: flight tracking websites, operational insights for airlines and charters, air traffic control organizations, emergency services, and more.
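For a sense of what such a feed looks like to a developer, here's a purely illustrative TypeScript type for a single ADS-B position report; this is an assumption based on the fields described above, not Wingbits' actual schema:

```typescript
// Illustrative only: one decoded ADS-B position report (not Wingbits' actual schema)
interface AdsbPositionReport {
  icaoAddress: string;   // 24-bit aircraft identifier, hex-encoded
  callsign?: string;     // flight callsign, if broadcast
  latitude: number;      // degrees
  longitude: number;     // degrees
  altitudeFt: number;    // barometric altitude in feet
  groundSpeedKt: number; // velocity over ground in knots
  timestamp: string;     // ISO 8601 time the message was received
}
```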
Wingbits is still early in their DePIN journey, and we’re excited to see how the incentive mechanisms they have in place can catapult their community toward creating enough data to kickstart monetization. This would be yet another industry that could begin creating valuable data outside of walled gardens and siloes, giving data consumers more options for how they collect and use data.
If you’re interested in starting to open up your data, or in exploring how decentralized data infrastructure could support collaboration, monetization, and verification, we’d love to hear from you. Feel free to set up some time with us here or join our Discord and get in touch.
**Information was taken from https://docs.wingbits.com/
Designing for the web through prototyping
by Jim Kosem
This past week has been a lot of making web prototypes, namely in Framer, a tool that allows designers to design and publish websites without coding. I learned how to make websites a long time ago, longer than most could imagine or want to imagine, and it's what led me to interaction design and user research. When you design software, nine times out of ten (or even more, actually) it comes to the world in a browser: if you're not building an actual website, you're building an application that shares many of the functions of one.
So, understanding the web as material means you need to not only know what can happen in a browser, but you have to sketch out how it looks and feels. You must play around with how it scales and changes and adapt your design to that. This means you need to prototype before you build. We’ll make websites properly, digging deeply into the frontend, but right now, design means seeing quickly how it works first.
Insights from the Intersection of AI and Crypto: Pioneering the Future of Decentralized Technologies
Last Thursday, we hosted a Twitter Spaces event featuring innovators from the forefront of "Decentralized AI." Moderated by Jonathan Victor of Ansa Research, this session delved into how blockchain and decentralized networks can revolutionize artificial intelligence. Leaders from this emerging frontier discussed the vast potential and the synergistic effects of these technologies, revealing numerous ways they could alter our economic systems, redefine privacy and security, and enhance communal governance.
Joel Thorstensson, co-founder of 3Box Labs and creator of Ceramic, emphasized the societal implications of decentralized technologies. He pointed out, "The consequence of AI models can be used for a lot of good stuff, a lot of bad stuff. And a lot of the bad stuff is like impersonation and things like that. This is actually something that the crypto space, since the early cypherpunks, foresaw… with things like public key cryptography, we can actually change the structure of society." He underscored the need for ethical frameworks to govern the deployment of these technologies, reflecting Ceramic Network’s commitment to building a secure, open web where data integrity and verifiability are paramount.
Ben Fielding, co-founder of Gensyn, discussed the practical benefits of decentralization for AI, particularly in terms of resource allocation. He noted, "Decentralization essentially allows us to unlock a lot of those resources by drastically lowering the barrier to being a supplier of the resources." Ben's insights stress how decentralization enhances the efficiency and scalability of AI applications, an approach that Gensyn is applying to create a global supercluster of computational resources. He also emphasized the vital role of open-source models in keeping the development of AI technologies accessible and inclusive.
Doug Petkanics, founder of Livepeer, focused on the market dynamics of decentralized networks, emphasizing the necessity of aligning supply with real-world demand. Doug explained, "To actually bring on real demand, and to make sure that fees are flowing to the supply side as well, you have to productize and you have to meet the market where it is, you have to make a product that fits in with the way that people want to consume these resources." He also highlighted the importance of regulatory environments that support rather than stifle innovation, ensuring that new technologies can reach their full potential without unnecessary barriers.
David Minarsch, co-founder of Valory, highlighted another dimension of decentralization by discussing the role of autonomous agents. He articulated, "Another way to think about decentralization in this context is at the other end of the spectrum, where the autonomous agent is something we share, which essentially works as a collective whole." David’s work at Valory is reshaping how communities and organizations interact with AI through their design of autonomous, co-owned systems that empower individual and collective decision-making, further underscoring the need for data verifiability to ensure trust and functionality in these systems.
The conversation explored the impact of regulatory environments on innovation, the importance of data verifiability to maintain trust in decentralized systems, and the necessity of open-source models to democratize access to technology. These interconnected themes illustrated the challenges and opportunities at the intersection of AI and blockchain technologies. As we look toward a future where these technologies are increasingly intertwined, the insights from these leaders will help provide a roadmap for navigating the challenges and seizing the opportunities that lie ahead.
We’d like to thank Jonathan for his excellent stewardship of the conversation, as well as Joel, Ben, Doug, and David, for their participation and contributions to the field broadly. Thanks to pioneers like them, the trajectory of technological development and its societal impact are in capable hands. You can listen to the full conversation here.
Stay tuned for upcoming discussions on the cutting-edge of our decentralized future by following us on Twitter and joining our Discord server.
Other updates this week
Hackathon roundup
The Filecoin Data Economy hack is over, and winners will be announced this upcoming Wednesday—three prizes for 1st/2nd/3rd will be awarded. Plus, the LearnWeb3 Decentralized Intelligence hack is also over, so keep an eye out for the top 3 hacks and top 4 runners-up!
End transmission…
Want to dive deeper, ask questions, or just nerd out with us? Jump into our Telegram or Discord—including weekly research office hours or developer office hours. And if you’d like to discuss any of these topics in more detail, comment on the issue over in GitHub!
Are you enjoying Weeknotes? We'd love your feedback—if you fill out a quick survey, we'll be sure to reach out directly with community initiatives in the future! Fill out the form here.