
Why does Osana take so long? (A programmer's point of view on the current situation)

I decided to write a comment somewhere about «Why does Osana take so long?» and what can be done to shorten this time. It turned into a long essay. Here's the TL;DR:
The cost of never paying down this technical debt is clear; eventually the cost to deliver functionality will become so slow that it is easy for a well-designed competitive software product to overtake the badly-designed software in terms of features. In my experience, badly designed software can also lead to a more stressed engineering workforce, in turn leading to higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, due to the complexity in a given codebase, the ability to accurately estimate work will also disappear.
Junade Ali, Mastering PHP Design Patterns (2016)
Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to give one nonetheless. I am not much interested in Yandere Simulator or in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers, so that they never end up in Alex's situation, especially considering that he is definitely not the first to get himself knee-deep in development hell (remember Star Citizen?) and he is definitely not the last.
On the one hand, people see that Alex works incredibly slowly, the equivalent of about one hour per day, comparing it with, say, Papers, Please, a game developed in nine months from start to finish by one guy. On the other hand, Alex himself most likely feels that he works to complete exhaustion every day. In fact, I highly suspect that both statements are correct! Because of mistakes made during the early development stages, which are highly unlikely to be fixed given the pressure on the developer right now and his overall approach to coding, the cost of adding any relatively large feature (e.g. Osana) can be pretty much comparable to the cost of creating a fan game from start to finish. Trust me, I've seen his leaked source code (don't tell anybody about that) and I know what I am talking about. The largest problem in Yandere Simulator right now is its super slow development. So, without further ado, let's talk about how «implementing the low-hanging fruit» crippled the development and, more importantly, what I think the ideal course of action would be to get out of this. I'll try to explain things in the simplest terms possible.
  1. else if's and the lack of any refactoring in general
The most «memey» one. I won't talk about performance, though (a switch statement is not better in terms of performance; that is a myth. If the compiler detects code that can be turned into a jump table, for example, it will do it, no matter whether it is a chain of if's or a switch statement. Compilers nowadays are way smarter than one might think). Just take a look here. I know that it's his older JavaScript code, but, believe it or not, this piece is still present in the C# version relatively untouched.
I refactored this code for you using the C language (mixed with C++, since there's no this pointer in pure C). Note that the else if's are still there; else if's are not the problem by themselves.
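Since his refactored snippet isn't reproduced here, below is a minimal sketch (mine, not from the leaked source) of the flag-based approach described above; the names EvidenceFlags and reputation_penalty, and the penalty values, are invented for illustration:
```c
#include <stdio.h>

/* Each kind of evidence a witness can see is one bit. */
typedef enum {
    EVIDENCE_TRESPASSING = 1 << 0,
    EVIDENCE_BLOOD       = 1 << 1,
    EVIDENCE_WEAPON      = 1 << 2,
    EVIDENCE_MURDER      = 1 << 3
} EvidenceFlags;

/* One table entry per flag; a new case is a new flag plus one line here,
   and every combination (e.g. Trespassing + Blood) works automatically. */
static const struct { unsigned flag; int penalty; } penalties[] = {
    { EVIDENCE_TRESPASSING,  5 },
    { EVIDENCE_BLOOD,       10 },
    { EVIDENCE_WEAPON,      15 },
    { EVIDENCE_MURDER,      20 },
};

int reputation_penalty(unsigned witnessed)
{
    int total = 0;
    for (size_t i = 0; i < sizeof penalties / sizeof penalties[0]; i++)
        if (witnessed & penalties[i].flag)
            total += penalties[i].penalty;
    return total;
}

int main(void)
{
    unsigned seen = EVIDENCE_TRESPASSING | EVIDENCE_BLOOD;
    printf("penalty: %d\n", reputation_penalty(seen)); /* prints 15 */
    return 0;
}
```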
The refactored code is just objectively better for one simple reason: it is shorter while not being obscure, and now it can handle, say, the Trespassing-and-Blood case without any input from the developer, thanks to the flags. Basically, the shorter your code, the more you can see on screen without spreading your attention too thin. As a rule of thumb, the fewer lines there are, the easier it is to work with the code. Just don't overdo it, unless you are going to participate in the International Obfuscated C Code Contest. Let me reiterate:
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.
Antoine de Saint-Exupéry
This is why refactoring, the activity of rewriting your old code so it does the same thing but does it faster, more generically, in fewer lines or more simply, is so powerful. In my experience, you can only keep a module/class/whatever in your head if it does not exceed ~1000 lines, maybe ~1500. Splitting a 17,000-line class into smaller classes probably won't improve performance at all, but it will make working with parts of that class way easier.
Is it too late now to start refactoring? Of course NO: better late than never.
  2. Comments
If you think that because you wrote this code you'll always easily remember it, I have some bad news for you: you won't. In my experience, one week and that's it. That's why comments are so crucial. It is not necessary to put a ton of comments everywhere; just the general idea will help you out in the future, even if you think that It Just Works™ and you'll never ever need to fix it. In large-scale projects, the time spent writing and debugging one line of code almost always exceeds the time needed to write one comment. Moreover, the best code is self-evident. In the example above, what the hell does (float) 6 mean? Why not wrap it in a constant with a good, self-descriptive name? Again, it won't affect performance, since the C# compiler is smart enough to silently remove this constant from the generated code and place its value into the method invocation directly. Such constants are there for you.
I rewrote my code above a little bit to illustrate this. With those comments, you don't have to remember your code at all, since its functionality is outlined in two tiny lines of comments above it. Moreover, even a person with zero programming knowledge can figure out the purpose of this code. It took me less than half a minute to write those comments, but it'll probably save me quite a lot of «what was I thinking back then» one day.
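The rewritten snippet itself isn't included here either, so here is a small sketch of the idea, a magic number replaced by a named, commented constant; the name, value and function are illustrative assumptions:
```c
#include <stdbool.h>

/* Maximum distance (in metres) at which a student notices the player;
   one named constant instead of a bare «(float) 6» in the middle of the code. */
static const float VISION_RANGE_METRES = 6.0f;

/* A student can see the player only within vision range. */
bool can_see_player(float distance_to_player)
{
    return distance_to_player <= VISION_RANGE_METRES;
}
```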
Is it too late now to start adding comments? Again, of course NO. Don't be lazy: redirect all the typing that goes into the «debunk» page (which pretty much does the opposite of debunking, but who am I to judge?) into some useful comments.
  3. Unit testing
This is often neglected, but consider the following. You wrote some code, you ran your game, you saw a new bug. Was it introduced just now? Is it a problem in older code that has only shown up because you had never actually exercised that code until now? Where should you search for it? You have no idea, and you have one painful debugging session ahead. Just imagine how much easier it would be if you had some routines which automatically execute after each build and check that the environment is still sane and nothing broke on a fundamental level. This is called unit testing, and yes, unit tests won't catch all your bugs, but even getting 20% of bugs identified at an earlier stage is a huge boon to development speed.
Is it too late now to start adding unit tests? Kinda YES and NO at the same time. Unit testing works best when it covers the majority of a project's code. On the other hand, a journey of a thousand miles begins with a single step. If you decide to start refactoring your code, writing a unit test before refactoring will help you prove to yourself that you have not broken anything, without the need to run the game at all.
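As a toy illustration of that last point, here is what a guard-rail test for the earlier refactoring sketch could look like, using plain assert() (a real project would use a proper test framework); the expected values follow the invented penalty table above:
```c
#include <assert.h>

int reputation_penalty(unsigned witnessed); /* from the refactoring sketch */

int main(void)
{
    assert(reputation_penalty(0) == 0);                      /* nothing witnessed   */
    assert(reputation_penalty(1u << 0) == 5);                /* trespassing only    */
    assert(reputation_penalty((1u << 0) | (1u << 1)) == 15); /* trespassing + blood */
    return 0; /* run after every build: exit code 0 means the basics still work */
}
```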
  4. Static code analysis
This is pretty much self-explanatory. You set this thing up once and forget about it. A static code analyzer is another piece of «free real estate» for speeding up development: it finds tiny errors, mostly silly typos (think you are good at finding them? Well, good luck catching x << 4; in place of x <<= 4; buried deep in C code by eye!). Again, this is not a silver bullet; it is another tool that helps a little with debugging, alongside the debugger, unit tests and the rest. You need every little bit of help here.
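For the curious, that typo looks like this in context; any decent analyzer (or even a compiler warning such as clang's -Wunused-value) flags the first statement because its result is computed and silently thrown away:
```c
void scale_by_16(unsigned *x)
{
    *x << 4;  /* bug: shifts, discards the result, *x is unchanged */
    *x <<= 4; /* intended: shifts *x left by 4 in place            */
}
```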
Is it too late now to hook up static code analyzer? Obviously NO.
  5. Code architecture
Say you want to build Osana, but then you decide to implement some other feature, e.g. Snap Mode. By doing this you have maybe made your game a little bit better, but what you have essentially done is complicate your life, because now you also have to write Osana code for Snap Mode. The way the game architecture is done right now, easter-egg code is deeply interleaved with game logic, which leads to code «spaghettification», which in turn slows down the addition of new features, because one has to consider how each new feature would work alongside every old feature and easter egg. Even if it is just glancing over one line per easter egg, it adds to the mess, slowly but surely.
A lot of people mention that the developer should have been doing it the object-oriented way. However, there is no silver bullet in programming. It does not matter that much whether you do it the object-oriented way or the usual procedural way; you could theoretically write, say, AI routines in a functional language (e.g. LISP) or even a logical one if you are brave enough (e.g. Prolog). You could even invent your own tiny programming language! The only thing that matters is code quality and avoiding the so-called shotgun surgery situation, which plagues Yandere Simulator from top to bottom right now. Is there a way to add a new feature without interfering with your older code (say, by creating a child class which encapsulates all the things you need)? Go for it; this feature is basically «free» for you. Otherwise, you'd better think twice, because you are heading into «technical debt» territory, borrowing time from the future by saying «I'll maybe optimize it later» and «a thousand more lines probably won't slow me down that much, right?». Technical debt incurs interest of its own that you will have to pay. Basically, the entire situation around Osana right now is a tale of how the «interest» incurred by technical debt can come to control an entire project, like the tail wagging the dog.
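To make the «free feature» idea concrete, here is a tiny sketch (all names invented) where new behaviour plugs in through one small interface, so adding something like Snap Mode never requires edits scattered across the old student logic:
```c
#include <stdio.h>

typedef struct StudentBehaviour {
    const char *name;
    void (*update)(void);           /* called once per frame */
    struct StudentBehaviour *next;
} StudentBehaviour;

static StudentBehaviour *behaviours = NULL;

/* The old code below never changes again, no matter how many features appear. */
void register_behaviour(StudentBehaviour *b)
{
    b->next = behaviours;
    behaviours = b;
}

void update_students(void)
{
    for (StudentBehaviour *b = behaviours; b; b = b->next)
        b->update();
}

/* A new feature (say, an easter egg) lives entirely in its own unit. */
static void snap_mode_update(void) { puts("snap mode tick"); }
static StudentBehaviour snap_mode = { "SnapMode", snap_mode_update, NULL };

int main(void)
{
    register_behaviour(&snap_mode);
    update_students();
    return 0;
}
```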
I won't elaborate here further, since it'll take me an even larger post to fully describe what's wrong about Yandere Simulator's code architecture.
Is it too late to rebuild the code architecture? Sadly, YES, although it should be possible to split the Student class into descendants by using hooks for individual students. However, the architecture can still be improved by a vast margin if you start removing easter eggs and features like Snap Mode that currently bloat Yandere Simulator. I know it is going to be painful, but it is the only way to improve code quality here and now. This will simplify the code and make it easier to add the «real» features, like Osana or whatever else you'd like to accomplish. If you ever want the cut features back, you can track them down in the Git history and re-implement them one by one, hopefully without performing shotgun surgery this time.
  6. Loading times
Again, I won't talk about performance, since you can debug your game at 20 FPS as well as at 60 FPS; that is a very different story. Yandere Simulator is huge. Once you've fixed a bug, you want to test it, right? And your workflow right now probably looks like this:
  1. Fix the code (unavoidable time loss)
  2. Rebuild the project (can take a loooong time)
  3. Load your game (can take a loooong time)
  4. Test it (unavoidable time loss, unless another bug has popped up via unit testing, code analyzer etc.)
And you can fix that. For instance, I know that Yandere Simulator generates all the students' photos during loading. Why should that be done there? Why not move it to the project build stage by adding a build hook so Unity does it for you during a full project rebuild, or, even better, disable it completely or replace the photos with «PLACEHOLDER» text in debug builds? Each second spent watching the loading screen will be rightfully interpreted as «son is not coding» by the community.
Is it too late to reduce loading times? Hell NO.
  7. Jenkins
Or any other continuous integration tool. «Rebuild the project» can take a long time too, and what can we do about that? Let me give you an idea. Buy a new PC. Get a 32-core Threadripper, 32 GB of the fastest RAM you can afford and a motherboard which supports all of that (of course, a Ryzen/i5/Celeron/i386/Raspberry Pi is fine too, but the faster, the better). The rest doesn't matter much; e.g. a barely functional second-hand video card burned out by bitcoin mining is fine. You set up this second PC in your room. You connect it to your network. You set up a ramdisk to speed things up even more. You properly set up Jenkins on this PC. From then on, Jenkins takes care of the rest: tracking your Git repository, the (re)building process, large and time-consuming unit tests, invoking the static code analyzer, profiling, generating reports and whatever else you can and want to hook up. More importantly, you can fix another bug while Jenkins is rebuilding the project for the previous one, et cetera.
In general, continuous integration is a great technology for quickly tracking down errors that were introduced in previous versions, helping avoid those kinds of bug-hunting sessions. I am unsure whether continuous integration is needed for projects of 10,000-20,000 source lines, but things change as soon as we step into 100k+ territory, and Yandere Simulator is by now at approximately 150k+ source lines of code. Continuous integration is probably well worth it for Yandere Simulator.
Is it too late to add continuous integration? NO, albeit it is going to take some time and skills to set up.
  8. Stop caring about the criticism
Stop comparing Alex to Scott Cawthon. IMO Alex is much more similar to the person known as SgtMarkIV, the developer of Brutal Doom, who is also a notorious edgelord who, for example, also once told somebody to kill himself, just like… However, horrible person or not, SgtMarkIV does his job. He simply does not care much about public opinion. That's the difference.
  9. Go outside
Enough said. Your brain works slower if you only think about games and if you can't give it enough oxygen. I know that this one is probably the hardest to implement, but…
That's all, folks.
Bonus: just think how short this list would have been if someone had simply listened to Mike Zaimont instead of breaking down in tears.
submitted by Dezhitse to Osana

NEAR PROJECT REPORT

Author: Gamals Ahmed, CoinEx Business Ambassador

ABSTRACT

The network effects of the web have allowed a handful of companies to capture huge numbers of users, and these companies hoard user data to keep people from seeking alternatives. Likewise, these huge platforms have lured applications into building on their ecosystems, then either cut off access or actively opposed the applications' interests once they became successful. As a result, these walled gardens have effectively hindered innovation and monopolized large sections of the web. With the emergence of blockchain technology and decentralized cryptocurrencies, the need for applications that support decentralization has become clear, and several blockchain-based companies, applications and platforms have appeared. In this research report, we explain the approach adopted by the NEAR decentralization platform in designing and implementing the basic technology of its system. NEAR is a community-run platform for cloud computing and decentralized storage, designed to enable the open web of the future. On this web, everything from new currencies to new applications to new industries can be created, opening the door to an entirely new future.

1. INTRODUCTION

The richness of the web has grown day by day through the combined efforts of millions of people who benefited from “innovation without permission”: content and applications created without asking anyone. But as platforms locked data away, this lack of data freedom produced an environment hostile to the interests of its participants. And as explained in the abstract above, web-hosting companies have hindered innovation and largely monopolized the web.
In the future, we can fix this by using new technologies to re-enable the permissionless innovation of the past in a way that creates a more open web, where users are free and applications support rather than oppose their interests.
The decentralization movement emerged after the global financial crisis of 2008, which created fundamental problems of confidence in the heavily indebted banking system; the decentralized financial sector based on blockchain technology has been growing since 2009.
Decentralized blockchain technology has made it easy for decentralized digital currencies like Bitcoin to move billions of dollars in peer-to-peer transfers at a fraction of the cost of the traditional banking system. It allows participants in the over $50 billion virtual-goods economy to track, own and trade these goods without permission. And it allows real-world goods to cross into the digital domain, with verified ownership and tracking just like their digital counterparts.
An internet where freedom of data enables innovation will, by default, lead to a new form of software development. On this web, developers can quickly create applications from open-state components and fund their efforts with new business models enabled from within the platform itself, rather than relying on parasitic relationships with their users. This not only accelerates the creation of applications that have a more honest and cooperative relationship with their users, but also allows entirely new businesses to be built on top of them.
To enable these new applications and the open web, the appropriate infrastructure is needed. The new web platform cannot be controlled by a single entity, nor limited by insufficient scalability. It should be decentralized in design, like the web itself, and supported by a widely distributed community, so that the value it stores cannot be monitored, modified or removed without permission from the users who store that value.
The cost of storing data or performing a computation on the Ethereum blockchain is thousands to millions of times higher than the cost of the same functionality on Amazon Web Services; a developer can always create a “centralized” app, or even a centralized currency, for a fraction of the cost of doing the same on a decentralized platform, because a decentralized platform, by definition, replicates its operations and storage many times over.
Bitcoin can be thought of as the first, very basic, version of this global community-run cloud, though it is primarily used only to store and move the Bitcoin digital currency.
Ethereum is the second and slightly more sophisticated version, which expanded the basic principles of Bitcoin to create a more general computing and storage platform, though it is a raw technology, which hasn’t achieved meaningful mainstream adoption.

1.1 WHY IS IT IMPORTANT TO PAY THE EXTRA COST TO SUPPORT DECENTRALIZATION?

Because some elements of value, for example bits representing digital-currency ownership, personal identity, or asset notes, are very sensitive. In a centralized system, the following players can change any balances they come into direct contact with:
  1. The developer who controls the release or update of the application’s code
  2. The platform where the data is stored
  3. The servers which run the application’s code
Even if none of these players intends to act in bad faith, governments, police forces and hackers can easily force their hands, censoring, modifying or stealing the balances they are supposed to protect.
A typical user will trust a typical centralized application, despite its potential vulnerabilities, with everyday data and computation. Typically, only banks and governments are trusted sufficiently to maintain custody of the most sensitive information — balances of wealth and identity. But these entities are also subject to the very human forces of hubris, corruption and theft.
This is especially true after the 2008 global financial crisis, which demonstrated the fundamental problems of confidence in a highly indebted banking system, and after governments around the world imposed significant capital controls on citizens during times of crisis. After such examples, it has become a truism that hackers now hold most or all of your sensitive data.
These decentralized applications operate on a more complex infrastructure than today’s web but they have access to an instantaneous and global pool of currency, value and information that today’s web, where data is stored in the silos of individual corporations, cannot provide.

1.2 THE CHALLENGES OF CREATING A DECENTRALIZED CLOUD

A community-run system like this faces very different challenges from centralized “cloud” infrastructure, which is run by a single entity or a group of known entities. For example:
  1. It must be both inclusive to anyone and secure from manipulation or capture.
  2. Participants must be fairly compensated for their work while avoiding creating incentives for negligent or malicious behavior.
  3. It must be both game theoretically secure so good actors find the right equilibrium and resistant to manipulation so bad actors are actively prevented from negatively affecting the system.

2. NEAR

NEAR is a global community-run computing and storage cloud which is organized to be permissionless and which is economically incentivized to create a strong and decentralized data layer for the new web.
Essentially, it is a platform for running applications which have access to a shared — and secure — pool of money, identity and data which is owned by their users. More technically, it combines the features of partition-resistant networking, serverless compute and distributed storage into a new kind of platform.
NEAR is a community-managed, decentralized cloud storage and computing platform, built on the same core blockchain technology that underlies Bitcoin and designed to enable the open web of the future. On this web, everything from new currencies to new applications to new industries can be created, opening the door to a brand-new future.
NEAR is a scalable computing and storage platform with the potential to change how systems are designed, how applications are built and how the web itself works.
It is a complex technology that nonetheless allows developers and entrepreneurs to easily and sustainably build applications which reap the benefits of decentralization and participate in the Open Web, while minimizing the associated costs for end users.
NEAR creates the only community-managed cloud that is strong enough to power the future of the open web, as NEAR is designed from the ground up to deliver intuitive experiences to
end users, expand capacity across millions of devices, and provide developers with new and sustainable business models for their applications.
The NEAR Platform uses a token — also called “NEAR”. This token allows the users of these cloud resources, regardless of where they are in the world, to fairly compensate the providers of the services and to ensure that these participants operate in good faith.

2.1 WHY NEAR?

Platforms based on blockchain technologies like Bitcoin and Ethereum have made great progress, enriching the world with thousands of innovative applications spanning games to decentralized finance.
However, neither these original networks nor any of the networks that followed have been able to bridge the gap to mainstream adoption of the applications built on top of them, and none provides the kind of standard that can fully support the web.
This is a result of two key factors:
  1. System design
  2. Organization design
System design is relevant because the technical architecture of other platforms creates substantial problems with both usability and scalability, which have made adoption nearly impossible for any but the most technical innovators. End users experience 97-99% drop-off rates when using applications, and developers find the process of creating and maintaining their applications endlessly frustrating.
Fixing these problems requires substantial and complex changes to current protocol architectures, something which existing organizations haven’t proven capable of implementing. Instead, they create multi-year backlogs of specification design and implementation, which result in their technology falling further and further behind.
NEAR’s platform and organization are architected specifically to solve the above-mentioned problems. The technical design is fanatically focused on creating the world’s most usable and scalable decentralized platform so global-scale applications can achieve real adoption. The organization and governance structure are designed to rapidly ship and continuously evolve the protocol so it will never become obsolete.

2.1.1 Features, which address these problems:

1. USABILITY FIRST
The most important problem to address is how to let developers create useful applications that users can use easily and that capture sustainable value for those developers.
2. End-User Usability
Developers will only build applications which their end users can actually use. NEAR's “progressive security” model allows developers to create experiences that more closely resemble familiar web experiences: onboarding can be delayed, users don't need to learn “blockchain” concepts up front, and the number of permission-asking interactions required to use an application is kept to a minimum.
1. Simple Onboarding: NEAR allows developers to take actions on behalf of their users, so they can onboard users without requiring them to set up a wallet or interact with tokens immediately upon reaching an application. Because accounts keep track of application-specific keys, user accounts can also provide the kind of “Single Sign-On” (SSO) functionality users are familiar with from the traditional web (e.g. “Log in with Facebook/Google/GitHub”).
2. Easy Subscriptions: Contract-based accounts allow for easy creation of subscriptions and custom permissioning for particular applications.
3. Familiar Usage Styles: The NEAR economic model allows developers to pay for usage on behalf of their users in order to hide the costs of infrastructure in a way that is in line with familiar web usage paradigms.
4. Predictable Pricing: NEAR prices transactions on the platform in simple terms, which allow end-users to experience predictable pricing and less cognitive load when using the platform.

2.1.2 Design principles and development NEAR’s platform

1. Usability: Applications deployed to the platform should be seamless to use for end users and seamless to create for developers. Wherever possible, the underlying technology itself should fade to the background or be hidden completely from end users. Wherever possible, developers should use familiar languages and patterns during the development process. Basic applications should be intuitive and simple to create while applications that are more robust should still be secure.
2. Scalability: The platform should scale with no upper limit as long as there is economic justification for doing so in order to support enterprise-grade, globally used applications.
3. Sustainable Decentralization: The platform should encourage significant decentralization in both the short term and the long term in order to properly secure the value it hosts. The platform — and community — should be widely and permissionlessly inclusive and actively encourage decentralization and participation. To maintain sustainability, both technological and community governance mechanisms should allow for practical iteration while avoiding capture by any single parties in the end.
4. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose. Optimize for simplicity, pragmatism and ease of understanding above theoretical perfection.

2.2 HOW NEAR WORKS?

NEAR’s platform provides a community-operated cloud infrastructure for deploying and running decentralized applications. It combines the features of a decentralized database with those of a serverless compute platform. The token which powers the platform also enables the applications built on top of it to interact with each other in new ways. Together, these features allow developers to create censorship-resistant back-ends for applications that deal with high-stakes data like money, identity, assets, and open-state components, all of which interact seamlessly with each other. These application back-ends and components are called “smart contracts”, though we will often refer to them all simply as “applications” here.
The infrastructure, which makes up this cloud, is created from a potentially infinite number of “nodes” run by individuals around the world who offer portions of their CPU and hard drive space — whether on their laptops or more professionally deployed servers. Developers write smart contracts and deploy them to this cloud as if they were deploying to a single server, which is a process that feels very similar to how applications are deployed to existing centralized clouds.
Once the developer has deployed an application, called a “smart contract”, and marked it unchangeable (“immutable”), the application will now run for as long as at least a handful of members of the NEAR community continue to exist. When end users interact with that deployed application, they will generally do so through a familiar web or mobile interface just like any one of a million apps today.
In the centralized clouds hosted by companies like Amazon or Google, developers pay for their apps every month based on the amount of usage, for example based on the number of requests created by users visiting their web pages. The NEAR platform similarly requires that either users or developers compensate the community operators of this infrastructure for their usage. Like today's cloud infrastructure, NEAR prices usage based on easy-to-understand metrics that aren't heavily influenced by factors like system congestion; such factors make life very complicated for developers on alternative blockchain-based systems today.
In the centralized cloud, the controlling corporation makes decisions unilaterally. NEAR's community-run cloud is decentralized, so updates must ultimately be accepted by a sufficient quorum of the network participants. Proposals for its future come from the community and are subject to an inclusive governance process which balances efficiency and security.
In order to ensure that the operators of nodes, who are anonymous and potentially even malicious, run the code honestly, they participate in a staking process called “Proof of Stake”. In this process, they willingly put a portion of value at risk as a sort of deposit, which they forfeit if it is proven that they have operated improperly.

2.2.1 Elements of the NEAR’s Platform

The NEAR platform is made up of many separate elements. Some of these are native to the platform itself while others are used in conjunction with or on top of it.
1. THE NEAR TOKEN
NEAR token is the fundamental native asset of the NEAR ecosystem and its functionality is enabled for all accounts. Each token is a unique digital asset similar to Ether, which can be used to:
a) Pay the system for processing transactions and storing data.
b) Run a validating node as part of the network by participating in the staking process.
c) Help determine how network resources are allocated and where its future technical direction will go by participating in governance processes.
The NEAR token enables the economic coordination of all participants who operate the network plus it enables new behaviors among the applications which are built on top of that network.
2. OTHER DIGITAL ASSETS
The platform is designed to easily store unique digital assets, which may include, but aren’t limited to:
  • Other Tokens: Tokens bridged from other chains (“wrapped”) or created atop the NEAR Platform can be easily stored and moved using the underlying platform. This allows many kinds of tokens to be used atop the platform to pay for goods and services. “Stablecoins,” specific kinds of token which are designed to match the price of another asset (like the US Dollar), are particularly useful for transacting on the network in this way.
  • Unique Digital Assets: Similar to tokens, unique digital assets (sometimes called “Non-Fungible Tokens”, or NFTs), ranging from in-game collectibles to representations of real-world asset ownership, can be stored and moved using the platform.
3. THE NEAR PLATFORM
The core platform, which is made up of the cloud of community-operated nodes, is the most basic piece of infrastructure provided. Developers can permissionlessly deploy smart contracts to this cloud and users can permissionlessly use the applications they power. Applications, which could range from consumer-facing games to digital currencies, can store their state (data) securely on the platform. This is conceptually similar to the Ethereum platform.
Operations that require an account, network usage, or storage on the platform require payment in the form of transaction fees, which the platform then distributes to its community. These operations include creating new accounts, deploying new contracts, executing contract code, and storing or modifying data in contracts.
As long as the rules of the protocol are followed, any independent developer can write software, which interfaces with it (for example, by submitting transactions, creating accounts or even running a new node client) without asking for anyone’s permission first.
4. THE NEAR DEVELOPMENT SUITE
Set of tools and reference implementations created to facilitate its use by those developers and end users who prefer them. These tools include:
  • NEAR SDKs: The NEAR platform supports two languages for writing smart contracts, Rust and AssemblyScript. To provide a great developer experience, NEAR has a full SDK which includes standard data structures, examples and testing tools for both languages.
  • Gitpod for NEAR: NEAR uses the existing Gitpod technology to create a zero-setup onboarding experience for developers. Gitpod provides an online “Integrated Development Environment” (IDE) which NEAR has customized to let developers easily write, test and deploy smart contracts from a web browser.
  • NEAR Wallet: A wallet is a basic place for developers and end users to store the assets they need to use the network. NEAR Wallet is a reference implementation that is intended to work seamlessly with the progressive security model that lets application developers design more effective user experiences. It will eventually include built-in functionality to easily enable participation by holders in staking and governance processes on the network.
  • NEAR Explorer: To aid with both debugging of contracts and the understanding of network performance, Explorer presents information from the blockchain in an easily digestible web-based format.
  • NEAR Command Line Tools: The NEAR team provides a set of straightforward command line tools to allow developers to easily create, test and deploy applications from their local environments.
All of these tools are being created in an open-source manner so they can be modified or deployed by anyone.

3. ECONOMIC

The ecosystem that makes up the NEAR platform is driven primarily by economic forces. This economy creates the incentives which allow participants to permissionlessly organize around the platform's key functions, while creating strong disincentives for undesirable, irresponsible or malicious behavior. For the platform to be effective, these incentives need to exist both in the short term and in the long term.
The NEAR platform is a market among participants interested in two aspects:
  • On the supply side, validating node operators and other core infrastructure providers must be motivated to provide the services that make up the community cloud.
  • On the demand side, the platform's developers and end users who pay for their usage need to be able to do so in a way that is simple, clear and consistent.
Further, economic forces can also be applied to support the ecosystem as a whole. They can be used at a micro level to create new business models by directly compensating the developers who create its most useful applications. They can also be used at a macro level by coordinating the efforts of a broader set of ecosystem participants who participate in everything from education to governance.

3.1 NEAR ECONOMY DESIGN PRINCIPLES

NEAR’s overall system design principles are used to inform its economic design according to the following interpretations:
1. Usability: End users and developers should have predictable and consistent pricing for their usage of the network. Users should never lose data forever.
2. Scalability: The platform should scale at economically justified thresholds.
3. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose.
4. Sustainable Decentralization: The barrier for participation in the platform as a validating node should be set as low as possible in order to bring a wide range of participants. Over time, their participation should not drive wealth and control into the hands of a small number. Individual transactions made far in the future must be at least as secure as those made today in order to safeguard the value they modify.

3.2 ECONOMIC OVERVIEW

The NEAR economy is optimized to provide developers and end users with the easiest possible experience while still providing proper incentives for network security and ecosystem development.
Summary of the key ideas that drive the system:
  • Thresholded Proof of Stake: Validating node operators provide scarce and valuable compute resources to the network. In order to ensure that the computations they run are correct, they are required to “stake” NEAR tokens, which guarantee their results. If these results are found to be inaccurate, the staker loses their tokens. This is a fundamental mechanism for securing the network. The threshold for participating in the system is set algorithmically at the lowest level possible to allow for the broadest possible participation of validating nodes in a given “epoch” period (½ of a day).
  • Epoch Rewards: Node operators are paid for their service a fixed percentage of the total supply, a “security” fee of roughly 4.5% annualized. This rate targets sufficient participation levels among stakers to secure the network while balancing against other uses of the NEAR token in the ecosystem.
  • Protocol treasury: In addition to validators, the protocol treasury receives 0.5% of the total supply annually to continuously reinvest in ecosystem development.
  • Transaction Costs: Usage of the network consumes two separate kinds of resources — instantaneous and long term. Instantaneous costs are generated by every transaction because each transaction requires the usage of both the network itself and some of its computation resources. These are priced together as a mostly-predictable cost per transaction, which is paid in NEAR tokens.
  • Storage Costs: Storage is a long-term cost because storing data represents an ongoing burden to the nodes of the network. Storage costs are covered by maintaining a minimum balance of NEAR tokens on the account or contract. This provides an indirect mechanism of payment, via inflation, to validators for maintaining contract and account state on their nodes.
  • Inflation: Inflation is determined as the combination of payouts to validators and the protocol treasury, minus collected transaction fees and a few other NEAR-burning mechanisms (like the name auction). Overall the maximum inflation is 5%, and it can go down over time as the network gets more usage and more transaction fees are burned. Inflation can even become negative (total supply decreases) if enough fees are burned; see the sketch after this list.
  • Scaling Thresholds: In a network which scales its capacity relative to the amount of usage it receives, the thresholds that drive the network to bring on additional capacity are economic in nature.
  • Security Thresholds: Some thresholds, which provide for good behavior among participants are set using economic incentives. For example, “Fishermen” (described separately).
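To make the inflation arithmetic above concrete, here is a back-of-the-envelope sketch in C; the supply and burned-fee figures are assumptions for illustration, not actual network data:
```c
#include <stdio.h>

int main(void)
{
    double total_supply  = 1e9;                  /* assumed supply, in NEAR      */
    double epoch_rewards = 0.045 * total_supply; /* ~4.5%/yr paid to validators  */
    double treasury      = 0.005 * total_supply; /* 0.5%/yr to protocol treasury */
    double fees_burned   = 20e6;                 /* assumed fees burned per year */

    /* Net inflation = payouts minus burned fees; with enough fees burned
       this can go negative, i.e. the total supply shrinks. */
    double net = epoch_rewards + treasury - fees_burned;
    printf("net annual inflation: %.2f%%\n", 100.0 * net / total_supply); /* 3.00% */
    return 0;
}
```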
Full Report
submitted by CoinEx_Institution to Coinex

Polkadot — An Early In-Depth Analysis — Part One — Overview and Benefits

Having recently researched Polkadot, as with other projects I wanted to document what I had learnt, so that others may potentially find it useful. Hopefully providing a balanced view, it will consist of three articles, outlined below.
Part One — Polkadot Overview and Benefits (This article)
Part Two — In-Depth look at the Consensus
Part Three — Limitations and Issues
I will provide links throughout, referencing the relevant sections, and include a list of sources at the bottom of the article for further reading.

Overview

Frustrated with the slow development of Ethereum 2.0, Dr. Gavin Wood, co-founder of Ethereum and inventor of Solidity, left to begin work on Polkadot: a next-generation scalable blockchain protocol that connects multiple specialised blockchains into one unified network. It achieves scalability through a sharded infrastructure in which multiple blockchains, called parachains, run in parallel and connect to a central chain called the Relay Chain.
Whilst it shares some similarities with Ethereum 2.0, one key differentiator is that it uses heterogeneous sharding: each parachain can be customised through the Substrate development framework and optimised for a specific use case, rather than every shard being the same. This is important because, when it comes to blockchain architecture, one size does not fit all: every blockchain makes trade-offs to support different features and use cases.
All parachains connect to the relay chain, which validates the state transition of connected parachains, providing shared state across the entire ecosystem. If the Relay Chain must revert for any reason, then all of the parachains would also revert. This is to ensure that the validity of the entire system can persist, and no individual part is corruptible. The shared state makes it so that the trust assumptions when using parachains are only those of the Relay Chain validator set, and no other. Since the validator set on the Relay Chain is expected to be secure with a large amount of stake put up to back it, it is desirable for parachains to benefit from this security.
This enables seamless interoperability between all parachains and parathreads using the Cross-chain Message Passing (XCMP) protocol, allowing arbitrary data — not just tokens — to be transferred across blockchains. Interoperability is also possible to other ecosystems through bridges, which are specifically designed parachains or parathreads that are custom made to interact with another ecosystem such as Ethereum, Bitcoin and Cosmos for example, enabling interoperability. Because these other ecosystems don’t use the same shared state of Polkadot, finality is incredibly important, because whilst the relay chain can roll back all the parachains, it can’t roll back the Ethereum or Bitcoin blockchains for example. This is discussed further in part three.
The Relay Chain is responsible for the network's shared security, consensus and cross-chain interoperability. It is secured by Validators and Nominators staking the native DOT tokens. Ultimately, scalability for the ecosystem is determined by how scalable the Relay Chain can be. The number of parachains is determined by the number of validators on the Relay Chain. The hope is to reach 1,000 validators, which would enable around 100 parachains, with each parachain capable of around 1,000 transactions per second.
Nominators stake their DOT tokens with validators they trust, with the validators likely charging a small commission to cover running costs. If a validator is found to have misbehaved, a percentage of not only the validator's own stake but also the nominators' stake will be slashed, depending upon the severity. For Level 4 security threats, such as collusion or including an invalid block, 100% of the stake will be slashed. What's really important to understand is that both the validator's own stake and the nominated stake get slashed, so you could lose all the DOT you have staked with a validator if it acts maliciously. Therefore, it's very important not to simply chase the highest rewards while staying oblivious to the risk: not only can you lose all your DOT, you also make the entire system less secure (addressed in part three). There have already been several minor slashing incidents, so this is something to really consider.

Auction for Parachain Slots

Due to the limited number of parachain slots available, there needs to be a method to decide who gets a slot. This is achieved through a candle auction, where participants bid with DOT to secure a lease on a parachain slot for a 6-24 month period, with the highest bidders winning. The DOT isn't spent, but rather locked for the duration of the lease, unable to participate in staking and earn rewards. If the project is unsuccessful in securing a further slot, the lease expires and the DOT is returned.
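As a toy illustration of the candle-auction mechanism, where the effective closing moment is determined retroactively so that last-second sniping doesn't pay, here is a sketch; the bidders, amounts and the random choice of ending block are all invented:
```c
#include <stdio.h>
#include <stdlib.h>

typedef struct { const char *bidder; double dot; int block; } Bid;

int main(void)
{
    /* Highest bid standing at various blocks of the ending period. */
    Bid bids[] = {
        { "TeamA", 100000, 1 },
        { "TeamB", 120000, 2 },
        { "TeamC", 500000, 5 },  /* a last-moment snipe */
    };
    int n = sizeof bids / sizeof bids[0];

    /* The "candle blows out" at a retroactively chosen block; bids placed
       after that moment simply don't count. */
    int end_block = rand() % 5 + 1;

    Bid *winner = NULL;
    for (int i = 0; i < n; i++)
        if (bids[i].block <= end_block && (!winner || bids[i].dot > winner->dot))
            winner = &bids[i];

    if (winner)
        printf("candle ended at block %d; winner: %s, %.0f DOT locked\n",
               end_block, winner->bidder, winner->dot);
    return 0;
}
```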
Of the 100 parachain slots that they hope to accommodate, between 10 and 30 will be reserved for system parachains, with the remainder available as auction slots or used for parathreads. Whilst the DOT is returned, the limited number of slots means significant amounts of DOT may need to be acquired to secure a slot. How the auction mechanics affect the price of DOT also remains to be seen: potentially a rise from the start of an auction, followed by a fall before the lease ends and the DOT is returned. The plan is to continuously run a small number of parachain auctions throughout the year, to minimise any unwanted effects. How comfortable developers will be locking significant amounts of a highly volatile asset for an extended period also remains to be seen. They could end up in a position where they can no longer afford to keep their lease and have to downgrade to a parathread (providing the application still functions with the reduced performance) or migrate to another platform. See this article for more details on the auction mechanism.

Parathreads

For applications that don't require the guaranteed performance of a parachain, or don't want to pay the large fees to secure a parachain slot, parathreads can be used instead. Parathreads pay a fixed registration fee, realistically much lower than the cost of acquiring a parachain slot, and compete with other parathreads in a per-block auction to have their transactions included in the next Relay Chain block. A portion of the parachain slots on the Relay Chain will be designated as part of the parathread pool.
In the event that a parachain loses its slot, it can transition to a parathread (assuming the application can still function with the reduced and variable performance of sharing a slot among many). This also enables small projects to start out as a parathread and upgrade to a parachain slot when required.

Token

DOT is the native token of the Polkadot network and serves three key functions: (i) it is staked to provide security for the Relay Chain, (ii) it is bonded to connect a chain to Polkadot as a parachain, and (iii) it is used for governance of the network. There is an initial total supply of 1 billion DOT, with yearly inflation estimated at around 10% provided the optimal 50% staking rate is achieved, resulting in rewards of 20% to those who stake (net 10% when taking inflation into account). Those who don't stake lose 10% through dilution. Should the amount staked exceed the optimal 50%, reward rates and inflation decrease to make staking less attractive; likewise, if it falls below 50%, rewards and inflation rise to encourage staking. Staking isn't risk-free, though, as mentioned before.
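A quick sketch of that reward arithmetic under the quoted figures (10% inflation at the optimal 50% staking rate); the code is purely illustrative:
```c
#include <stdio.h>

int main(void)
{
    double inflation    = 0.10; /* annual inflation at the optimal staking rate */
    double staked_share = 0.50; /* fraction of total supply staked              */

    /* New issuance goes to stakers, so their gross yield is
       inflation / staked_share = 20%. */
    double gross_yield     = inflation / staked_share;
    double real_yield      = gross_yield - inflation; /* ~10% net               */
    double non_staker_loss = -inflation;              /* ~10% dilution per year */

    printf("gross %.0f%%, real %.0f%%, non-staker %.0f%%\n",
           100 * gross_yield, 100 * real_yield, 100 * non_staker_loss);
    return 0;
}
```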

Governance

Polkadot employs an on-chain governance model: in order to make any changes to the network, DOT holders vote on a proposal to upgrade the network with the help of the Council. The Council is an entity comprising 23 seats, each represented by an on-chain account. Its goals are to represent passive stakeholders, submit sensible and important proposals, and cancel dangerous or malicious proposals. All DOT holders are free to register their candidacy for the Council, and free to vote for any number of candidates, with voting power proportional to their stake.
Any stakeholder can submit a public proposal by depositing a fixed minimum amount of DOTs, which stays locked for a certain period. If someone agrees with the proposal, they may deposit the same amount of tokens to endorse it. Public proposals are stored in a priority queue, and at regular intervals the proposal with the most endorsements gets tabled for a referendum. The locked tokens are released once the proposal is tabled. Council proposals are submitted by the Council, and are stored in a separate priority queue where the priorities are set at the Council’s discretion.
Every thirty days, a new proposal will be tabled, and a referendum will come up for a vote. The proposal to be tabled is the top proposal from either the public-proposal queue or the Council-proposal queue, alternating between the two queues.
The Technical Committee is composed according to a single vote for each team that has successfully and independently implemented or formally specified the protocol in Polkadot, or in its canary network Kusama. The Technical Committee is the last line of defence for the system. Its sole purpose is detecting present or imminent issues in the system such as bugs in the code or security vulnerabilities, and proposing and fast-tracking emergency referenda.

Ecosystem

Whilst parachains aren't implemented at this stage, there is a rapidly growing ecosystem looking to build on Polkadot with Substrate. Polkadot's “cousin”, the canary network Kusama, used for experimentation, was launched last year by the same team and contributes to the early growth of the overall ecosystem. See here for a list of the current projects looking to build on Polkadot; filter by Substrate-based.
Now that we have covered the basics, in part two I will explain how the consensus mechanism in Polkadot works and covering more of the technical aspects.
submitted by xSeq22x to CryptoCurrency

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through hundreds of them in the last three years. I got introduced to blockchain via Bitcoin, of course, analyzed Ethereum thereafter, and from that moment I've had a keen interest in smart contract platforms. I'm passionate about Ethereum, but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why, of all the coins I went through, I'm most bullish on Zilliqa (yes, I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano, etc.). Note that this is not investment advice, and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ (silicon dioxide), as in “silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I've tried to make the technology part readable for a broad audience. If you've ever tried understanding the inner workings of Bitcoin and Ethereum, you should be able to grasp most of it. Otherwise, just skim through, and once you're zoning out, head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let's take a step back in time first, as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains, of which Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are Bitcoin-NG, collective signing (CoSi), ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017, and since then the team has achieved everything stated in it, and has also created its own open-source intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, along with 2,400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion, and currently only mining rewards remain to be issued. The maximum supply is 21 billion, with annual inflation currently at 7.13% and set only to decrease with time.
 
Zilliqa realized early on that usage of public cryptocurrencies and smart contracts was increasing, but decentralized, secure and scalable alternatives were lacking in the crypto space. They proposed applying sharding to a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms, and Zilliqa uses network, transaction and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We'll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this technologically, it's good to keep in mind that making a blockchain simultaneously decentralized, secure and scalable is still one of the main hurdles to widespread usage of decentralized networks. In my opinion this needs to be solved before blockchains can get to the point where they create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. After all, these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: the key players on Zilliqa are users and miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is referred to as ‘network sharding’. Miners are subsequently assigned at random to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions, and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how the DS shard reaches consensus (via pBFT) later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS block spans 100 Tx blocks. And as previously mentioned, there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Whether a machine becomes a shard node or a DS node is determined by the result of a PoW cycle (Ethash) at the beginning of each DS block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are allowed onto the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s, or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s (a quick sanity check of these numbers follows below). Each DS block, 10 new DS nodes are admitted. A shard node currently needs to provide around 8.53 GH/s (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of (Ethash) hashing power available you could mine solo.
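As promised, the sanity check. This is just arithmetic on the figures quoted above (35.4 Mh/s per GTX 1070 is the number used in this post):

```python
# Back-of-the-envelope check of the PoW difficulty figures quoted above.
GTX_1070_MHS = 35.4                  # Mh/s per card, as quoted above

ds_node_mhs = 2 * 1_000_000          # ~2 Th/s expressed in Mh/s
shard_node_mhs = 8.53 * 1_000        # ~8.53 GH/s expressed in Mh/s

print(round(ds_node_mhs / GTX_1070_MHS))     # ~56497 GPUs -> "55 thousand+"
print(round(shard_node_mhs / GTX_1070_MHS))  # ~241 GPUs  -> "around 240"
```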
 
The 60-second PoW cycle is a burst of peak performance that acts as an entry ticket to the network. Such an entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between those 1.5 hours, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole, we first must understand why Zilliqa goes through all of the above technicalities and what a blockchain is on a more fundamental level. Because the core of Zilliqa’s consensus protocol relies on pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and how they function. Open Viewblock, a Zilliqa block explorer, in another tab and come back to this article; we will use that site to walk through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain, meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc., and that no central authority fully controls the network. Zilliqa and other public, distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Paraphrasing the examples and definitions given in Samuel Brooks’ Medium article, a blockchain (like Zilliqa) can be described as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that “blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole Medium article to get a better grasp of the definitions and of state machines in general. Nevertheless, let’s try to compress it into a single paragraph. Take a traffic light as an example: all its states (red, amber and green) are predefined, all possible outcomes are known, and it doesn’t matter whether you encounter the traffic light today or tomorrow; it will behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, sending one light’s state from green to red (via amber) and another light’s from red to green.
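If the state machine idea still feels abstract, here is the traffic light written out as a minimal sketch. The event names are invented; the point is that every state and every transition is known in advance:

```python
# A traffic light as a finite state machine: a fixed set of states and
# a fixed table of valid transitions. Nothing else can ever happen.
TRANSITIONS = {
    ("green", "button"): "amber",
    ("amber", "timer"): "red",
    ("red", "timer"): "green",
}

def step(state, event):
    # Unknown (state, event) pairs are invalid transitions: state is unchanged.
    return TRANSITIONS.get((state, event), state)

state = "green"
for event in ("button", "timer", "timer"):
    state = step(state, event)
    print(event, "->", state)   # button -> amber, timer -> red, timer -> green
```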
 
With public blockchains like Zilliqa, this isn’t so straightforward and simple. The chain started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions that everyone can verify, from block #1 to the current #647,000+ block. The state is ever changing, and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that is rather insignificant compared to a public blockchain, where 2,400 nodes need to work together to reach consensus on the latest valid state while some of those nodes have latency or broadcast issues, drop offline, or deliberately try to attack the network.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, block height and DS height, and then hit refresh. As expected, you see newly incremented values for some or all of those parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, so no GPU is involved (only CPU), and the total energy consumed to keep the blockchain secure, decentralized and scalable stays low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (an offline node counts as Byzantine, i.e. dishonest) while the consensus protocol keeps functioning without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network stalls and a view change is triggered to elect a new DS leader. Only when more than ⅔ of the nodes (over 66%) are dishonest do double-spend attacks become possible.
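Those thresholds follow from the classic pBFT bound: a network of n nodes needs n ≥ 3f + 1, so it tolerates at most f = (n − 1) / 3 (rounded down) Byzantine nodes, and committing a block needs agreement from strictly more than ⅔ of them. A minimal sketch, using the shard sizes from this post:

```python
# Classic pBFT bound: n >= 3f + 1, so f = (n - 1) // 3 faulty nodes are
# tolerable, and a commit quorum needs more than two-thirds of the nodes.

def max_byzantine(n):
    return (n - 1) // 3

def commit_quorum(n):
    return (2 * n) // 3 + 1

for n in (600, 2400):
    print(f"n={n}: tolerates {max_byzantine(n)} faulty, quorum {commit_quorum(n)}")
# n=600:  tolerates 199 faulty, quorum 401
# n=2400: tolerates 799 faulty, quorum 1601
```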
 
If the network stalls, no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet had just launched and was in its early phases, view changes happened regularly. As of today, the last stalling of the network (and view change being triggered) was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides: once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although this is lengthy already, we have only skimmed some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism and pBFT; another thing we haven’t looked at yet is the degree of decentralization.
 
Decentralisation
 
Currently, there are four shards, each consisting of 600 nodes: one shard of 600 so-called DS nodes (Directory Service; they need to achieve a higher difficulty than shard nodes) and 1,800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has steadily declined from 1,200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT; there is no data on where the PoW hashing power comes from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade will need to be executed to lift the current cap of 2,400 nodes, allowing more nodes and the formation of more shards so the network can keep scaling with demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward them to the lookup nodes (another type of node) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain, which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were at first operated only by Zilliqa themselves, exchanges and Viewblock, and operators of seed nodes such as exchanges had no incentive to open them to the greater public; they were centralised at first. Decentralisation at the seed-node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased, they are public-facing, and at the same time PoS is applied to incentivize seed-node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved in consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards have been assigned to seed nodes from the beginning in 2019, and those are used to pay out ZIL stakers. The 5% block rewards, at an annual yield of 10.03%, translate to roughly 610 MM ZIL in total that can be staked. Exchanges use the custodial variant of staking, while wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralised too, Zilliqa qualifies for the label ‘decentralized’ in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited, so I’m taking the ELI5 route (maybe ELI12). But if you are familiar with JavaScript, Solidity or specifically OCaml, please head straight to ‘Scilla - read the docs’ to get a good initial grasp of how Zilliqa’s smart contract language Scilla works, and if you ask yourself “why another programming language?”, check this article. If you want to play around with some sample contracts in an IDE, click here. The faucet can be found here. And more information on architecture, dapp development and the API can be found on the Developer Portal.
If you are more into listening and watching, check this recent webinar explaining Zilliqa and Scilla. The link is time-stamped, so you’ll start right away with a platform introduction and the 2020 roadmap, followed by a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: “all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml; in the words of the OCaml project: “OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.”
 
Scilla is blockchain agnostic (it can be implemented on other blockchains as well), is recognized by academics, and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain means you cannot afford to make mistakes; otherwise, it could cost you dearly. It’s all great and fun that blockchains are immutable, but updating your code because you found a bug isn’t the same as with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you execute on a smart contract platform like Zilliqa or Ethereum costs gas; with gas you basically pay for computational costs. Sending ZIL from address A to address B currently costs 0.001 ZIL. Smart contracts are more complex, often involve various functions, and require more gas (if gas is a new concept, click here).
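The general fee model is the same across gas-based chains: you pay for the gas a transaction consumes, at a gas price you attach. A minimal sketch; the numbers below are illustrative placeholders chosen so a plain transfer lands on the 0.001 ZIL quoted above, not Zilliqa’s actual gas schedule:

```python
# Generic gas model: fee = gas_used * gas_price.
# Both numbers below are hypothetical, picked to match the 0.001 ZIL example.

def tx_fee(gas_used, gas_price_zil):
    return gas_used * gas_price_zil

GAS_PRICE = 0.00002                    # hypothetical ZIL per unit of gas
print(tx_fee(50, GAS_PRICE))           # plain transfer: 0.001 ZIL
print(tx_fee(5_000, GAS_PRICE))        # heavier contract call: 0.1 ZIL
```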
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (for example, someone other than you setting their address as the owner of the smart contract after creation), or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: “In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.”
 
And quoting the Scilla documentation: “Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and its accompanying tooling, developers can mathematically prove that the smart contract they’ve written does what they intend it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and the effect of state sharding). This is a complex topic, and I’m not able to explain it any easier than what is posted here, but I will try to compress that post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding; this is where the linear scalability comes from. We can define three categories of transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. For Category 1 transactions that is doable, and for Category 2 transactions it sometimes works, if the address is in the same shard as the smart contract, but for Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol must follow in order to process all transactions in a generalised fashion; the sketch below illustrates the routing problem.
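Here is that routing problem as a minimal sketch. The assignment rule (deriving a shard from the sender’s address) is a simplification of how Zilliqa’s sharding is usually described, and the category logic just mirrors the three cases above:

```python
NUM_SHARDS = 4

def shard_for(address: str) -> int:
    # Simplified, assumed rule: derive the shard from the address bits.
    return int(address, 16) % NUM_SHARDS

def route(tx):
    if tx.get("contracts_touched", 0) > 1:
        return "Category 3: multiple contracts -> cross-shard coordination needed"
    same_shard = shard_for(tx["from"]) == shard_for(tx["to"])
    if tx.get("contracts_touched", 0) == 1 and not same_shard:
        return "Category 2: contract on another shard -> communication needed"
    return f"Category 1/2: handled inside shard {shard_for(tx['from'])}"

print(route({"from": "1a2b", "to": "3c4f"}))                          # simple transfer
print(route({"from": "1a2b", "to": "3c4f", "contracts_touched": 2}))  # Category 3
```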
 
And this is where the downside of state sharding comes in, and why Zilliqa skips it for now: all shards in Zilliqa have access to the complete state. Yes, the state (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means they don’t need to shop around for information held on other shards, which would require more communication and add more complexity. Links if you want to dig further (computer science and/or developer knowledge required): Scilla - language grammar; Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain; Gas Accounting; NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. Via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and is starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway, and 13% of them have already brought those initiatives live to the market. There is also an increasing list of organizations starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa already seems to be taking advantage of this and recently helped launch Hg Exchange on its platform, together with the financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa’s safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb and SpaceX that are not available for public trading right now; Hg Exchange will allow investors to buy shares of private companies and unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that it became a partner and shareholder in TEN31 Bank, a fully regulated bank allowing for the tokenization of assets that aims to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be an ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket, starting with XSGD. As many of you know, stablecoins are currently mostly used for trading; however, Zilliqa is actively trying to broaden their use cases. I recommend everybody read this text by Amrit Kumar (one of the co-founders). These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies, for example, start to use stablecoins for payments or remittances instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX made by Switcheo is coming to Zilliqa, allowing cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon-to-be-announced) USD stablecoin, and as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD has already been created and is visible on the block explorer, and XIDR (an Indonesian rupiah stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been six waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see the grantees’ breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa’s ecosystem, including individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with human-readable names and allows for uncensorable websites. Zilliqa will probably be the only chain able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland) and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and was recently added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don’t just have “tech people”: they have a mix of tech people, business people, marketers, scientists and more. Naturally, it’s good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, you’ll see that their engagement is unusually high for a coin with approximately 80k followers. They have also been named ‘coin of the day’ by LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data, and according to their data it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in recent months, Zilliqa seems to be on its own bull run: it was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram has over 20k members and is very active, and their community channel, now over 7k members, is more active and larger than many other projects’ official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world help with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to generate revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Compared to the other smart contract platforms (e.g. Cardano, EOS, Tezos, etc.), none seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. It is something you cannot buy with money, and its absence puts many projects in the space at a disadvantage.
 
Zilliqa has also released two social products, SocialPay and Zeeves. SocialPay allows users to earn ZIL while tweeting with a specific hashtag. They recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case, and I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to create a network effect is a very smart and innovative idea.
 
As for Zeeves, it is a tipping bot for Telegram. They already have thousands of signups and plan to keep upgrading it so more and more people use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to growing their communities and getting people familiar with ZIL, and I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deep understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit, haha). So many updates have been happening lately that it’s hard to keep up: the International Monetary Fund mentioning Zilliqa in its report, custodial and non-custodial staking, Binance margin, futures and the widget, entering the Indian market, and more. Head of Marketing Colin Miles has also released this overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter; I invite you to continue researching by yourself :-) And if you have any comments or questions, please post here!

All you need to know about Yield Farming - The rocket fuel for Defi

It’s effectively July 2017 in the world of decentralized finance (DeFi), and as in the heady days of the initial coin offering (ICO) boom, the numbers are only trending up.
According to DeFi Pulse, there is $1.9 billion in crypto assets locked in DeFi right now. According to the CoinDesk ICO Tracker, the ICO market started chugging past $1 billion in July 2017, just a few months before token sales started getting talked about on TV.
Debate juxtaposing these numbers if you like, but what no one can question is this: Crypto users are putting more and more value to work in DeFi applications, driven largely by the introduction of a whole new yield-generating pasture, Compound’s COMP governance token.
Governance tokens enable users to vote on the future of decentralized protocols, sure, but they also present fresh ways for DeFi founders to entice assets onto their platforms.
That said, it’s the crypto liquidity providers who are the stars of the present moment. They even have a meme-worthy name: yield farmers.


Where it started

Ethereum-based credit market Compound started distributing its governance token, COMP, to the protocol’s users this past June 15. Demand for the token (heightened by the way its automatic distribution was structured) kicked off the present craze and moved Compound into the leading position in DeFi.
The hot new term in crypto is “yield farming,” a shorthand for clever strategies where putting crypto temporarily at the disposal of some startup’s application earns its owner more cryptocurrency.
Another term floating about is “liquidity mining.”
The buzz around these concepts has evolved into a low rumble as more and more people get interested.
The casual crypto observer who only pops into the market when activity heats up might be starting to get faint vibes that something is happening right now. Take our word for it: Yield farming is the source of those vibes.
But if all these terms (“DeFi,” “liquidity mining,” “yield farming”) are so much Greek to you, fear not. We’re here to catch you up. We’ll get into all of them.
We’re going to go from very basic to more advanced, so feel free to skip ahead.

What are tokens?

Most CoinDesk readers probably know this, but just in case: Tokens are like the money video-game players earn while fighting monsters, money they can use to buy gear or weapons in the universe of their favorite game.
But with blockchains, tokens aren’t limited to only one massively multiplayer online money game. They can be earned in one and used in lots of others. They usually represent either ownership in something (like a piece of a Uniswap liquidity pool, which we will get into later) or access to some service. For example, in the Brave browser, ads can only be bought using basic attention token (BAT).
Tokens proved to be the big use case for Ethereum, the second-biggest blockchain in the world. The term of art here is “ERC-20 tokens,” which refers to a software standard that allows token creators to write rules for them. Tokens can be used in a few ways. Often, they are used as a form of money within a set of applications. So the idea for Kin was to create a token that web users could spend with each other at such tiny amounts that it would almost feel like they weren’t spending anything; that is, money for the internet.
Governance tokens are different. They are not like a token at a video-game arcade, as so many tokens were described in the past. They work more like certificates to serve in an ever-changing legislature in that they give holders the right to vote on changes to a protocol.
So on the platform that proved DeFi could fly, MakerDAO, holders of its governance token, MKR, vote almost every week on small changes to parameters that govern how much it costs to borrow and how much savers earn, and so on.
Read more: Why DeFi’s Billion-Dollar Milestone Matters
One thing all crypto tokens have in common, though, is they are tradable and they have a price. So, if tokens are worth money, then you can bank with them or at least do things that look very much like banking. Thus: decentralized finance.

What is DeFi?

Fair question. For folks who tuned out for a bit in 2018, we used to call this “open finance.” That construction seems to have faded, though, and “DeFi” is the new lingo.
In case that doesn’t jog your memory, DeFi is all the things that let you play with money, and the only identification you need is a crypto wallet.
I can explain this but nothing really brings it home like trying one of these applications. If you have an Ethereum wallet that has even $20 worth of crypto in it, go do something on one of these products. Pop over to Uniswap and buy yourself some FUN (a token for gambling apps) or WBTC (wrapped bitcoin). Go to MakerDAO and create $5 worth of DAI (a stablecoin that tends to be worth $1) out of the digital ether. Go to Compound and borrow $10 in USDC.
(Notice the very small amounts I’m suggesting. The old crypto saying “don’t put in more than you can afford to lose” goes double for DeFi. This stuff is uber-complex and a lot can go wrong. These may be “savings” products but they’re not for your retirement savings.)
Immature and experimental though it may be, the technology’s implications are staggering. On the normal web, you can’t buy a blender without giving the site owner enough data to learn your whole life history. In DeFi, you can borrow money without anyone even asking for your name.
DeFi applications don’t worry about trusting you because they have the collateral you put up to back your debt (on Compound, for instance, a $10 debt will require around $20 in collateral).
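A minimal sketch of that collateral rule, with an assumed collateral factor of 50% to match the $20-for-$10 example above (real markets set per-asset factors):

```python
# Over-collateralised borrowing: the protocol trusts your collateral, not you.
def max_borrow(collateral_usd, collateral_factor=0.5):
    # Borrowing above this limit is rejected; falling below it risks liquidation.
    return collateral_usd * collateral_factor

print(max_borrow(20))   # 10.0 -> $20 of collateral backs a $10 loan
```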
Read more: There Are More DAI on Compound Now Than There Are DAI in the World
If you do take this advice and try something, note that you can swap all these things back as soon as you’ve taken them out. Open the loan and close it 10 minutes later. It’s fine. Fair warning: It might cost you a tiny bit in fees, and the cost of using Ethereum itself right now is much higher than usual, in part due to this fresh new activity. But it’s nothing that should ruin a crypto user.
So what’s the point of borrowing for people who already have the money? Most people do it for some kind of trade. The most obvious example is to short a token (the act of profiting if its price falls). It’s also good for someone who wants to hold onto a token but still play the market.

Doesn’t running a bank take a lot of money up front?

It does, and in DeFi that money is largely provided by strangers on the internet. That’s why the startups behind these decentralized banking applications come up with clever ways to attract HODLers with idle assets.
Liquidity is the chief concern of all these different products. That is: How much money do they have locked in their smart contracts?
“In some types of products, the product experience gets much better if you have liquidity. Instead of borrowing from VCs or debt investors, you borrow from your users,” said Electric Capital managing partner Avichal Garg.
Let’s take Uniswap as an example. Uniswap is an “automated market maker,” or AMM (another DeFi term of art). This means Uniswap is a robot on the internet that is always willing to buy and it’s also always willing to sell any cryptocurrency for which it has a market.
On Uniswap, there is at least one market pair for almost any token on Ethereum. Behind the scenes, this means Uniswap can make it look like it is making a direct trade for any two tokens, which makes it easy for users, but it’s all built around pools of two tokens. And all these market pairs work better with bigger pools.

Why do I keep hearing about ‘pools’?

To illustrate why more money helps, let’s break down how Uniswap works.
Let’s say there was a market for USDC and DAI. These are two tokens (both stablecoins but with different mechanisms for retaining their value) that are meant to be worth $1 each all the time, and that generally tends to be true for both.
The price Uniswap shows for each token in any pooled market pair is based on the balance of each in the pool. So, simplifying this a lot for illustration’s sake, if someone were to set up a USDC/DAI pool, they should deposit equal amounts of both. In a pool with only 2 USDC and 2 DAI it would offer a price of 1 USDC for 1 DAI. But then imagine that someone put in 1 DAI and took out 1 USDC. Then the pool would have 1 USDC and 3 DAI. The pool would be very out of whack. A savvy investor could make an easy $0.50 profit by putting in 1 USDC and receiving 1.5 DAI. That’s a 50% arbitrage profit, and that’s the problem with limited liquidity.
(Incidentally, this is why Uniswap’s prices tend to be accurate, because traders watch it for small discrepancies from the wider market and trade them away for arbitrage profits very quickly.)
Read more: Uniswap V2 Launches With More Token-Swap Pairs, Oracle Service, Flash Loans
However, if there were 500,000 USDC and 500,000 DAI in the pool, a trade of 1 DAI for 1 USDC would have a negligible impact on the relative price. That’s why liquidity is helpful.
Similar effects hold across DeFi, so markets want more liquidity. Uniswap solves this by charging a tiny fee on every trade. It does this by shaving off a little bit from each trade and leaving that in the pool (so one DAI would actually trade for 0.997 USDC, after the fee, growing the overall pool by 0.003 USDC). This benefits liquidity providers because when someone puts liquidity in the pool they own a share of the pool. If there has been lots of trading in that pool, it has earned a lot of fees, and the value of each share will grow.
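For the curious, here is the pricing rule Uniswap actually uses, the “constant product” formula (reserve_x × reserve_y stays constant across a trade), with the 0.3% fee left in the pool. Running it on the tiny pool versus the deep pool shows exactly the slippage effect described above:

```python
# Constant-product AMM: reserve_in * reserve_out = k is preserved by each
# trade, and a 0.3% fee on the input stays in the pool for liquidity providers.

def swap(pool, amount_in, token_in="DAI"):
    token_out = "USDC" if token_in == "DAI" else "DAI"
    effective_in = amount_in * 0.997          # 0.3% fee stays in the pool
    k = pool["DAI"] * pool["USDC"]
    amount_out = pool[token_out] - k / (pool[token_in] + effective_in)
    pool[token_in] += amount_in
    pool[token_out] -= amount_out
    return amount_out

tiny = {"DAI": 2.0, "USDC": 2.0}
deep = {"DAI": 500_000.0, "USDC": 500_000.0}
print(swap(tiny, 1.0))   # ~0.665 USDC: heavy slippage in a shallow pool
print(swap(deep, 1.0))   # ~0.997 USDC: negligible slippage in a deep pool
```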
And this brings us back to tokens.
Liquidity added to Uniswap is represented by a token, not an account. So there’s no ledger saying, “Bob owns 0.000000678% of the DAI/USDC pool.” Bob just has a token in his wallet. And Bob doesn’t have to keep that token. He could sell it. Or use it in another product. We’ll circle back to this, but it helps to explain why people like to talk about DeFi products as “money Legos.”

So how much money do people make by putting money into these products?

It can be a lot more lucrative than putting money in a traditional bank, and that’s before startups started handing out governance tokens.
Compound is the current darling of this space, so let’s use it as an illustration. As of this writing, a person can put USDC into Compound and earn 2.72% on it. They can put tether (USDT) into it and earn 2.11%. Most U.S. bank accounts earn less than 0.1% these days, which is close enough to nothing.
However, there are some caveats. First, there’s a reason the interest rates are so much juicier: DeFi is a far riskier place to park your money. There’s no Federal Deposit Insurance Corporation (FDIC) protecting these funds. If there were a run on Compound, users could find themselves unable to withdraw their funds when they wanted.
Plus, the interest is quite variable. You don’t know what you’ll earn over the course of a year. USDC’s rate is high right now. It was low last week. Usually, it hovers somewhere in the 1% range.
Similarly, a user might get tempted by assets with more lucrative yields like USDT, which typically has a much higher interest rate than USDC. (Monday morning, the reverse was true, for unclear reasons; this is crypto, remember.) The trade-off here is USDT’s transparency about the real-world dollars it’s supposed to hold in a real-world bank is not nearly up to par with USDC’s. A difference in interest rates is often the market’s way of telling you the one instrument is viewed as dicier than another.
Users making big bets on these products turn to companies like Opyn and Nexus Mutual to insure their positions, because there are no government protections in this nascent space – more on the ample risks later on.
So users can stick their assets in Compound or Uniswap and earn a little yield. But that’s not very creative. Users who look for angles to maximize that yield: those are the yield farmers.

OK, I already knew all of that. What is yield farming?

Broadly, yield farming is any effort to put crypto assets to work and generate the most returns possible on those assets.
At the simplest level, a yield farmer might move assets around within Compound, constantly chasing whichever pool is offering the best APY from week to week. This might mean moving into riskier pools from time to time, but a yield farmer can handle risk.
“Farming opens up new price arbs [arbitrage] that can spill over to other protocols whose tokens are in the pool,” said Maya Zehavi, a blockchain consultant.
Because these positions are tokenized, though, they can go further.
In a simple example, a yield farmer might put 100,000 USDT into Compound. They will get a token back for that stake, called cUSDT. Let’s say they get 100,000 cUSDT back (the formula on Compound is crazy so it’s not 1:1 like that but it doesn’t matter for our purposes here).
They can then take that cUSDT and put it into a liquidity pool that takes cUSDT on Balancer, an AMM that allows users to set up self-rebalancing crypto index funds. In normal times, this could earn a small amount more in transaction fees. This is the basic idea of yield farming. The user looks for edge cases in the system to eke out as much yield as they can across as many products as it will work on.
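In spirit, stacking looks like the sketch below: each layer contributes its own return, and the farmer sums them up (while also summing up the risks). Every rate here is a hypothetical placeholder:

```python
# Stylised yield stacking: each protocol layer adds its own return.
layers = {
    "Compound lending APY": 0.025,   # hypothetical
    "Balancer pool-fee APY": 0.015,  # hypothetical
    "COMP incentive APY": 0.040,     # hypothetical
    "BAL incentive APY": 0.030,      # hypothetical
}

total = sum(layers.values())
print(f"Naive stacked APY: {total:.1%}")   # 11.0%, before gas costs and risk
```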
Right now, however, things are not normal, and they probably won’t be for a while.

Why is yield farming so hot right now?

Because of liquidity mining. Liquidity mining supercharges yield farming.
Liquidity mining is when a yield farmer gets a new token as well as the usual return (that’s the “mining” part) in exchange for the farmer’s liquidity.
“The idea is that stimulating usage of the platform increases the value of the token, thereby creating a positive usage loop to attract users,” said Richard Ma of smart-contract auditor Quantstamp.
The yield farming examples above are only farming yield off the normal operations of different platforms. Supply liquidity to Compound or Uniswap and get a little cut of the business that runs over the protocols – very vanilla.
But Compound announced earlier this year it wanted to truly decentralize the product and it wanted to give a good amount of ownership to the people who made it popular by using it. That ownership would take the form of the COMP token.
Lest this sound too altruistic, keep in mind that the people who created it (the team and the investors) owned more than half of the equity. By giving away a healthy proportion to users, that was very likely to make it a much more popular place for lending. In turn, that would make everyone’s stake worth much more.
So, Compound announced this four-year period where the protocol would give out COMP tokens to users, a fixed amount every day until it was gone. These COMP tokens control the protocol, just as shareholders ultimately control publicly traded companies.
Every day, the Compound protocol looks at everyone who had lent money to the application and who had borrowed from it and gives them COMP proportional to their share of the day’s total business.
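Mechanically, that daily handout is a simple pro-rata split. The widely reported figure at launch was 2,880 COMP per day; treat that number, and this sketch, as approximate:

```python
# Pro-rata COMP distribution: each user's cut is proportional to their share
# of the day's total lending/borrowing activity.
DAILY_COMP = 2_880   # approximate daily emission at launch

def distribute(activity_usd):
    total = sum(activity_usd.values())
    return {user: DAILY_COMP * v / total for user, v in activity_usd.items()}

print(distribute({"alice": 1_000_000, "bob": 250_000, "carol": 50_000}))
# alice ~2215 COMP, bob ~554, carol ~111
```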
The results were very surprising, even to Compound’s biggest promoters.
This was a brand-new kind of yield on a deposit into Compound. In fact, it was a way to earn a yield on a loan, as well, which is very weird: Who has ever heard of a borrower earning a return on a debt from their lender?
COMP’s value has consistently been well over $200 since it started distributing on June 15. We did the math elsewhere but long story short: investors with fairly deep pockets can make a strong gain maximizing their daily returns in COMP. It is, in a way, free money.
It’s possible to lend to Compound, borrow from it, deposit what you borrowed and so on. This can be done multiple times and DeFi startup Instadapp even built a tool to make it as capital-efficient as possible.
“Yield farmers are extremely creative. They find ways to ‘stack’ yields and even earn multiple governance tokens at once,” said Spencer Noon of DTC Capital.
COMP’s value spike is a temporary situation. The COMP distribution will only last four years and then there won’t be any more. Further, most people agree that the high price now is driven by the low float (that is, how much COMP is actually free to trade on the market – it will never be this low again). So the value will probably gradually go down, and that’s why savvy investors are trying to earn as much as they can now.
Appealing to the speculative instincts of diehard crypto traders has proven to be a great way to increase liquidity on Compound. This fattens some pockets but also improves the user experience for all kinds of Compound users, including those who would use it whether they were going to earn COMP or not.
As usual in crypto, when entrepreneurs see something successful, they imitate it. Balancer was the next protocol to start distributing a governance token, BAL, to liquidity providers. Flash loan provider bZx has announced a plan. Ren, Curve and Synthetix also teamed up to promote a liquidity pool on Curve.
It is a fair bet many of the more well-known DeFi projects will announce some kind of coin that can be mined by providing liquidity.
The case to watch here is Uniswap versus Balancer. Balancer can do the same thing Uniswap does, but most users who want to do a quick token trade through their wallet use Uniswap. It will be interesting to see if Balancer’s BAL token convinces Uniswap’s liquidity providers to defect.
So far, though, more liquidity has gone into Uniswap since the BAL announcement, according to its data site. That said, even more has gone into Balancer.

Did liquidity mining start with COMP?

No, but it was the most-used protocol with the most carefully designed liquidity mining scheme.
This point is debated but the origins of liquidity mining probably date back to Fcoin, a Chinese exchange that created a token in 2018 that rewarded people for making trades. You won’t believe what happened next! Just kidding, you will: People just started running bots to do pointless trades with themselves to earn the token.
Similarly, EOS is a blockchain where transactions are basically free, but since nothing is really free the absence of friction was an invitation for spam. Some malicious hacker who didn’t like EOS created a token called EIDOS on the network in late 2019. It rewarded people for tons of pointless transactions and somehow got an exchange listing.
These initiatives illustrated how quickly crypto users respond to incentives.
Read more: Compound Changes COMP Distribution Rules Following ‘Yield Farming’ Frenzy
Fcoin aside, liquidity mining as we now know it first showed up on Ethereum when the marketplace for synthetic tokens, Synthetix, announced in July 2019 an award in its SNX token for users who helped add liquidity to the sETH/ETH pool on Uniswap. By October, that was one of Uniswap’s biggest pools.
When Compound Labs, the company that launched the Compound protocol, decided to create COMP, the governance token, the firm took months designing just what kind of behavior it wanted and how to incentivize it. Even still, Compound Labs was surprised by the response. It led to unintended consequences such as crowding into a previously unpopular market (lending and borrowing BAT) in order to mine as much COMP as possible.
Just last week, 115 different COMP wallet addresses – senators in Compound’s ever-changing legislature – voted to change the distribution mechanism in hopes of spreading liquidity out across the markets again.

Is there DeFi for bitcoin?

Yes, on Ethereum.
Nothing has beaten bitcoin over time for returns, but there’s one thing bitcoin can’t do on its own: create more bitcoin.
A smart trader can get in and out of bitcoin and dollars in a way that will earn them more bitcoin, but this is tedious and risky. It takes a certain kind of person.
DeFi, however, offers ways to grow one’s bitcoin holdings – though somewhat indirectly.
For example, a user can create a simulated bitcoin on Ethereum using BitGo’s WBTC system. They put BTC in and get the same amount back out in freshly minted WBTC. WBTC can be traded back for BTC at any time, so it tends to be worth the same as BTC.
Then the user can take that WBTC, stake it on Compound and earn a few percent each year in yield on their BTC. Odds are, the people who borrow that WBTC are probably doing it to short BTC (that is, they will sell it immediately, buy it back when the price goes down, close the loan and keep the difference).
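The short seller’s arithmetic, reduced to its simplest form (prices invented; a real short also pays interest on the borrowed WBTC):

```python
borrowed_wbtc = 1.0
sell_price = 9_000      # sell the borrowed WBTC now...
buy_back_price = 8_000  # ...buy it back cheaper to repay the loan

profit_usd = borrowed_wbtc * (sell_price - buy_back_price)
print(profit_usd)       # 1000.0, while the lender earned yield on the loan
```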
A long HODLer is happy to gain fresh BTC off their counterparty’s short-term win. That’s the game.

How risky is it?

Enough.
“DeFi, with the combination of an assortment of digital funds, automation of key processes, and more complex incentive structures that work across protocols – each with their own rapidly changing tech and governance practices – make for new types of security risks,” said Liz Steininger of Least Authority, a crypto security auditor. “Yet, despite these risks, the high yields are undeniably attractive to draw more users.”
We’ve seen big failures in DeFi products. MakerDAO had one so bad this year it’s called “Black Thursday.” There was also the exploit against flash loan provider bZx. These things do break and when they do money gets taken.
Right now, the deal is too good for certain funds to resist, so they are moving a lot of money into these protocols to liquidity mine all the new governance tokens they can. But the funds – entities that pool the resources of typically well-to-do crypto investors – are also hedging. Nexus Mutual, a DeFi insurance provider of sorts, told CoinDesk it has maxed out its available coverage on these liquidity applications. Opyn, the trustless derivatives maker, created a way to short COMP, just in case this game comes to naught.
And weird things have arisen. For example, there’s currently more DAI on Compound than have been minted in the world. This makes sense once unpacked but it still feels dicey to everyone.
That said, distributing governance tokens might make things a lot less risky for startups, at least with regard to the money cops.
“Protocols distributing their tokens to the public, meaning that there’s a new secondary listing for SAFT tokens, [gives] plausible deniability from any security accusation,” Zehavi wrote. (The Simple Agreement for Future Tokens was a legal structure favored by many token issuers during the ICO craze.)
Whether a cryptocurrency is adequately decentralized has been a key feature of ICO settlements with the U.S. Securities and Exchange Commission (SEC).

What’s next for yield farming? (A prediction)

COMP turned out to be a bit of a surprise to the DeFi world, in technical ways and others. It has inspired a wave of new thinking.
“Other projects are working on similar things,” said Nexus Mutual founder Hugh Karp. In fact, informed sources tell CoinDesk brand-new projects will launch with these models.
We might soon see more prosaic yield farming applications. For example, forms of profit-sharing that reward certain kinds of behavior.
Imagine if COMP holders decided, for example, that the protocol needed more people to put money in and leave it there longer. The community could create a proposal that shaved off a little of each token’s yield and paid that portion out only to the tokens that were older than six months. It probably wouldn’t be much, but an investor with the right time horizon and risk profile might take it into consideration before making a withdrawal.
(There are precedents for this in traditional finance: A 10-year Treasury bond normally yields more than a one-month T-bill even though they’re both backed by the full faith and credit of Uncle Sam, a 12-month certificate of deposit pays higher interest than a checking account at the same bank, and so on.)
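To make the hypothetical concrete, here is a sketch of such a loyalty scheme; every parameter is invented for illustration:

```python
# Hypothetical loyalty scheme: shave a slice off everyone's yield and pay it
# only to tokens held longer than six months. All numbers are invented.

def adjusted_payouts(holders, base_yield=0.05, shave=0.002):
    pot = sum(h["balance"] for h in holders) * shave
    loyal_balance = sum(h["balance"] for h in holders if h["age_months"] >= 6)
    payouts = {}
    for h in holders:
        bonus = pot * h["balance"] / loyal_balance if h["age_months"] >= 6 else 0.0
        payouts[h["name"]] = h["balance"] * (base_yield - shave) + bonus
    return payouts

print(adjusted_payouts([
    {"name": "newcomer", "balance": 1_000, "age_months": 1},
    {"name": "veteran",  "balance": 1_000, "age_months": 12},
]))   # newcomer: 48.0, veteran: 52.0
```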
As this sector gets more robust, its architects will come up with ever more robust ways to optimize liquidity incentives in increasingly refined ways. We could see token holders greenlighting more ways for investors to profit from DeFi niches.
Questions abound for this nascent industry: What will MakerDAO do to restore its spot as the king of DeFi? Will Uniswap join the liquidity mining trend? Will anyone stick all these governance tokens into a decentralized autonomous organization (DAO)? Or would that be a yield farmers co-op?
Whatever happens, crypto’s yield farmers will keep moving fast. Some fresh fields may open and some may soon bear much less luscious fruit.
But that’s the nice thing about farming in DeFi: It is very easy to switch fields.

Want to know why NEM should be as popular as Ethereum? This will be bigger than any altcoin you see; here's why


Altcoin Explorer: NEM (XEM), the Enterprise-Grade Blockchain Platform


Nestled among the top 40 cryptocurrencies by reported market cap, New Economy Movement, popularly known as NEM (XEM), is a peer-to-peer (P2P), dual-layer blockchain smart contract platform written in Java, one of the most influential programming languages. NEM uses the proof-of-importance (PoI) consensus algorithm, which essentially values both the tokens held and the activity conducted by the nodes on the blockchain network.
In this Altcoin Explorer, BTCManager delves deeper into the finer intricacies of the NEM blockchain protocol, including the project’s POI consensus algorithm, its native digital token XEM, and some of its real-world use-cases.
Without further ado, let’s get to the core of this high-performance distributed ledger technology (DLT) platform.

History of NEM

NEM was launched on March 31, 2015, with an aim to develop an enterprise-grade blockchain protocol that could circumvent the infamous trilemma of blockchain: scalability, speed, and privacy.
Operated by the Gibraltar-registered NEM Group, NEM began as a fork of the NXT blockchain. After the successful fork, the NEM community decided to build its ecosystem from the ground up and developed its own codebase to make the network more scalable and faster.
NEM’s insistence on building its own tech infrastructure led to a DLT protocol unlike any other similar platform.
Today, NEM ranks among the top go-to blockchain platforms for enterprises across the world, rivaling competing protocols including Ethereum (ETH), and TRON (TRX), among others.

NEM’s Proof-of-Importance (POI) Algorithm

Unlike Bitcoin’s (BTC) energy-intensive Proof-of-Work (PoW) and Ethereum’s yet-to-be-implemented Proof-of-Stake (PoS) consensus algorithms, NEM uses the PoI consensus mechanism.
The PoI mechanism achieves consensus by incentivizing active user participation in the NEM network. This consensus infrastructure ensures an agile decentralized network by rewarding well-behaved nodes that not only possess a significant stake in the network but are also actively engaged in executing transactions to maintain the network’s robustness.
Specifically, each node in the network has an ‘importance score’ that determines how often that node can ‘harvest’ XEM.
Initially, when a user deposits XEM tokens into their wallet, they are ‘unvested coins.’ Over time, as the wallet accumulates XEM and contributes to the network’s transaction volume, it builds up an importance score. At the same time, the XEM tokens in the wallet gradually become ‘vested coins,’ provided the wallet holds at least 10,000 tokens.
To put things into perspective, let’s walk through a small example.
On day 1, Joe receives 50,000 XEM in his digital wallet. Each day after that, the NEM network ‘vests’ 10 percent of Joe’s remaining unvested tokens. So, on day 2, 5,000 of Joe’s tokens are vested, leaving 45,000 unvested. On day 3, 10 percent of the remaining 45,000 tokens – 4,500 XEM – are vested, bringing his vested total to 9,500 with 40,500 still unvested, and so on. On day 4, the amount of XEM vested by Joe crosses the 10,000-coin threshold, making him eligible to seek rewards from the NEM blockchain for vesting his tokens.
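As a sanity check on the arithmetic, here is a minimal sketch of that vesting schedule, assuming the flat 10-percent-per-day rate described above; the class name and output format are my own.

// Minimal sketch of the vesting schedule described above: each day, 10% of
// the still-unvested balance vests. The 50,000 XEM deposit and 10,000 XEM
// threshold come from the example; everything else is illustrative.
public class VestingSketch {
    public static void main(String[] args) {
        double unvested = 50_000; // Joe's day-1 deposit
        double vested = 0;
        int day = 1;
        while (vested < 10_000) {
            day++;
            double vestedToday = unvested * 0.10;
            vested += vestedToday;
            unvested -= vestedToday;
            System.out.printf("day %d: +%.0f vested (total %.0f, unvested %.0f)%n",
                    day, vestedToday, vested, unvested);
        }
        // day 2: +5000 (5000/45000), day 3: +4500 (9500/40500),
        // day 4: +4050 (13550/36450) -> threshold crossed
    }
}

Running it confirms the figures above: the vested balance crosses 10,000 XEM on day 4.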
Close followers of blockchain projects will notice that this reward mechanism bears a close resemblance to the PoS consensus algorithm. However, it is worth noting that vesting coins is just one input to a node’s importance score.
The NEM protocol also rewards the nodes responsible for the most activity on the network. In essence, the more transactions a node executes, the higher its importance score is likely to be. Nodes must therefore balance vested XEM against network activity, as both directly affect their likelihood of harvesting XEM.
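NEM’s actual importance computation is a graph-based calculation over the transaction graph, but a deliberately simplified blend of the two ingredients named above, vested stake and recent activity, conveys the idea. The 0.7/0.3 weights and every number below are illustrative assumptions, not NEM’s real formula.

// Toy importance score: a weighted blend of a node's share of vested stake
// and its share of recent transaction activity. The weights are invented
// for illustration; NEM's real PoI formula is considerably more involved.
public class ToyImportance {
    static double importance(double vestedXem, long recentTx,
                             double totalVested, long totalTx) {
        double stakeShare = vestedXem / totalVested;           // share of all vested XEM
        double activityShare = (double) recentTx / totalTx;    // share of recent transactions
        return 0.7 * stakeShare + 0.3 * activityShare;
    }

    public static void main(String[] args) {
        // Two nodes with equal vested stake; the busier node scores higher,
        // so it gets to harvest blocks more often.
        System.out.println(importance(20_000, 50, 100_000, 100)); // 0.29
        System.out.println(importance(20_000, 10, 100_000, 100)); // 0.17
    }
}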
NEM’s consensus algorithm does away with several issues plaguing more energy-intensive protocols such as PoW. For instance, PoI does not require power-hungry hardware to run a node: almost any machine, irrespective of its configuration, can participate in the NEM ecosystem, which helps the network stay decentralized.

NEM’s Native Digital Token — XEM

XEM, unlike the vast majority of other cryptocurrencies, isn’t mined or staked using PoW or PoS algorithms. Rather, as explained earlier, XEM is ‘harvested’ through the PoI algorithm, which ensures a steady supply of the digital token without flooding the market and risking a dramatic price crash.
Per data on CoinMarketCap, at the time of writing, XEM trades at $0.04 with a market cap of more than $382 million and a 24-hour trading volume of approximately $6.8 million. The coin reached its all-time high of $1.92 in January 2018.
A large number of reputable cryptocurrency exchanges trade XEM, including Binance, Upbit, OKEx, Bithumb, and ProBit. The digital token can be easily traded against BTC, ETH, and USDT pairs.
That said, if you wish to vest your XEM to partake in the maintenance of the NEM network and earn rewards, it is recommended you store your tokens in the official NEM Nano wallet for desktop and mobile OS. Only XEM tokens held in the official NEM Nano wallet are eligible for vesting.

NEM Use-Cases

To date, NEM has been deployed for various real-world applications with promising results.
In 2018, Ukraine launched a blockchain-based e-voting trial leveraging the NEM DLT platform.
At the time, Ukraine’s Central Election Commission – working with the local NEM Foundation representation – estimated that a test vote trial in each polling station could cost as little as $1,227. The organization’s Oleksandr Stelmakh lauded the effort, saying that a blockchain-powered voting mechanism would make it impossible for anyone to tamper with the records. The Commission added that the NEM protocol presents information in a more user-friendly format for voters.
In the same year, Malaysia’s Ministry of Education launched an e-scroll system based on the NEM blockchain to tackle the menace of fake degrees. The University Degree Issuance and Verification System uses the NEM blockchain, which is queried when the QR code printed on a degree certificate is scanned.
The Ministry added that one of the primary reasons for its decision to select the NEM platform was its unique, cutting-edge features for managing traceability and authentication requirements.
More recently, the Bank of Lithuania announced that it would issue its NEM blockchain-powered digital collector’s coin (LBCoin) in July, after the successful completion of its testing phase.

Final Thoughts

Summing up, NEM offers a wide array of in-house features that separate it from other blockchain projects in an increasingly congested space. NEM’s PoI consensus algorithm is a fresh take on PoS aimed at better performance. Further, the project’s newly launched enterprise-grade DLT solution, Symbol, offers businesses a compelling way to cut costs, reduce complexity, and streamline innovation.
NEM is written in Java, which makes it an easy project for developers to get involved with, unlike projects such as Ethereum whose smart contracts require a platform-specific language, Solidity. The project’s tech infrastructure not only makes it less power-intensive than Bitcoin but also more scalable than rival projects, including Ethereum and NEO.
NEM’s tagline, “Smart Asset Blockchain, Built for Performance,” captures everything the project has to offer. Over the years, NEM’s active developer community has deftly addressed the notorious bottlenecks of the vast majority of blockchain solutions. The future looks promising for NEM as it continues to foster a trustless, blockchain-driven economy for tomorrow.
Source
submitted by charlesgwynne to CryptocurrencyICO [link] [comments]



If you have been paying attention to bitcoin at all lately, you may have noticed a lot of talk going on about ‘forks’. Not like the kind you would find on a table: on a blockchain, a fork is a ...
Bitcoin is the first decentralised digital currency. Bitcoins are simply digital coins you can send through the internet, directly from person to person without going through a bank. So: less hassle, lower fees, and no restrictions (such as country). Your bitcoins are kept in a digital wallet; no bank is needed. They are a global currency and can be spent anywhere on anything.
Following the community responses to Bitcoin ABC’s blog post, Btc.top founder Jiang Zhuoer published an updated version of the plan the following day.
The locktime (called nLockTime in the Bitcoin Core source code) indicates the earliest time a transaction can be added to the block chain. Locktime allows signers to create time-locked transactions which only become valid in the future, giving the signers a chance to change their minds. If any of the signers change their mind, they can create a new non-locktime transaction. The new ...
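Since the snippet above mentions nLockTime, here is a self-contained sketch of the rule it describes, under the standard convention that a locktime below 500,000,000 is read as a block height and anything at or above it as a Unix timestamp. The class and method names are illustrative, not Bitcoin Core’s actual API, and the sketch omits the separate rule that a transaction whose inputs all use the maximum sequence number is final regardless of locktime.

// Sketch of the nLockTime finality rule. Names are illustrative, not
// Bitcoin Core's API; the max-sequence-number exemption is omitted.
public class LockTimeRule {
    // Below this threshold a locktime is read as a block height;
    // at or above it, as a Unix timestamp.
    static final long LOCKTIME_THRESHOLD = 500_000_000L;

    // True if a transaction with this locktime may appear in a candidate
    // block with the given height and timestamp.
    static boolean isFinal(long lockTime, long blockHeight, long blockTimeUnix) {
        if (lockTime == 0) return true; // no time lock at all
        long cutoff = (lockTime < LOCKTIME_THRESHOLD) ? blockHeight : blockTimeUnix;
        return lockTime < cutoff; // must be strictly earlier than the block
    }

    public static void main(String[] args) {
        System.out.println(isFinal(700_000, 700_000, 0)); // false: too early
        System.out.println(isFinal(700_000, 700_001, 0)); // true
        // Locked until roughly 2030 (Unix time), checked against a 2023 block:
        System.out.println(isFinal(1_893_456_000L, 800_000, 1_700_000_000L)); // false
    }
}

This is what gives signers their window to change their minds: until the locktime passes, a replacement transaction spending the same inputs can still be confirmed first.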

