
      Erlang Solutions: Exploring Key Trends in Digital Payments

      news.movim.eu / PlanetJabber · 3 days ago - 09:55 · 5 minutes

    Digital payments are essential to the global economy and have seen rapid and significant changes in recent years.

    Let’s take a look at the key trends driving this change and at some of the emerging digital trends broadening the payments ecosystem, as well as how payments work within it.

    A look into the digital payments landscape

    Evolving customer expectations and technological advances are driving innovation. Customers now prioritise speed, near-real-time payments, frictionless transactions and decentralised models. Fuelled by the pandemic, the significant growth of digital commerce has led to record payment volumes in most markets. These factors make payments one of the most interesting areas of financial services. There are opportunities for innovative fintechs to provide better client experiences and for traditional players to expand their services.

    Market competition is driving down fees. It is a challenge for traditional players to maintain the same levels of profitability while using existing payment infrastructure. We have seen fintech businesses launch into the payments ecosystem, offering a more diverse range of services.

    Traditional payment companies are responding by leveraging the huge amounts of data at their disposal to guide a strategy of adding to their offering. These new services are in areas including loyalty, tailored offers, data insights, risk management and more.

    A cashless world leads the way

    Consumers’ shift to digital channels drives demand for seamless fulfilment and instant gratification. A recent Capgemini World Payments Report survey found that the share of respondents for whom e-commerce accounts for more than half of their monthly spending rose from 24% before the pandemic to 46% now.

    [Chart: retail e-commerce sales worldwide revenue]

    According to Statista, 91% of the global population is expected to own a smartphone by 2026.

    A majority of people have now experienced the efficiencies offered by digital payments. It is unlikely they will ever return to the older, inefficient ways of the past.

    Nayapay, one of our clients in the South Asian market, is using the MongooseIM chat engine. They are an example of players in the payments space seizing the opportunity to disrupt local markets.

    Their chat-based payments app targets the unbanked in Pakistan. It is built around fusing the penetration of smartphone usage with people’s willingness to integrate transactions into their daily digital activities, adding ease of cashless payments to their everyday lives.

    The growing demand for faster payments

    Demand for instant transactions is driving change in cross-border payments, international remittances and e-commerce. Mirroring the speed of cash transactions electronically used to be a challenge. Now, the introduction of real-time clearing and settlement facilities across markets makes processing payments almost instant.

    Studies by the US Federal Reserve Financial Services showed strong growth of digital wallets in 2023. Businesses increased their use by 31% from the previous year, and consumers by 32%.

    Here are some more statistics on the most popular use cases for faster payments:

    [Chart: most popular use cases for faster payments. Source: Federal Reserve study sheet]

    Peer-to-peer transactions

    As shown by the chart, one of the most popular cases is person-to-person, or peer-to-peer (P2P) payments.

    Consumers are embracing the simplicity of peer-to-peer services. Zelle, Venmo (US), Kuflink and easyMoney (UK) are commonly used for everyday transactions. These services are important to people seeking quick, hassle-free ways to settle informal payments.

    The availability of P2P services is expected to expand to meet the growing market demand.

    According to Precedence Research, the global peer-to-peer (P2P) lending market was valued at USD 110.9 billion in 2023 and is expected to exceed USD 1,168.1 billion by 2033.
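Those two figures imply a striking growth rate. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Implied compound annual growth rate (CAGR) from the Precedence Research
# figures: USD 110.9B in 2023 growing to USD 1,168.1B by 2033 (10 years).
start, end, years = 110.9, 1168.1, 10

cagr = ((end / start) ** (1 / years) - 1) * 100
print(f"Implied CAGR: {cagr:.1f}% per year")  # roughly 26.5% per year
```

In other words, the forecast assumes the market grows by roughly a quarter every year for a decade.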


    From peer-to-peer services that enable informal transactions to the widespread adoption of digital payments, consumers are welcoming the future of finance. As technology continues to reshape how we conduct transactions, the prospect of a cashless society becomes more conceivable.

    Growth in embedded payments

    Lengthy checkout pages are a turn-off for e-commerce customers. Embedded payments let them skip the additional steps: instead of a single clickable button on your app or website, the customer chooses their preferred payment method, such as Klarna, Amazon Pay or PayPal, clicks the embedded link and completes the transaction.

    Amazon: pioneering embedded payments

    Amazon customers can log into their accounts that already contain stored payment details and shipping addresses. They then use the “Buy Now” button to instantly complete their purchase.


    It requires only a payment confirmation and avoids the need to re-enter payment and shipping information. This quick transaction process takes just seconds and has become commonplace with apps such as Uber, GrubHub, and more.

    Integration of embedded finance

    By integrating financial products into non-financial platforms, embedded finance is enhancing the convenience and speed of digital payments.

    For consumers, embedded finance offers additional benefits including:

    • Better understanding of optimal payment terms for customers
    • Seamless checkouts
    • Easy payment requests
    • Financing options such as buy now, pay later (BNPL), all within a unified customer experience

    Beyond BNPL, other financial products like lending and card issuing are also being integrated into these platforms. Major banks can reach millions of new users through Banking-as-a-Service (BaaS) APIs provided to technology businesses and platforms outside the traditional financial services industry.

    Leveraging payments data

    The diverse range of digital touchpoints involved in a cashless payments ecosystem provides vast amounts of data.

    This data matters to banks and fintechs looking to grow client relationships through analytics and insights. Companies that unlock the true value of payment activity data by leveraging artificial intelligence (AI) and machine learning (ML) tools can offer more efficient, tailored products and a more secure, protected environment.

    The implementation of the messaging standard ISO 20022 is a vital part of improving the amount and quality of payment data available. As the global standard for payment messaging, ISO 20022 provides better-structured and more granular data: a shared language for transactions made by anyone, anywhere.
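For illustration, here is a heavily simplified fragment in the spirit of an ISO 20022 credit transfer message (element names follow the standard's abbreviated conventions; the values are hypothetical):

```xml
<CdtTrfTxInf>
  <PmtId>
    <EndToEndId>INV-2024-0042</EndToEndId>
  </PmtId>
  <IntrBkSttlmAmt Ccy="EUR">125.50</IntrBkSttlmAmt>
  <Dbtr><Nm>Example Buyer Ltd</Nm></Dbtr>
  <Cdtr><Nm>Example Merchant GmbH</Nm></Cdtr>
  <RmtInf><Ustrd>Invoice 42, May order</Ustrd></RmtInf>
</CdtTrfTxInf>
```

Because every field is explicitly structured and typed, the same message can be parsed identically by any institution that speaks the standard, which is what makes the richer data usable at scale.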

    Journey towards digital

    The modern digital payments ecosystem is varied. Overall, the journey towards more digital, open and real-time operations mirrors how society at large now lives online.

    We help to create digital and mobile payment solutions that enhance the customer experience and protect customer data. We work with clients across payment services, cryptocurrency, blockchain, embedded finance, payment gateways and more. To learn more about our offering, you can contact our team directly.

    The post Exploring Key Trends in Digital Payments appeared first on Erlang Solutions .


      www.erlang-solutions.com/blog/exploring-key-trends-in-digital-payments/


      Erlang Solutions: Top 5 Tips to Ensure IoT Security for Your Business

      news.movim.eu / PlanetJabber · Thursday, 13 June - 11:01 · 9 minutes

    In an increasingly tech-driven world, the implementation of IoT for business is a given. According to the latest data, there are currently 17.08 billion connected IoT devices – and counting. A growing number of devices requires robust IoT security to maintain privacy, protect sensitive data and prevent unauthorised access to connected devices.

    A single compromised device can be a threat to an entire network. For businesses, it can lead to major financial losses, operational disruptions and a serious impact on brand reputation. We will take you through five key considerations to ensure IoT security for your business: data encryption methods, password management, IoT audits, workplace education and the importance of disabling unused features.

    Secure password practices

    Weak passwords make IoT devices susceptible to unauthorised access, leading to data breaches, privacy violations and increased security risks. When companies install devices without changing default passwords, or create oversimplified ones, they create an entry point for attackers. Implementing strong, unique passwords protects against these threats.

    Password managers

    Each device in a business should have its own unique password that should change on a regular basis. According to the 2024 IT Trends Report by JumpCloud, 83% of organisations surveyed use password-based authentication for some IT resources.

    Consider using a business-wide password manager that stores your passwords securely and allows you to use unique passwords across multiple accounts.

    Password managers are also incredibly important as they:

    • Help to spot fake websites, protecting you from phishing scams and attacks.
    • Allow you to synchronise passwords across multiple devices, making it easy and safe to log in wherever you are.
    • Track if you are re-using the same password across different accounts for additional security.
    • Spot any password changes that could appear to be a breach of security.
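As a minimal sketch of the "unique password per device" practice above, here is how high-entropy passwords could be generated with Python's standard-library `secrets` module. The 20-character policy and the device names are illustrative assumptions, not a recommendation from the original post:

```python
import secrets
import string

# Alphabet for generated passwords: letters, digits and punctuation.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password using the stdlib CSPRNG."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One unique password per device, so a breach of one endpoint exposes nothing else.
passwords = {device: generate_password() for device in ("router", "camera", "printer")}
```

A real deployment would store these in the password manager rather than in memory, but the core point stands: never derive one device's credential from another's.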

    Multi-factor authentication (MFA)

    Multi-factor authentication (MFA) adds an additional layer of security. It requires additional verification beyond just a password, such as SMS codes, biometric data or other forms of app-based authentication. You’ll find that many password managers actually offer built-in MFA features for enhanced security.

    Some additional security benefits include:

    • Regulatory compliance
    • Safeguarding without password fatigue
    • Easily adaptable to a changing work environment
    • An extra layer of security compared to two-factor authentication (2FA)
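The app-based authentication mentioned above is typically a time-based one-time password. Below is a compact sketch of the underlying HOTP/TOTP algorithms (RFC 4226 and RFC 6238), using only the Python standard library; production systems should use a vetted library rather than hand-rolled code:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: low nibble picks the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based OTP: HOTP over the current 30-second window."""
    return hotp(key, int(time.time()) // step, digits)

# Counter 0 with this ASCII key is the published RFC 4226 test vector: "755224".
print(hotp(b"12345678901234567890", 0))
```

The printed value matching the RFC 4226 test vector is a handy sanity check for any implementation.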

    As soon as an IoT device is connected to a new network, it is strongly recommended that you replace its default credentials with a secure, complex password. Using a password manager to generate a unique password for each device secures your IoT endpoints optimally.

    Data encryption at every stage

    Why is data encryption so necessary? With the continued growth of connected devices, data protection is a growing concern. In IoT, sensitive information (personal, financial, location data, etc.) is vulnerable to cyber-attacks if transmitted over public networks. Done correctly, data encryption renders personal data unreadable to anyone without authorised access. Once encrypted, that data is safeguarded, mitigating unnecessary risk.

    [Image: additional benefits of data encryption]

    How to encrypt data in IoT devices

    There are a few data encryption techniques available to secure IoT devices from threats. Here are some of the most popular techniques:

    Triple Data Encryption Standard (Triple DES): Applies three rounds of DES encryption to secure data, offering a high level of security; it was long used for mission-critical applications but is now considered legacy.

    Advanced Encryption Standard (AES): A commonly used encryption standard, known for its high security and performance. It is used by the US federal government to protect classified information.

    Rivest-Shamir-Adleman (RSA): This is based on public and private keys, used for secure data transfer and digital signatures.

    Each encryption technique has its strengths, but it is crucial to choose what best suits the specific requirements of your business.
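To make the RSA idea above concrete, here is a toy textbook round-trip with the classic small-prime example. It is purely illustrative of the public/private key mechanism and nothing like a production implementation, which would use 2048-bit keys and a padding scheme:

```python
# Toy, textbook RSA with tiny primes -- for illustration only.
p, q = 61, 53
n = p * q                      # modulus, part of both keys
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

message = 65                   # a message encoded as an integer < n
ciphertext = pow(message, e, n)     # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)   # decrypt with the private key (d, n)
assert recovered == message
```

Anyone can encrypt with the public pair (e, n), but only the holder of d can decrypt, which is exactly the property used for secure data transfer and digital signatures.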

    Encryption support with Erlang/Elixir

    When implementing data encryption protocols for IoT security, Erlang and Elixir offer great support to ensure secure communication between IoT devices. We go into greater detail about IoT security with Erlang and Elixir in a previous article, but here is a reminder of the capabilities that make them ideal for IoT applications:

    1. Concurrent and fault-tolerant nature: Erlang and Elixir have the ability to handle multiple concurrent connections and processes at the same time. This ensures that encryption operations do not bottleneck the system, allowing businesses to maintain high-performing, reliable systems through varying workloads.
    2. Built-in libraries: Both languages come with powerful libraries, providing effective tools for implementing encryption standards, such as AES and RSA.
    3. Scalable: Both systems are inherently scalable, allowing for secure data handling across multiple IoT devices.
    4. Easy integration: The syntax of Elixir makes it easier to integrate encryption protocols within IoT systems. This reduces development time and increases overall efficiency for businesses.

    Erlang and Elixir can be powerful tools for businesses, enhancing the security of IoT devices and delivering high-performance systems that ensure robust encryption support for peace of mind.

    Regular IoT inventory audits

    Performing regular security audits of your systems can be critical in protecting against vulnerabilities. Keeping up with the pace of IoT innovation often means some IoT security considerations get pushed to the side. But identifying weaknesses in existing systems allows organisations to implement much-needed strategies.

    Types of IoT security testing

    We’ve explained how IoT audits are key in maintaining secure systems. Now let’s take a look at some of the common types of IoT security testing options available:

    [Image: IoT security testing types]

    Firmware software analysis

    Firmware analysis is a key part of IoT security testing. It examines the firmware, the core software embedded in the hardware of IoT products (routers, monitors, etc.). Examining the firmware means security tests can identify system vulnerabilities that might not be initially apparent, improving the overall security of business IoT devices.

    Threat modelling

    In this popular testing method, security professionals create a checklist based on potential attack methods, and then suggest ways to mitigate them. This ensures the security of systems by offering analysis of necessary security controls.

    IoT penetration testing

    This type of security testing finds and exploits security vulnerabilities in IoT devices. IoT penetration testing is used to check the security of real-world IoT devices, including the entire ecosystem, not just the device itself.

    Incorporating these testing methods is essential to help identify and mitigate system vulnerabilities. Being proactive and addressing these potential security threats can help businesses maintain secure IoT infrastructure, enhancing operational efficiency and data protection.

    Training and educating your workforce

    Employees can be an entry point for network threats in the workplace.

    The days when BYOD (bring your own device) meant just an employee’s laptop, tablet and smartphone in the office are long gone. Now, personal IoT devices are also used in the workplace. Think of popular wearables like smartwatches, fitness trackers, e-readers and portable game consoles. Even appliances like smart printers and smart coffee makers are increasingly common in office spaces.

    Example of increasing IoT devices in the office. Source: House of IT

    The variety of IoT devices spread across your business network makes it a highly vulnerable target for cybercrime, via techniques such as phishing, credential hacking and malware.

    Phishing attempts are among the most common. Even the most ‘tech-savvy’ person can fall victim to them. Attackers are skilled at making phishing emails seem legitimate, forging real domains and email addresses to appear like a legitimate business.

    Malware is another popular technique, concealed in email attachments sometimes disguised as Microsoft Office documents that look unassuming to the recipient.

    Remote working and IoT security

    Threat actors are increasingly targeting remote workers. Research by Global Newswire shows that remote working increases the frequency of cyber attacks by a staggering 238%.

    Because remote employees house sensitive data on various IoT devices, training is even more important. Companies are increasingly moving to secure the personal IoT devices used for home working with the same high security as corporate devices.

    How are they doing this? With IoT management solutions, which provide visibility and control over these devices. Key players across the IoT landscape are creating increasingly sophisticated IoT management solutions, helping companies administer devices and push relevant updates remotely.

    The use of IoT devices is inevitable if your enterprise has a remote workforce.

    Regular remote updates for IoT devices are essential to ensure the software is up-to-date and patched. But even with these precautions, you should be aware of IoT device security risks and take steps to mitigate them.

    Importance of IoT training

    Getting employees involved in the security process encourages awareness and vigilance for protecting sensitive network data and devices.

    Comprehensive and regularly updated education and training are vital to prepare end-users for various security threats. Remember that a business network is only as secure as its least informed or untrained employee.

    Here are some key points employees need to know to maintain IoT security:

    • The best practices for security hygiene (for both personal and work devices and accounts).
    • Common and significant cybersecurity risks to your business.
    • The correct protocols to follow if they suspect they have fallen victim to an attack.
    • How to identify phishing, social engineering, domain spoofing, and other types of attacks.

    Investing the time and effort to ensure your employees are well informed and prepared for potential threats can significantly enhance your business’s overall IoT security standing.

    Disable unused features to ensure IoT security

    Enterprise IoT devices come with a range of functionalities. Take a smartwatch, for example. Its main purpose is of course to tell the time, but it might also include Bluetooth, Near-Field Communication (NFC) and voice activation. If you aren’t using these features, you’re leaving doors open for hackers to breach your device. Deactivating unused features reduces the risk of cyberattacks, as it limits the ways hackers can get in.

    Benefits of disabling unused features

    If these additional features are not being used, they can create unnecessary security vulnerabilities. Disabling unused features helps to ensure IoT security for businesses in several ways:

    1. Reduces attack surface : Unused features provide extra entry points for attackers. Disabling features limits the number of potential vulnerabilities that could be exploited, in turn reducing attacks overall.
    2. Minimises risk of exploits : Many IoT devices come with default settings that enable features which might not be necessary for business operations. Disabling these features minimises the risk of weak security.
    3. Improves performance and stability : Unused features can consume resources and affect the performance and stability of IoT devices. By disabling them, devices run more efficiently and are less likely to experience issues that could be exploited by attackers.
    4. Simplifies security management : Managing fewer active features simplifies security oversight. It becomes simpler to monitor and update any necessary features.
    5. Enhances regulatory compliance : Disabling unused features can help businesses meet regulatory requirements by ensuring that only the necessary and secure functionalities are active.

    To conclude

    The continued adoption of IoT is not stopping anytime soon. Neither are the possible risks. Implementing even some of the five tips we have highlighted can significantly mitigate the risks associated with the growing number of devices used for business operations.

    Ultimately, investing in your business’s IoT security is all about safeguarding the entire network, maintaining the continuity of day-to-day operations and preserving the reputation of your business. You can learn more about our current IoT offering by visiting our IoT page or contacting our team directly.

    The post Top 5 Tips to Ensure IoT Security for Your Business appeared first on Erlang Solutions .


      www.erlang-solutions.com/blog/top-5-tips-to-ensure-iot-security-for-your-business/


      Erlang Solutions: 10 Unusual Blockchain Use Cases

      news.movim.eu / PlanetJabber · Thursday, 6 June - 10:55 · 10 minutes

    When blockchain technology was first introduced with Bitcoin in 2009, no one could have foreseen its impact on the world or the unusual use cases that have emerged. Fast forward to now, and blockchain has become popular for its ability to ensure data integrity in transactions and smart contracts.

    Thanks to its cost-effectiveness, transparency, speed and top security, it has found its way into many industries, with blockchain spending expected to reach $19 billion this year.

    In this post, we will be looking into 10 use cases that have caught our attention, in industries benefiting from Blockchain in unusual and impressive ways.

    Ujo Music: Transforming payment for artists

    Let’s start exploring the first unusual use case for blockchain with Ujo Music.

    Ujo Music started with a mission to get artists paid fairly for their music, addressing the issues of inadequate royalties from streaming and complicated copyright laws.

    To solve this, they turned to blockchain technology, specifically Ethereum. Using it, Ujo Music created a community that allowed music owners to automatically receive royalty payments. Artists were also able to retain their rights thanks to smart contracts and cryptocurrencies. This approach allowed artists to access their earnings instantly, without the fees or wait times associated with more traditional systems.

    As previously mentioned, blockchain also allows for transparency and security, which is key in preventing theft and copyright infringement of the owner’s information. Ujo Music is transforming the payment landscape for artists in the digital age, allowing for better management of and rights over their music.

    CryptoKitties: Buying virtual cats and gaming

    For anyone looking to collect and breed digital cats in 2017, Cryptokitties was the place to be. While the idea of a cartoon crypto animation seems incredibly niche, the initial Cryptokitties craze is one that cannot be denied in the blockchain space.

    Upon its launch, it immediately went viral, with the alluring tagline “The world’s first Ethereum game.” According to nonfungible.com, the NFT felines saw sales volume spike from just 1,500 on launch day to 52,000 by the end of 2017.

    CryptoKitties was among the first projects to harness smart contracts by attaching code to data constructs called tokens on the Ethereum blockchain. Each chunk of the game’s code (which it refers to as a “gene”) describes the attributes of a digital cat. Players buy, collect, sell, and even breed new felines.

    [Image: CryptoKitties]

    Source: Dapper Labs

    Just like individual Ethereum tokens and bitcoins, the cat’s code also ensures that the token representing each cat is unique, which is where the nonfungible token, or NFT, comes in. A fungible good is, by definition, one that can be replaced by an identical item—one bitcoin is as good as any other bitcoin. An NFT, by contrast, has a unique code that applies to no other NFT.
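The fungible/non-fungible distinction above can be made concrete in a few lines. This is a toy Python sketch of the idea, not the actual ERC-721-style contract code CryptoKitties runs on Ethereum:

```python
import uuid

# Fungible tokens: only the balance matters; any unit is as good as any other.
balances = {"alice": 3, "bob": 1}

# Non-fungible tokens: each token is a distinct record with its own identifier.
nfts = {}

def mint_nft(owner: str, attributes: dict) -> str:
    token_id = uuid.uuid4().hex          # unique id -- no two tokens are alike
    nfts[token_id] = {"owner": owner, "attributes": attributes}
    return token_id

kitty = mint_nft("alice", {"fur": "tabby", "eyes": "green"})
# Transferring a coin moves an amount; transferring an NFT moves one specific token.
nfts[kitty]["owner"] = "bob"
```

Swapping two entries in `balances` changes nothing, while each entry in `nfts` is irreplaceable; that asymmetry is the whole point of an NFT.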

    Blockchain can be used in gaming in general by creating digital and analogue gaming experiences. By investing in CryptoKitties, players could invest, build and extend their gaming experience.

    ParagonCoin for Cannabis

    Our next unusual blockchain case stems from legal cannabis.

    Legal cannabis is a booming business, expected to be worth $1.2 billion by the end of 2024. With this amount of money, a cashless solution offers business owners further security. Transactions are easily trackable and offer transparency and accountability that traditional banking doesn’t.


    ParagonCoin business roadmap

    Transparency in the legal cannabis space is key for businesses looking to challenge its negative image. ParagonCoin, a cryptocurrency startup, had a unique value proposition for its entire ecosystem, making it clear that its business would be used for no illegal activity.

    Though now defunct, ParagonCoin was a pioneer in its field in utilising B2B payments. At the time of its launch, paying for services was only possible with cash, as cannabis-related businesses were not allowed to officially hold a bank account.

    This created a dire knock-on effect, making it difficult for businesses to pay solicitors, staff and other operational costs. The only ways to get an operation running would have been unsafe, inconvenient and possibly illegal. ParagonCoin remedied this by asking businesses to adopt a pseudo-random generator (PRG) payment system to address the immediate issues.

    Here are some other ways ParagonCoin adopted blockchain technology in the cannabis industry:

    • Regulatory compliance – Simplifying compliance issues on a local and federal level.
    • Secure transactions – Utilising smart contracts to automate and enforce agreement terms, reducing the risk of fraud.
    • Decentralised marketplace – Creating a platform for securely listing and reviewing products and services, while fostering a community of engaged users, businesses and regulators.
    • Innovative business models – Facilitating crowdfunding to transparently raise business capital.

    These cases highlight blockchain technologies’ ability to enhance transparency, compliance, and security, within even the most unexpected industries.

    Siemens partnership: Sharing solar power

    Siemens has partnered with the startup LO3 Energy on an app called Brooklyn Microgrid. This allows residents of Brooklyn who own solar panels to transfer their energy to others who don’t have this capability. Consumers and solar panel owners are in control of the entire transaction.

    Residents with solar panels sell excess energy back to their neighbours, in a peer-to-peer transaction. If you’d like to learn more about the importance of peer-to-peer (p2p) networks, you can check out our post about the Principles of Blockchain .

    Microgrids reduce the amount of energy that gets lost during transmission. It provides a more efficient alternative since approximately 5% of electricity generated in the US is lost in transit. The Brooklyn microgrid not only minimises these losses but also offers economic benefits to those who have installed solar panels, as well as the local community.

    Björn Borg and same-sex marriage

    Same-sex marriage is still banned in a majority of countries across the world. With that in mind, the Swedish sportswear brand Björn Borg devised an ingenious way for loved ones to enter matrimony on the blockchain, regardless of sexual orientation. But how?

    Blockchain is stereotypically linked with money, but remove those connotations and you have an effective ledger that can record events as well as transactions.

    Björn Borg has put this loophole to extremely good use by creating the digital platform Marriage Unblocked, where you can propose, marry and exchange vows, all on the blockchain. What’s more, the records can be kept anonymous, offering security for those in potential danger, and you get the flexibility of smart contracts.

    Of course, you can request a certificate to display proudly too!

    Whilst this doesn’t carry any legal weight, everything is produced and stored online. If religion or government isn’t a primary concern of yours, where’s the harm in a blockchain marriage?

    Tangle: Simplifying the Internet of Things (IoT)

    Blockchain offers ledgers that can record the huge amounts of data produced by IoT systems. Once again the upside is the level of transparency it offers that simply cannot be found in other services.

    The Internet of Things is one of the most exciting elements to come out of technology. The connected ecosystems can record and share various interactions. Blockchain lends itself perfectly to this, as it can transfer data and give identification for both public and private sector use cases. Here is an example:

    • Public sector – infrastructure management, taxes and other municipal services.
    • Private sector – logistical upgrades, warehouse tracking, greater efficiency and enhanced data capabilities.

    IOTA’s Tangle is a distributed ledger designed specifically for IoT, handling machine-to-machine micropayments. It has reengineered distributed ledger technology (DLT), enabling the secure exchange of both value and data.

    Tangle is the data structure behind micro-transaction crypto tokens that are purposely optimised and developed for IoT. It differs from other blockchains and cryptocurrencies by having a much lighter, more efficient way to deal with tens of billions of devices.

    It is built on a decentralised peer-to-peer network that relies on a directed acyclic graph (DAG), which creates a distributed ledger rather than “blocks”. There are no transaction fees, no mining and no external consensus process, and data can be transferred securely between digital devices.
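The Tangle's central idea, that each new transaction approves earlier transactions and so forms a DAG rather than a chain of blocks, can be sketched in a few lines. The tip-selection here is deliberately naive; IOTA's real algorithm is far more involved:

```python
import random

# Simplified Tangle-style DAG: every new transaction approves up to two earlier
# unapproved transactions ("tips"), so validation work is spread across
# participants instead of being batched into mined blocks.
dag = {"genesis": []}  # transaction id -> ids of the transactions it approves

def attach(tx_id: str) -> None:
    # Tips are transactions that no later transaction has approved yet.
    tips = [t for t in dag if not any(t in parents for parents in dag.values())]
    dag[tx_id] = random.sample(tips, k=min(2, len(tips)))  # naive tip selection

for i in range(5):
    attach(f"tx{i}")
```

Because approving others' transactions is the price of attaching your own, the network needs neither miners nor transaction fees, which is what makes the model attractive for machine-to-machine micropayments.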

    Walmart and IBM: Improving supply chains

    Blockchain’s real-time tracking is essential for any company with a significant number of supply chains.

    Walmart partnered with IBM to build a food traceability system on the Hyperledger Fabric blockchain, tracking foods from the supplier to the shop shelf. When a food-borne disease outbreak occurs, it can take weeks to find the source. Better traceability through blockchain saves time and lives, allowing companies to act fast and protect affected farms.

    Walmart chose blockchain technology as the best option for a decentralised food supply ecosystem.

    The food traceability system, initially piloted on two products, worked; Walmart can now trace the origin of over 25 products from five of its suppliers using this system.
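The tamper-evident property such a traceability system relies on can be sketched with a simple hash chain. This illustrates the principle only, and is an assumption-laden toy, not Hyperledger Fabric's actual data model:

```python
import hashlib
import json

# Minimal tamper-evident ledger: each entry's hash covers its data and the
# previous entry's hash, so altering any historical record breaks the chain.
def entry_hash(data: dict, prev_hash: str) -> str:
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

ledger = []
prev = "0" * 64
for event in [
    {"stage": "farm", "item": "mangoes", "lot": "A1"},       # hypothetical events
    {"stage": "processor", "item": "mangoes", "lot": "A1"},
    {"stage": "store", "item": "mangoes", "lot": "A1"},
]:
    prev = entry_hash(event, prev)
    ledger.append({"event": event, "hash": prev})

def verify(ledger: list) -> bool:
    """Walk the chain; any edited record invalidates every later hash."""
    prev = "0" * 64
    for rec in ledger:
        if entry_hash(rec["event"], prev) != rec["hash"]:
            return False
        prev = rec["hash"]
    return True
```

Tracing a contaminated lot then reduces to walking a chain whose integrity every participant can check independently, which is why outbreaks can be traced in seconds instead of weeks.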

    Agora for elections and voter fraud

    Voting on a blockchain offers full transparency, and reduces the chance of voter fraud. A prime example of this is in Sierra Leone, which in 2018 became the first country to run a blockchain-based election, with 70% of the pollers using the technology to anonymously store votes in an immutable ledger.


    Sierra Leone results on the Agora blockchain

    These results were placed on Agora’s blockchain and, by allowing anyone to view them, the government aimed to build trust with its citizens. The platform also reduced the controversy and costs incurred with paper ballots.

    The result is a trustworthy, legitimate outcome that also limits hearsay from opposition voters and parties, especially in Sierra Leone, which has faced heavy corruption claims in the past.
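    The core idea here, votes stored in an append-only, tamper-evident ledger, can be sketched in Python as a simple hash chain (illustrative only, not Agora’s actual design):

    ```python
    import hashlib
    import json

    def add_vote(chain, ballot):
        """Append an anonymised ballot to an append-only hash chain: each
        entry commits to the previous entry's hash, so altering any past
        vote invalidates every hash that follows it."""
        prev = chain[-1]["hash"] if chain else "0" * 64
        entry = {"ballot": ballot, "prev": prev}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        chain.append(entry)

    def verify(chain):
        """Recompute every link; returns False if anything was altered."""
        prev = "0" * 64
        for e in chain:
            body = {"ballot": e["ballot"], "prev": e["prev"]}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev"] != prev or e["hash"] != digest:
                return False
            prev = e["hash"]
        return True

    votes = []
    add_vote(votes, "candidate-A")
    add_vote(votes, "candidate-B")
    ok_before = verify(votes)
    votes[0]["ballot"] = "candidate-C"   # simulate tampering
    ok_after = verify(votes)
    ```

    Anyone holding a copy of the chain can run the same verification, which is what makes the published results auditable.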

    MedRec and Dentacoin – Healthcare

    With the emphasis on keeping many records in a secure manner, blockchain lends itself nicely to medical records and healthcare.

    MedRec is one business using blockchain to keep medical records secure, via a decentralised content management system and smart contracts. This also allows transparency of data and the ability to make secure payments connected to your health. Blockchain can be used to track dental care in much the same way.

    One example is Dentacoin, an ERC-20 token on Ethereum. It can be used for dental records, but also to verify that dental tools and materials are sourced appropriately and used on the correct patients, to let networks transfer information to each other quickly, and as a compliance tool.

    Everledger – Luxury items and art selling

    Blockchain’s ability to track data and transactions lends itself nicely to the world of luxury items.

    Everledger.io is a blockchain-based platform that enhances transparency and security in supply chain management. It’s particularly used for high-value assets such as diamonds, art, and fine wines.

    The platform uses blockchain technology to create a digital ledger that records the provenance and lifecycle of these assets, ensuring authenticity and preventing fraud. Through offering a tamper-proof digital ledger, Everledger allows stakeholders to trace the origin and ownership history of valuable assets, reducing the risk of fraud and enhancing overall market transparency.

    The diamond industry is a great use case of the Everledger platform.

    By recording each diamond’s unique attributes and history on an immutable blockchain, Everledger provides a secure and transparent way to verify the authenticity and ethical sourcing of diamonds. This helps in combating the circulation of conflict diamonds but also builds consumer trust by providing a verifiable digital record of each diamond’s journey from mine to market.

    To conclude

    While there is a buzz around blockchain, it’s important to note that the industry is well established, and these surprising use cases display the broad and exciting nature of the industry as a whole. There are other advantages to blockchain that we haven’t delved into in this article, but we’ve highlighted one of its greatest for businesses and consumers alike: its transparency.
    If you or your business are working on an unusual blockchain use case, let us know – we would love to hear about it! And if you are looking for reliable FinTech or blockchain experts, give us a shout; we offer many services to fix issues of scale.

    The post 10 Unusual Blockchain Use Cases appeared first on Erlang Solutions .

    • chevron_right

      ProcessOne: Understanding messaging protocols: XMPP and Matrix

      news.movim.eu / PlanetJabber · Thursday, 6 June - 08:04 · 5 minutes

    In the world of real-time communication, two prominent protocols often come into discussion: XMPP and Matrix. Both protocols aim to provide robust and secure messaging solutions, but they differ in architecture, features, and community adoption. This article delves into the key differences and similarities between XMPP and Matrix to help you understand which might be better suited for your needs.

    What is XMPP?

    Overview

    XMPP (Extensible Messaging and Presence Protocol) is an open-standard communication protocol originally developed for instant messaging (IM). It was designed as the Jabber protocol in 1999 to aggregate communication across a number of options, such as ICQ, Yahoo Messenger, and MSN. It was standardized by the IETF as RFC 3920 and RFC 3921 in 2004, and later revised as RFC 6120 and RFC 6121 in 2011.

    Key Features

    • Decentralized Architecture : XMPP operates on a decentralized network of servers. The protocol is said to be federated. The network of all interconnected XMPP servers is called the XMPP federation.
    • Extensibility : The protocol is highly extensible through XMPP Extension Protocols (XEPs). There are currently more than 400 extensions covering a broad range of use cases like social networking and Internet of Things features through PubSub extensions, Groupchat (aka MUC, Multi-user chat), and VoIP with the Jingle protocol.
    • Security : Supports TLS for encryption and SASL for authentication. End-to-end encryption is available through the OMEMO extension.
    • Interoperability : Widely adopted with numerous clients and servers available.
    • Gateways : Built-in support for gateways to other protocols, allowing for communication across different messaging systems.

    Network Protocol Design

    • TCP-Level Stream Protocol : XMPP is based on a TCP-level stream protocol using XML and namespaces, which provides extensibility while maintaining schema consistency. It can also run on top of other protocols such as WebSocket or HTTP through the concept of binding.
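    To make this concrete, here is what a minimal XMPP message stanza looks like on the wire, parsed with Python’s standard library (the addresses are made up for illustration):

    ```python
    import xml.etree.ElementTree as ET

    # A minimal XMPP message stanza, as it would travel inside the stream.
    # 'jabber:client' is the standard client namespace.
    stanza = """\
    <message xmlns="jabber:client"
             from="alice@example.com/laptop"
             to="bob@example.org"
             type="chat">
      <body>Hello over XMPP</body>
    </message>
    """

    msg = ET.fromstring(stanza)
    ns = "{jabber:client}"
    body = msg.find(f"{ns}body").text   # extract the message text
    ```

    Because stanzas are plain namespaced XML, custom extensions simply add elements under their own namespace without breaking existing clients.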

    Use Cases

    • Instant messaging
    • Presence information
    • Multi-user chat (MUC)
    • Social networks
    • Voice and video calls (with extensions)
    • Internet of Things
    • Massive-scale messaging (platforms like WhatsApp)

    What is Matrix?

    Overview

    Matrix is an open standard protocol for real-time communication, designed to provide interoperability between different messaging systems. It was introduced in 2014 by the Matrix.org Foundation.

    Key Features

    • Decentralized Architecture : Like XMPP, Matrix is also decentralized and supports a federated model.
    • Event-Based Model : Uses an event-based architecture where all communications are stored in a distributed database. The conversations are replicated on all servers in the federation that participate in the discussion.
    • End-to-End Encryption : Built-in end-to-end encryption using the Olm and Megolm libraries.
    • Bridging : Strong focus on bridging to other communication systems like Slack, IRC, and XMPP.

    Network Protocol Design

    • HTTP-Based Protocol : Matrix uses HTTP for communication and JSON for its data structure, making it suitable for web environments and easy to integrate with web technologies.
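    As a rough sketch of what that looks like in practice, the snippet below constructs (but does not send) a message request against the Matrix client-server API; the homeserver and room id are hypothetical placeholders:

    ```python
    import json
    from urllib.parse import quote

    homeserver = "https://matrix.example.org"   # hypothetical homeserver
    room_id = "!roomid:example.org"             # hypothetical room id
    txn_id = "1"                                # client-chosen transaction id

    # In the Matrix client-server API, a text message is a JSON event
    # sent via HTTP PUT to the room's /send endpoint.
    url = (f"{homeserver}/_matrix/client/v3/rooms/"
           f"{quote(room_id, safe='')}/send/m.room.message/{txn_id}")
    event = {"msgtype": "m.text", "body": "Hello from Matrix"}
    payload = json.dumps(event)
    ```

    Everything being plain HTTP and JSON is what makes Matrix straightforward to integrate into web applications.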

    Use Cases

    • Instant messaging
    • VoIP and video conferencing
    • Bridging different chat systems

    Detailed Comparison

    Architecture

    • XMPP : Uses a federated model to build a network of communication that works for both messaging and social networking. The content is not duplicated by default.
    • Matrix : Uses a federated model where each server stores a complete history of conversations, allowing for decentralized control and redundancy.

    XMPP is built around an event-based architecture to reach the largest possible scale. Matrix is built around a distributed model that may be more appealing to smaller community servers. As the conversations are distributed, it can cope more easily with servers suffering from frequent disconnections in the federated network.

    Extensibility

    • XMPP : Extensible through XEPs that are standardized by the XMPP Standards Foundation, allowing for a wide variety of additional features. As the protocol is based on XML, it can also be extended for custom client features, using your own namespace. The XML schema can be used to define your extension data structure.
    • Matrix : Extensible through modules and APIs, with a strong focus on bridging to other protocols. It is extensible as well and allows custom events and custom properties.

    Security

    • XMPP : Supports TLS for secure communication and SASL for authentication. End-to-end encryption is available through extensions like OMEMO.
    • Matrix : Supports TLS for secure communication. Built-in end-to-end encryption using Olm and Megolm, providing robust security out of the box.

    Both end-to-end encryption approaches are similar, as they are both based on the same double ratchet encryption algorithm made popular by the Signal messaging platform.

    Interoperability

    • XMPP : Known for its interoperability due to its long-standing presence and wide adoption. Includes built-in support for gateways to other protocols.
    • Matrix : Designed with interoperability in mind, with native support for bridging to other protocols. Its gateways are more recent; they could be ported to work on both protocols (which would be neat).

    Scalability

    • XMPP : By design, XMPP has an edge in terms of scalability. XMPP is event-based and works as a broadcast hub for messages, making it efficient in handling a large number of concurrent users. It is proven to sustain millions of concurrent users.
    • Matrix : Matrix maps conversations to documents that are replicated across servers involved in the discussion. This means the document state needs to be merged and reconciled for each new posted message, which incurs significant overhead in terms of processing power, memory, and storage. Its use case is mainly “organization level” chat, supporting thousands of users, not millions.

    Community and Adoption

    • XMPP : Established and widely adopted, with a large number of client and server implementations. This can be seen as a drawback, leading to an intimidating choice of tools. However, it has proven to be a strength: many competing implementations have shown themselves to be interoperable, which validates the robustness of the protocol. The protocol was initially developed by Jeremie Miller, who co-founded Jabber, Inc. to support the first server; the company was later acquired by Cisco. XMPP is now an Internet Engineering Task Force standard used for massive-scale deployments, driven by the non-profit XMPP Standards Foundation.
    • Matrix : Rapidly growing community with increasing adoption, particularly in open-source projects and decentralized applications. The main implementation is developed by Element, the company founded to grow the Matrix protocol.

    Conclusion

    Both XMPP and Matrix offer robust solutions for real-time communication with their own strengths. XMPP’s long history, extensibility, and efficient scalability make it a reliable choice for traditional instant messaging and presence-based applications, but also social networks, Internet of Things, and workflows that mix human users and devices. On the other hand, Matrix’s architecture, built-in end-to-end encryption, and focus on gateway development make it an excellent choice for those looking to integrate multiple communication systems or require secure corporate messaging through the Element client.

    Using a server like ejabberd is a future-proof approach, as it is multiprotocol by design. ejabberd supports XMPP, MQTT, and SIP, can act as a VoIP and video call proxy (STUN/TURN), and can federate with the Matrix network. It is likely to also support the Matrix client protocol, in beta, in the near future.

    Choosing between XMPP and Matrix depends largely on your specific needs, existing infrastructure, and future scalability requirements. Both protocols continue to evolve, offering exciting possibilities for real-time communication.


    Mistakes? If you spot a mistake, please reach out to share it! Thanks! I would like this document to be as accurate as possible.

    The post Understanding messaging protocols: XMPP and Matrix first appeared on ProcessOne .
    • chevron_right

      Erlang Solutions: 7 Key Blockchain Principles for Business

      news.movim.eu / PlanetJabber · Thursday, 30 May - 09:46 · 15 minutes

    Welcome to the final instalment of our blockchain series. Here, we take a look at the seven fundamental principles that make blockchain work: immutability, decentralisation, ‘workable’ consensus, distribution and resilience, transactional automation (including ‘smart contracts’), transparency and trust, and links to the external world.

    For business leaders, understanding these core principles is crucial in harnessing the potential for building trust, spearheading innovation and driving overall business efficiency.

    If you missed the previous blog, feel free to learn all about the strengths of Erlang and Elixir in blockchain here .

    Now let’s discuss how these seven principles can be leveraged to transform business operations.

    Understanding the Core Concepts

    In a survey conducted by EY , over a third (38%) of US workers surveyed said that blockchain technology is widely used within their businesses. A further 44% said the tech would be widely used within three years, and 18% reported that widespread use in their business was still a few years away.

    To increase the adoption of blockchain, it is key to understand its principles, how it operates, and the advantages it offers across various industries, such as financial services, retail, advertising and marketing, and digital health.

    Immutability

    In an ideal world, we would want to keep an accurate record of events and make sure it doesn’t degrade over time due to natural events, human error, or fraud. While physical items can change over time, digital information can be continuously corrected to prevent deterioration.

    Implementing an immutable blockchain aims to maintain a digital history that remains unaltered over time. This is especially useful for businesses when it comes to assessing the ownership or the authenticity of an asset or to validate one or more transactions. In the context of legalities and business regulation, having an immutable record of transactions is key as this can save time and resources by streamlining these processes.

    In a well-designed blockchain, data is encoded using hashing algorithms. This ensures that only those with sufficient information can verify a transaction. This is typically implemented on top of Merkle trees, where hashes of combined hashes are calculated.


    Merkle tree or hash tree
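    A minimal Merkle root computation in Python illustrates the idea of “hashes of combined hashes” (a sketch of the general technique, not any particular blockchain’s exact scheme):

    ```python
    import hashlib

    def sha256(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(leaves):
        """Compute a Merkle root: hash each leaf, then repeatedly hash
        pairs of hashes until a single root remains. Changing any leaf
        changes the root, which is what makes verification cheap."""
        level = [sha256(leaf) for leaf in leaves]
        while len(level) > 1:
            if len(level) % 2:           # duplicate the last hash if odd
                level.append(level[-1])
            level = [sha256(level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0].hex()

    txs = [b"alice->bob:5", b"bob->carol:2", b"carol->dave:1"]
    root = merkle_root(txs)
    ```

    A verifier who knows the root only needs the hashes along one branch, not the whole data set, to check that a given transaction is included.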


    Challenges raised by business leaders

    Legitimate questions can be raised by business leaders about storing an immutable data structure:

    • Scalability: How is the increasing volume of data handled once it surpasses ledger capacities?
    • Impact of decentralisation: What effect does growing data history and validation complexity have on decentralisation and participant engagement?
    • Performance verification: How does verification degrade as data history expands, particularly during peak usage?
    • Risk mitigation: How can we ensure consensus and prevent fragmented networks or unauthorised forks in transaction history?

    Businesses face challenges in managing growing data, maintaining decentralisation, verifying transactions, and preventing risks in immutable data storage. Meeting regulations also adds complexity, and decisions about what data to store must take sensitivity into account.

    Addressing regulatory challenges

    Compliance with GDPR introduces challenges, especially concerning the “right to be forgotten”. This matters because fines for non-compliance can be severe. The solutions introduced so far effectively aim to anonymise the information that enters the immutable on-chain storage, while sensitive information is stored separately in supporting databases from which it can be deleted if required.

    The challenge lies in determining upfront what information is considered sensitive and suitable for inclusion in the immutable record. A wrong choice has the potential to backfire at a later stage if any involved actor manages to extract or trace sensitive information through the immutable history.

    Immutability in blockchain technology provides a solution to preserving accurate historical records, ensuring the authenticity and ownership of assets, streamlining transaction validation, and saving businesses time and resources. But it also has its challenges, such as managing data volumes, maintaining decentralisation, and ensuring it is complying with regulations, for example, GDPR. Despite these challenges, businesses can leverage immutable blockchain technology to modernise record-keeping practices and uphold the integrity of their operations.

    Decentralisation of control

    Remember the 2008 financial crash? One reaction to that crisis was a backlash against over-centralisation.

    In response to the movement towards decentralisation, businesses have acknowledged the potential for innovation and adaptation. Embracing decentralisation not only aligns with consumer values of independence and democratic fairness, but it also presents opportunities for businesses to explore new markets and develop innovative products and services, as well as implement decentralised governance models within their own organisations.

    Use cases for decentralisation

    There are many ways in which businesses can leverage blockchain technology in order to embrace decentralisation and unlock new growth opportunities:

    Decentralised finance (DeFi): DeFi platforms leverage blockchain technology to provide financial services without the need for intermediaries, such as banks or brokerages.

    Supply chain management: By recording every transaction on a blockchain ledger, businesses can track the movement of goods from the point of origin to the end consumer.

    Smart contracts: Automatically enforce and execute contractual agreements when predefined conditions are met, also without the need for intermediaries.

    Tokenisation of assets: Businesses can turn their assets into digital tokens. This helps split ownership into smaller parts, making it easier to buy and sell, and allowing direct trading between people without intermediaries.

    Identity management: Blockchain-based identity management systems offer secure and decentralised solutions. Businesses can use blockchain to verify the identity of customers, employees, and partners while giving people greater control over their data.

    Data management and monetisation: Blockchain allows for businesses to securely manage and monetise data by giving individuals control over their data, facilitating direct transactions between data owners and consumers.

    Further considerations of decentralisation

    With full decentralisation, there is no central authority to resolve potential transactional issues. Traditional, centralised systems have well-developed anti-fraud and asset recovery mechanisms which people have become used to.

    Using new, decentralised technology places a far greater responsibility on the user if they are to receive all of the benefits of the technology, forcing them to take additional precautions when it comes to handling and storing their digital assets.

    There is no point in having an ultra-secure blockchain if one then hands over one’s wallet private key to an intermediary whose security is lax: it’s like having the most secure safe in the world and then writing the combination on a whiteboard in the same room.

    Decentralisation, security, and usability

    For businesses, embracing decentralisation unlocks new opportunities while posing challenges in security and usability. Balancing these factors is key as businesses continue to navigate decentralised technologies, shaping the future of commerce and industry.

    Businesses must consider whether the increased level of personal responsibility associated with secure blockchain implementation is a price users are willing to pay, or if they will trade off some security for ease of use and potentially more centralisation.

    Workable Consensus

    As businesses increasingly push towards decentralised forms of control and responsibility, a fundamental requirement has come to light: validating transactions without a central authority, known as the ‘consensus’ problem. The blockchain industry has seen various approaches emerge to address this, some competing and others complementing each other.

    There’s been a lot of attention on governance in blockchain ecosystems. This involves regulating how quickly new blocks are added to the chain and the rewards for miners (especially in proof-of-work blockchains). Overall, it’s crucial to set up incentives and deterrents so that everyone involved helps the chain grow healthily.

    Besides serving as an economic deterrent against denial-of-service and spam attacks, Proof of Work (POW) approaches were among the first attempts to automatically work out, via the use of computational power, which ledgers/actors have the authority to create/mine new blocks. Similar approaches (proof of space, proof of bandwidth, etc.) have followed, but all of them are vulnerable to deviations from the intended fair distribution of control.


    Proof of work algorithm
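    A toy proof-of-work loop in Python shows the mechanism: finding the nonce is expensive, but verifying it is trivial (the leading-zeros difficulty rule here is an illustrative simplification):

    ```python
    import hashlib

    def mine(block_data: str, difficulty: int = 4):
        """Search for a nonce whose SHA-256 hash has `difficulty`
        leading hex zeros. The search is costly; checking the result
        takes a single hash."""
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(
                f"{block_data}:{nonce}".encode()
            ).hexdigest()
            if digest.startswith(target):
                return nonce, digest
            nonce += 1

    nonce, digest = mine("block-42", difficulty=4)
    ```

    Each extra zero of difficulty multiplies the expected work by sixteen, which is how real networks tune block production rates.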

    How do these methods play out for businesses? Well-resourced actors can gain an edge by purchasing powerful hardware in bulk and running it in areas with cheaper electricity. This lets them outpace competitors in mining new blocks and gain control, ultimately centralising authority.

    In response to the challenges brought on by centralised control and environmental concerns associated with traditional mining methods, alternative approaches such as Proof of Stake (POS) and Proof of Importance (POI) have emerged. These methods remove the focus from computing resources and tie authority to accumulated digital asset wealth or participant productivity. However, implementing POS and POI while mitigating the risk of power and wealth concentration could present significant challenges for developers and business leaders alike.

    Distribution and resilience

    Apart from decentralising authority, control and governance, blockchain solutions typically embrace a distributed peer-to-peer (P2P) design paradigm.

    This preference is motivated by the inherent resilience and flexibility that these types of networks have introduced and demonstrated, particularly in the context of file and data sharing. A centralised network, typical of mainframes and centralised services is exposed to a ‘single point of failure’ vulnerability as the operations are always routed towards a central node.

    If the central node breaks down or is congested, all the other nodes will be affected by disruptions. In a business context, decentralised and distributed networks attempt to reduce the detrimental effects that issues occurring on a node might trigger on other nodes. In a decentralised network, the failure of a node can still affect several neighbouring nodes that rely on it to carry out their operations. In a distributed network the idea is that the failure of a single node should not impact significantly any other node. Even when one preferential/optimal route in the network becomes congested or breaks down entirely, a message can still reach the destination via an alternative route.

    This greatly increases the chance of keeping a service available in the event of failure or malicious attacks such as a denial of service (DOS) attack. Blockchain networks with a distributed ledger redundancy are known for their resilience against hacking, especially when it comes to very large networks, such as Bitcoin. In such a highly distributed network, the resources needed to generate a significant disruption are very high, which not only delivers on the resilience requirement but also works as a deterrent against malicious attacks (mainly because the cost of conducting a successful malicious attack becomes prohibitive).

    Although a distributed topology can provide an effective response to failures or traffic spikes, businesses need to be aware that delivering resilience against prolonged over-capacity demands or malicious attacks requires adequate adapting mechanisms. While the Bitcoin network is well positioned, as it currently benefits from a high capacity condition (due to the historically high incentive to purchase hardware by third-party miners), this is not the case for other emerging networks as they grow in popularity. This is where novel instruments, capable of delivering preemptive adaptation combined with back pressure throttling applied to the P2P level, can be of great value.

    Distributed systems are not new and, whilst they provide highly robust solutions to many enterprise and governmental problems, they are subject to the laws of physics and require their architects to consider the trade-offs that need to be made in their design and implementation (e.g. consistency vs availability).

    Automation

    A high degree of automation is required for businesses to sustain a coherent, fair and consistent blockchain and surrounding ecosystem. Existing areas with a high demand for automation include those common to most distributed systems. For example; deployment, elastic topologies, monitoring, recovery from anomalies, testing, continuous integration, and continuous delivery.

    For blockchains, these represent well-established IT engineering practices. Additionally, there is a creative R&D effort to automate the interactions required to handle assets, computational resources and users across a range of new problem spaces (e.g. logistics, digital asset creation and trading).

    Transactional operations are increasingly handled by scripts rather than direct social interactions. This is where smart contracts and constrained virtual machine (VM) interpreters have emerged – an effort pioneered by the Ethereum project.

    Many blockchain enthusiasts are drawn to the ability to set up asset exchanges, specifying conditions and actions triggered by certain events. Smart contracts find various applications in lotteries, digital asset trading, and derivative trading. However, despite the exciting potential of smart contracts, getting involved in this area requires a significant level of expertise. Only skilled developers who are willing to invest time in learning Domain Specific Languages (DSL) can create and modify these contracts.

    The challenge is to respond to safety and security concerns when smart contracts are applied to edge case scenarios that deviate from the ‘happy path’. If badly designed contracts cannot properly roll back or undo a miscarried transaction, their execution might lead to assets being lost or erroneously handed over to unwanted receivers.
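    To illustrate why an explicit rollback path matters, here is a toy escrow “contract” in Python; this is a plain sketch of the logic, not a real on-chain smart contract:

    ```python
    class Escrow:
        """Toy escrow 'contract': funds are locked until a condition is
        met, with an explicit refund path so a failed deal cannot
        strand the assets."""
        def __init__(self, buyer, seller, amount):
            self.buyer, self.seller, self.amount = buyer, seller, amount
            self.state = "locked"

        def confirm_delivery(self):
            if self.state != "locked":
                raise RuntimeError("already settled")
            self.state = "released"
            return (self.seller, self.amount)   # pay the seller

        def refund(self):
            if self.state != "locked":
                raise RuntimeError("already settled")
            self.state = "refunded"
            return (self.buyer, self.amount)    # roll back to the buyer

    deal = Escrow("alice", "bob", 100)
    payee, amount = deal.refund()               # delivery failed: roll back
    ```

    A contract missing the `refund` branch would leave the 100 units locked forever on failure, which is exactly the edge-case risk described above.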

    Automation and governance

    Another area in high need of automation is governance. Any blockchain ecosystem of users and computing resources requires periodic configurations of the parameters to carry on operating coherently and consensually. This results in a complex exercise of tuning for incentives and deterrents to guarantee the fulfilment of ambitious collaborative and decentralised goals. The newly emerging field of ‘blockchain economics’ (combining economics; game theory; social science and other disciplines) remains in its infancy.

    The removal of a central ruling authority produces a vacuum that needs to be filled by an adequate decision-making body, typically supplied with automation that maintains a combination of static and dynamic configuration settings. The consensus solutions referred to earlier, which use computational resources or stakeable assets to assign authority not only to produce blocks but also to steer the variable part of governance, have succeeded in filling the decision-making gap in a fair and automated way. Subsequently, however, the exploitation of flaws in the static element of governance has hindered the success of these models. This has contributed to the rise in popularity of curated approaches such as POA or DPOS, which not only bring back centralised control but also reduce the automation of governance.

    This is a major area of evolution in blockchain, where we expect to see widespread market adoption.

    Transparency and trust

    For businesses to produce the desired audience engagement for blockchain and eventual mass adoption and success, consensus and governance mechanisms need to operate transparently. Users need to know who has access to what data so that they can decide what can be stored and possibly shared on-chain. These are the contractual terms by which users agree to share their data. As previously discussed, users might be required to exercise the right for their data to be deleted, which typically is a feature delivered via auxiliary, ‘off-chain’ databases. In contrast, only hashed information, effectively devoid of its meaning, is preserved permanently on-chain.

    Given the immutable nature of the chain history, it is important to decide upfront what data should be permanently written on-chain and what gets written off-chain. The users should be made aware of what data gets stored on-chain and with whom it could potentially be shared. Changing access to on-chain data or deleting it goes against the fundamentals of immutability and therefore is almost impossible. Getting that decision wrong at the outset can significantly affect the cost and usability (and therefore likely adoption) of the particular blockchain in question.

    Besides transparency, trust is another critical feature that users and customers legitimately seek. Trust has to go beyond the scope of the people involved as systems need to be trusted as well. Every static element, such as an encryption algorithm, the dependency on a library, or a fixed configuration, is potentially exposed to vulnerabilities.

    Link to the external world

    The attractive features that blockchain has brought to the internet market would be limited to handling digital assets unless there were a way to link information to the real world. Embracing blockchain solely within digital boundaries may diminish its appeal, as businesses seek solutions that integrate seamlessly with the analogue realities of our lives.

    Technologies used to overcome these limitations include cyber-physical devices such as sensors for input and robotic activators for output, and in most circumstances, people and organisations. As we read through most blockchain white papers, we occasionally come across the notion of the Oracle, which in short, is a way to name an input coming from a trusted external source that could potentially trigger/activate a sequence of transactions in a Smart Contract or which can otherwise be used to validate some information that cannot be validated within the blockchain itself.


    Blockchain oracles connecting blockchains to inputs and outputs
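    The oracle pattern can be sketched in Python; all names and values below are illustrative (for example, a crop-insurance contract triggered by a rainfall feed):

    ```python
    def weather_oracle():
        """Stand-in for a trusted external data source (e.g. a weather
        station sensor or a signed API feed)."""
        return {"station": "FARM-1", "rainfall_mm": 3}

    def crop_insurance(policy, reading):
        """Contract-like function: pay out only if the oracle reports
        rainfall below the policy's threshold."""
        if reading["rainfall_mm"] < policy["threshold_mm"]:
            return {"pay_to": policy["holder"], "amount": policy["payout"]}
        return None   # condition not met: no transaction triggered

    policy = {"holder": "farmer-1", "threshold_mm": 10, "payout": 500}
    payout = crop_insurance(policy, weather_oracle())
    ```

    The hard part in real systems is not this logic but establishing why the oracle itself should be trusted, since the blockchain cannot validate external facts on its own.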

    Bitcoin and Ethereum, still the two dominant projects in the blockchain space, are viewed by many investors as an opportunity to diversify a portfolio or speculate on the value of their respective cryptocurrencies. The same applies to a wide range of other cryptocurrencies, except fiat-pegged currencies, most notably Tether, whose value is effectively bound to the US dollar. Conversions from one cryptocurrency to another and to/from fiat currencies are normally operated by exchanges on behalf of an investor. These are again peripheral services that serve as a link to the external physical world. For businesses, these exchanges provide crucial services that facilitate investment and trading activities, contributing to the broader ecosystem of blockchain-based assets.

    Besides oracles and cyber-physical links, interest is emerging in linking smart contracts together to deliver a comprehensive solution. Contracts could indeed operate in a cross-chain scenario to offer interoperability among a variety of digital assets and protocols. Although attempts to combine different protocols and approaches have emerged, this is still an area where further R&D is necessary to provide enough instruments and guarantees to developers and entrepreneurs. The challenge is to deliver cross-chain functionalities without the support of a central governing agency/body.

    To conclude

    As we’ve highlighted throughout the series, blockchain provides real transformative potential across varying business industries. For a business to truly leverage this technology, the fundamentals we have highlighted must be understood to navigate the complexities of blockchain adoption successfully.

    If you want to start a conversation with the team, feel free to drop us a line.

    The post 7 Key Blockchain Principles for Business appeared first on Erlang Solutions.


      www.erlang-solutions.com /blog/7-key-blockchain-principles-for-business/


      Erlang Solutions: Blockchain Tech Deep Dive 2/4 | Myths vs. Realities

      news.movim.eu / PlanetJabber · Thursday, 30 May - 09:06 · 15 minutes

    This is the second part of our ‘Making Sense of Blockchain’ blog post series – you can read part 1 on ‘6 Blockchain Principles’ here. This article is based on the original post by Dominic Perini here.

    Join our FinTech mailing list for more great content, industry news and events; sign up here >>

    With so much hype surrounding blockchain, we separate the reality from the myths to ensure delivery of the ROI and competitive advantage that you need.
    It’s not our aim here to discuss the data structure of blockchain itself, issues such as transactions per second (TPS), or questions like ‘what’s the best Merkle tree solution to adopt?’. Instead, we shall examine the state of maturity of blockchain technology and its alignment with the core principles that underpin a distributed ledger ecosystem.

    Blockchain technology aims to embrace the following high-level principles:

    7 founding principles of blockchain

    • Immutability
    • Decentralisation
    • ‘Workable’ consensus
    • Distribution and resilience
    • Transactional automation (including ‘smart contracts’)
    • Transparency and Trust
    • A link to the external world

    Immutability of history

    In an ideal world, it would be desirable to preserve an accurate historical trace of events and to make sure this trace does not deteriorate over time, whether through natural events, human error or the intervention of fraudulent actors. Artefacts produced in the analogue world inevitably face alteration, while in the digital world the quantised/binary nature of stored information allows continuous corrections that prevent such deterioration.

    Writing an immutable blockchain aims to retain a digital history that cannot be altered over time. This is particularly useful when it comes to assessing the ownership or the authenticity of an asset or to validate one or more transactions.

    We should note that, on top of the inherent immutability of a well-designed and implemented blockchain, hashing algorithms provide a means to encode the information that gets written into the history, so that a trace/transaction can only be verified by actors possessing sufficient data to compute the one-way cascaded encoding/encryption. This is typically implemented on top of Merkle trees, where hashes of concatenated hashes are computed.
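    As a sketch of the ‘hashes of concatenated hashes’ idea, the following minimal Python example (illustrative only, not any particular chain’s wire format) computes a Merkle root over a handful of toy transactions; changing any leaf changes the root, which is what makes tampering detectable.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute a Merkle root by repeatedly hashing concatenated pairs."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [sha256(left + right) for left, right in zip(level[::2], level[1::2])]
    return level[0]

txs = [b"alice->bob:5", b"bob->carol:2", b"carol->dave:1"]
root = merkle_root(txs)
# Altering any single transaction yields an entirely different root
assert merkle_root([b"alice->bob:6", b"bob->carol:2", b"carol->dave:1"]) != root
```

    To verify a single transaction, a participant only needs the sibling hashes along its path to the root, not the whole data set; that is the property the paragraph above refers to.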

    Legitimate questions can be raised about the guarantees for indefinitely storing an immutable data structure:

    • If this is an indefinitely growing history, where can it be stored once it grows beyond the capacity of the ledgers?
    • As the history size grows (and/or the computing power needed to validate further transactions increases) this reduces the number of potential participants in the ecosystem, leading to a de facto loss of decentralisation. At what point does this concentration of ‘power’ create concerns?
    • How does verification performance deteriorate as the history grows?
    • How does it deteriorate when a lot of data gets written on it concurrently by users?
    • How long is the segment of data that you replicate on each ledger node?
    • How much network traffic would such replication generate?
    • How much history is needed to be able to compute a new transaction?
    • What compromises need to be made on linearisation of the history, replication of the information, capacity to recover from anomalies and TPS throughput?


    Further to the above questions, how many replicas converging to a specific history (i.e. consensus) are needed for it to carry on existing? And in particular:

    • Can a fragmented network carry on writing to their known history?
    • Is an approach designed to ‘heal’ discrepancies in the immutable history of transactions by rewarding the longest fork fair and efficient?
    • Are the deterrents strong enough to prevent a group of ledgers forming their own fork that eventually reaches wider adoption?
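    To make the ‘longest fork wins’ healing rule concrete, here is a deliberately simplified Python sketch; real networks such as Bitcoin compare accumulated proof-of-work rather than raw chain length, so treat this purely as an illustration of the principle.

```python
def resolve_fork(forks: list[list[str]]) -> list[str]:
    """Toy longest-chain rule: every node adopts the longest known history.
    Real networks compare accumulated proof-of-work, not raw length."""
    return max(forks, key=len)

fork_a = ["genesis", "b1", "b2"]
fork_b = ["genesis", "b1", "b2'", "b3'"]   # a competing fork that grew longer
assert resolve_fork([fork_a, fork_b]) == fork_b
```

    Note how the blocks `b2` on fork A and `b2'` on fork B diverge after `b1`: once fork B wins, transactions recorded only in `b2` are effectively discarded, which is precisely why the fairness of this rule is debated.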


    Furthermore, the requirement to comply with the European General Data Protection Regulation (GDPR) and ‘the right to be forgotten’ introduces new challenges to the goal of keeping permanent, immutable traces indefinitely. This is important because fines for breaches of GDPR are potentially very severe. The solutions introduced so far effectively aim at anonymising the information that enters the immutable on-chain storage, while sensitive information is stored separately in supporting databases from which it can be deleted if required. None of these approaches has yet been tested by the courts.
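    The anonymisation pattern described above can be sketched as follows; the names `record` and `forget` are hypothetical, but the shape (only a hash on-chain, sensitive data off-chain and deletable) reflects the approach discussed.

```python
import hashlib

on_chain: list[str] = []          # immutable, append-only record of hashes
off_chain: dict[str, str] = {}    # mutable store holding the sensitive data

def record(sensitive: str) -> str:
    """Write only the hash to the immutable history; keep data off-chain."""
    digest = hashlib.sha256(sensitive.encode()).hexdigest()
    on_chain.append(digest)
    off_chain[digest] = sensitive
    return digest

def forget(digest: str) -> None:
    """Honour a deletion request: the on-chain hash remains, but is now meaningless."""
    off_chain.pop(digest, None)

d = record("alice@example.com")
forget(d)
assert d in on_chain and d not in off_chain
```

    The residual risk flagged in the next paragraph is visible here too: if an attacker can guess the original value, they can hash their guess and match it against the on-chain record, so deciding what counts as sensitive up front is critical.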

    The challenging aspect here is to decide upfront what is considered sensitive and what can safely be placed on the immutable history. A wrong choice can backfire at a later stage in the event that any involved actor manages to extract or trace sensitive information through the immutable history.

    Immutability represents one of the fundamental principles that motivate research into blockchain technology, both private and public. The solutions explored so far have managed to provide a satisfactory response to market needs via the introduction of history linearisation techniques, one-way hashing encryptions, Merkle trees and off-chain storage, although the linearity of the immutable history comes at a cost (notably transaction volume).

    Decentralisation of control

    One of the reactions following the 2008 global financial crisis was a push against over-centralisation. This led to the exploration of various decentralised mechanisms. The proposition that individuals should enjoy the freedom to be independent of a central authority gained in popularity. Self-determination, democratic fairness and heterogeneity as a form of wealth are among the dominant values broadly recognised in Western (and, increasingly, non-Western) society. These values lent weight to the view that introducing decentralisation into a system is positive.

    With full decentralisation, there is no central authority to resolve potential transactional issues for us. Traditional, centralised systems have well developed anti-fraud and asset recovery mechanisms which people have become used to. Using new, decentralised technology places a far greater responsibility on the user if they are to receive all of the benefits of the technology, forcing them to take additional precautions when it comes to handling and storing their digital assets.

    There’s no point having an ultra-secure blockchain if one then hands over one’s wallet private key to an intermediary whose security is lax: it’s like having the most secure safe in the world then writing the combination on a whiteboard in the same room.

    Is the increased level of personal responsibility that goes with the proper implementation of a secure blockchain a price that users are willing to pay? Or, will they trade off some security in exchange for ease of use (and, by definition, more centralisation)?

    Consensus

    The consistent push towards decentralised forms of control and responsibility has brought to light the fundamental requirement to validate transactions without a central authority, known as the ‘consensus’ problem. Several approaches have grown out of the blockchain industry, some competing and some complementary.

    There has also been a significant focus on the concept of governance within a blockchain ecosystem. This concerns the need to regulate the rates at which new blocks are added to the chain and the associated rewards for miners (in the case of blockchains using proof of work (POW) consensus methodologies). More generally, it is important to create incentives and deterrent mechanisms whereby interested actors contribute positively to the healthy continuation of chain growth.

    Besides serving as an economic deterrent against denial-of-service and spam attacks, POW approaches are amongst the first attempts to automatically work out, via the use of computational power, which ledgers/actors have the authority to create/mine new blocks. Other similar approaches (proof of space, proof of bandwidth, etc.) followed; however, they all suffer from exposure to deviations from the intended fair distribution of control. Wealthy participants can, in fact, exploit these approaches to gain an advantage by purchasing large quantities of high-performance (CPU/memory/network bandwidth) dedicated hardware and operating it in jurisdictions where electricity is relatively cheap. This lets them overtake the competition to obtain the reward and the authority to mine new blocks, which has the inherent effect of centralising control. In addition, the huge energy consumption that comes with the inefficient nature of the competitive race to mine new blocks in POW consensus mechanisms has raised concerns about its environmental impact and economic sustainability.
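    A toy illustration of the POW idea in Python: finding a valid nonce is expensive, while verifying one is cheap. The string format and difficulty measure below are simplifications for illustration, not Bitcoin’s actual block format.

```python
import hashlib

def mine(block: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash has `difficulty` leading zero hex digits.
    Each extra digit multiplies the expected work by 16."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

nonce = mine("block-data", 4)
digest = hashlib.sha256(f"block-data:{nonce}".encode()).hexdigest()
assert digest.startswith("0000")   # cheap to verify, costly to produce
```

    The asymmetry shown here is the economic deterrent: a spammer must pay the mining cost for every block, while any node can verify the result with a single hash.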

    Proof of Stake (POS) and Proof of Importance (POI) are among the ideas introduced to drive consensus via the use of more social parameters, rather than computing resources. These two approaches link the authority to the accumulated digital asset/currency wealth or the measured productivity of the involved participants. Implementing POS and POI mechanisms, whilst guarding against the concentration of power/wealth, poses not insubstantial challenges for their architects and developers.

    More recently, semi-automatic approaches, driven by a human-curated group of ledgers, are putting in place solutions to overcome the limitations and arguable fairness of the above strategies. The Delegated Proof of Stake (DPOS) and Proof of Authority (POA) methods promise higher throughput and lower energy consumption, while the human element can ensure a more adaptive and flexible response to potential deviations caused by malicious actors attempting to exploit a vulnerability in the system.

    Distribution and resilience

    Apart from decentralising authority, control and governance, blockchain solutions typically embrace a distributed peer-to-peer (P2P) design paradigm. This preference is motivated by the inherent resilience and flexibility that these types of networks have introduced and demonstrated, particularly in the context of file and data sharing.

    A centralised network, typical of mainframes and centralised services, is clearly exposed to a ‘single point of failure’ vulnerability, as operations are always routed through a central node. If the central node breaks down or becomes congested, all the other nodes are affected by the disruption.

    Decentralised and distributed networks attempt to reduce the detrimental effects that issues occurring on a node might trigger on other nodes. In a decentralised network, the failure of a node can still affect several neighbouring nodes that rely on it to carry out their operations. In a distributed network the idea is that the failure of a single node should not impact significantly any other node. In fact, even when one preferential/optimal route in the network becomes congested or breaks down entirely, a message can reach the destination via an alternative route. This greatly increases the chance of keeping a service available in the event of failure or malicious attacks such as a denial of service (DOS) attack.
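    The routing argument can be illustrated with a small sketch: in the toy topology below (entirely hypothetical), a message can still reach its destination when one node fails, unless the failed node happens to be the only route.

```python
from collections import deque

# A small distributed topology: most nodes have several peers, so there is
# no single point of failure between most pairs of nodes.
peers = {
    "a": {"b", "c"}, "b": {"a", "d"}, "c": {"a", "d"},
    "d": {"b", "c", "e"}, "e": {"d"},
}

def reachable(src: str, dst: str, failed: set[str]) -> bool:
    """Breadth-first search that routes around failed nodes."""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for peer in peers[node] - failed - seen:
            seen.add(peer)
            queue.append(peer)
    return False

assert reachable("a", "e", failed=set())      # normal operation
assert reachable("a", "e", failed={"b"})      # detour via c -> d
assert not reachable("a", "e", failed={"d"})  # d is the only route to e
```

    Node `e` shows the limit of the argument: it hangs off a single peer, so it effectively reintroduces a point of failure for itself; redundancy has to be designed in, not assumed.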

    Blockchain networks where a distributed topology is combined with a high redundancy of ledgers backing a history have occasionally been declared ‘unhackable’ by enthusiasts or, as some more prudent debaters say, ‘difficult to hack’. There is truth in this, especially when it comes to very large networks such as that of Bitcoin. In such a highly distributed network, the resources needed to generate a significant disruption are very high, which not only delivers on the resilience requirement but also works as a deterrent against malicious attacks (principally because the cost of conducting a successful malicious attack becomes prohibitive).

    Although a distributed topology can provide an effective response to failures or traffic spikes, you need to be aware that delivering resilience against prolonged over-capacity demands or malicious attacks requires adequate adaptive mechanisms. While the Bitcoin network is well positioned, as it currently benefits from a high-capacity condition (due to the historically high incentive for third-party miners to purchase hardware), this is not the case for other emerging networks as they grow in popularity. This is where novel instruments, capable of delivering preemptive adaptation combined with back-pressure throttling applied at the P2P level, can be of great value.

    Distributed systems are not new and, whilst they provide highly robust solutions to many enterprise and governmental problems, they are subject to the laws of physics and require their architects to consider the trade-offs that need to be made in their design and implementation (e.g. consistency vs availability).

    Automation

    In order to sustain a coherent, fair and consistent blockchain and surrounding ecosystem, a high degree of automation is required. Existing areas with a high demand for automation include those common to most distributed systems: for instance, deployment, elastic topologies, monitoring, recovery from anomalies, testing, continuous integration and continuous delivery. In the context of blockchains, these represent well-established IT engineering practices. Additionally, there is a creative R&D effort to automate the interactions required to handle assets, computational resources and users across a range of new problem spaces (e.g. logistics, digital asset creation and trading).

    Transactional interactions, in particular, have seen a significant shift towards scripted automation. This is where smart contracts and constrained virtual machine (VM) interpreters have emerged, an effort pioneered by the Ethereum project.

    The ability to define how an asset exchange operates, under which conditions and following which triggers, has attracted many blockchain enthusiasts. Some of the most common applications of smart contracts involve lotteries, trading of digital assets and derivatives. While there is clearly exciting potential unleashed by the introduction of smart contracts, it is also true that this is still an area with a high entry barrier. Only skilled developers who are willing to invest time in learning Domain Specific Languages (DSLs) have access to the actual creation and modification of these contracts.

    The challenge is to respond to safety and security concerns when smart contracts are applied to edge case scenarios that deviate from the ‘happy path’. If badly-designed contracts cannot properly rollback or undo a miscarried transaction, their execution might lead to assets being lost or erroneously handed over to unwanted receivers.
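    As a toy illustration of why explicit rollback matters, the sketch below (hypothetical Python, not a real smart-contract language) restores a snapshot of the state whenever a transfer fails part-way, so no funds are silently lost.

```python
class ToyContract:
    """A toy transfer that rolls back on any failure, so a miscarried
    transaction never leaves funds lost mid-way."""

    def __init__(self, balances: dict[str, int]):
        self.balances = balances

    def transfer(self, sender: str, receiver: str, amount: int) -> bool:
        snapshot = dict(self.balances)      # state to restore on failure
        try:
            self.balances[sender] -= amount
            if self.balances[sender] < 0:
                raise ValueError("insufficient funds")
            self.balances[receiver] += amount
            return True
        except (ValueError, KeyError):
            self.balances = snapshot        # roll back the partial update
            return False

c = ToyContract({"alice": 10, "bob": 0})
assert not c.transfer("alice", "nobody", 5)   # unknown receiver: rolled back
assert c.balances == {"alice": 10, "bob": 0}
assert c.transfer("alice", "bob", 5)
```

    Without the snapshot, the failed transfer to an unknown receiver would have debited alice and credited no one, which is exactly the ‘assets being lost’ failure mode described above.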

    Another area in high need of automation is governance. Any blockchain ecosystem of users and computing resources requires periodic configuration of its parameters to carry on operating coherently and consensually. This results in a complex exercise of tuning incentives and deterrents to guarantee the fulfilment of ambitious collaborative and decentralised goals. The newly emerging field of ‘blockchain economics’ (combining economics, game theory, social science and other disciplines) remains in its infancy.

    Clearly, the removal of a central ruling authority produces a vacuum that needs to be filled by an adequate decision-making body, which is typically supplied with automation that maintains a combination of static and dynamic configuration settings. Those consensus solutions referred to earlier, which use computational resources or stakeable social assets to assign the authority not only to produce blocks but also to steer the variable part of governance, have succeeded in filling the decision-making gap in a fair and automated way. Subsequently, however, the exploitation of flaws in the static element of governance has hindered the success of these models. This has contributed to the rise in popularity of curated approaches such as POA or DPOS, which not only bring back centralised control but also reduce the automation of governance.

    We expect this to be one of the major areas where blockchain has to evolve in order to succeed in getting widespread market adoption.

    Transparency and trust

    In order to produce the desired audience engagement for blockchain, and eventual mass adoption and success, consensus and governance mechanisms need to operate transparently. Users need to know who has access to what data so that they can decide what can be stored and possibly shared on-chain. These are the contractual terms by which users agree to share their data. As previously discussed, users might be required to exercise the right to have their data deleted, which is typically a feature delivered via auxiliary, ‘off-chain’ databases. In contrast, only hashed information, effectively devoid of its meaning, is preserved permanently on-chain.

    Given the immutable nature of the chain history, it is important to decide upfront what data should be permanently written on-chain and what gets written off-chain. The users should be made aware of what data gets stored on-chain and with whom it could potentially be shared. Changing access to on-chain data or deleting it goes against the fundamentals of immutability and therefore is almost impossible. Getting that decision wrong at the outset can significantly affect the cost and usability (and therefore likely adoption) of the particular blockchain in question.

    Besides transparency, trust is another critical feature that users legitimately seek. Trust has to go beyond the scope of the people involved as systems need to be trusted as well. Every static element, such as an encryption algorithm, the dependency on a library, or a fixed configuration, is potentially exposed to vulnerabilities.

    Link to the external world

    The attractive features that blockchain has brought to the internet market would be limited to handling digital assets unless there was a way to link information to the real world. It is safe to say that there would be less interest if we were to accept that a blockchain can only operate within the restrictive boundaries of the digital world, without connecting to the analogue world in which we live.

    Technologies used to overcome these limitations include cyber-physical devices such as sensors for input and robotic actuators for output, and in most circumstances, people and organisations. As we read through most blockchain white papers, we occasionally come across the notion of the Oracle, which, in short, is a way to name an input coming from a trusted external source that can trigger/activate a sequence of transactions in a Smart Contract, or which can otherwise be used to validate some information that cannot be validated within the blockchain itself.
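    A minimal sketch of the Oracle idea: a contract that only reacts to input from a trusted source. The insurance scenario, the names and the rainfall threshold below are invented purely for illustration.

```python
class InsuranceContract:
    """Toy parametric insurance: pays out when a trusted oracle
    reports a drought condition observed off-chain."""

    def __init__(self, oracle_key: str, payout: int):
        self.oracle_key = oracle_key   # only this source may trigger the contract
        self.payout = payout
        self.settled = False

    def report(self, source_key: str, rainfall_mm: int) -> int:
        """Oracle input: pay out if the external condition is met."""
        if source_key != self.oracle_key or self.settled:
            return 0                   # untrusted source, or already settled
        if rainfall_mm < 10:           # drought threshold (illustrative)
            self.settled = True
            return self.payout
        return 0

contract = InsuranceContract(oracle_key="weather-service", payout=100)
assert contract.report("random-node", 3) == 0      # untrusted input ignored
assert contract.report("weather-service", 3) == 100
assert contract.report("weather-service", 3) == 0  # cannot settle twice
```

    The contract itself has no way to observe rainfall; its correctness depends entirely on how much the designated oracle can be trusted, which is why oracles are often described as the weakest link of smart-contract systems.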

    Bitcoin and Ethereum, still the two dominant projects in the blockchain space, are viewed by many investors as an opportunity to diversify a portfolio or to speculate on the value of their respective cryptocurrencies. The same applies to a wide range of other cryptocurrencies, with the exception of fiat-pegged currencies, most notably Tether, whose value is effectively bound to the US dollar. Conversions from one cryptocurrency to another and to/from fiat currencies are normally operated by exchanges on behalf of an investor. These are again peripheral services that serve as a link to the external physical world.

    Besides oracles and cyber-physical links, interest is emerging in linking smart contracts together to deliver a comprehensive solution. Contracts could indeed operate in a cross-chain scenario to offer interoperability among a variety of digital assets and protocols. Although attempts to combine different protocols and approaches have emerged, this is still an area where further R&D is necessary in order to provide enough instruments and guarantees to developers and entrepreneurs. The challenge is to deliver cross-chain functionalities without the support of a central governing agency/body.

    * Originally published in 2018 by Dominic Perini

    For any business size in any industry, we’re ready to investigate, build and deploy your blockchain-based project on time and to budget.

    Let’s talk

    If you want to start a conversation about engaging us for your fintech project, or to talk about partnering and collaboration opportunities, please send our Fintech Lead, Michael Jaiyeola, an email or connect with him via LinkedIn.

    The post Blockchain Tech Deep Dive 2/4 | Myths vs. Realities appeared first on Erlang Solutions.


      www.erlang-solutions.com /blog/blockchain-tech-deep-dive-2-4-myths-vs-realities/


      Ignite Realtime Blog: New Openfire plugin: XMPP Web!

      news.movim.eu / PlanetJabber · Sunday, 26 May - 17:50 · 1 minute

    We are excited to be able to announce the immediate availability of a new plugin for Openfire: XMPP Web!

    This new plugin for the real-time communications server provided by the Ignite Realtime community allows you to install the third-party web client named ‘XMPP Web’ in mere seconds! By installing this new plugin, the web client is immediately ready for use.

    This new plugin complements others that similarly allow you to deploy a web client with great ease, like Candy, inVerse and JSXC! With the addition of XMPP Web, the selection of easy-to-install clients for your users becomes even larger!

    The XMPP Web plugin for Openfire is based on release 0.10.2 of the upstream project, which currently is the latest release. It will automatically become available for installation in the admin console of your Openfire server in the next few days. Alternatively, you can download it immediately from its archive page.

    Do you think this is a good addition to the suite of plugins? Do you have any questions or concerns? Do you just want to say hi? Please stop by our community forum or our live group chat!

    For other release announcements and news, follow us on Mastodon or X.



      Erlang Solutions: Balancing Innovation and Technical Debt

      news.movim.eu / PlanetJabber · Thursday, 23 May - 10:58 · 10 minutes

    Let’s explore the delicate balance between innovation and technical debt.

    We will look into actionable strategies for managing debt effectively while optimising our infrastructure for resilience and agility.

    Balancing acts and trade-offs

    I was having this conversation with a close acquaintance not long ago. He’s setting up his new startup, filling a market gap he’s found, racing to get there before the gap closes. It’s a common starting point for many entrepreneurs. You have an idea you need to implement, and until it is implemented and (hopefully) sold, there is no revenue, all while someone else could close the gap before you do. Time-to-market is key.

    While there’s no revenue, you acquire debt. And while you may be reasonably careful to keep it under control, you often pay the Financial Debt off with a different kind of debt: Technical Debt. You choose to make a trade-off here, a trade-off that is all too often taken without awareness. This trade-off between debts requires careful thinking too: just as financial debt is an obvious risk, so is a technical one.

    Let’s define these debts. Technical debt is the accumulated cost of shortcuts or deferred maintenance in software development and IT infrastructure. Financial debt is the borrowing of funds to finance business operations or investments. They share a common thread: the trade-off between short-term gains and long-term sustainability.

    Just as financial debt can provide immediate capital for growth, it can also drag the business into financial inflexibility and burdensome interest rates. Technical debt expedites product development or reduces time-to-market, at the expense of increased maintenance, reduced scalability, and decreased agility. It is an often overlooked aspect of a technological investment, and prompt care for it can have a huge impact on the lifespan of the business. Just as an enterprise must manage its financial leverage to maintain solvency and liquidity, it must also manage its technical debt to ensure the reliability, scalability, and maintainability of its systems and software.

    The Economics of Technical Debt

    Consider the example of a rapidly growing e-commerce platform: appeal attracts demand, demand requires resources, and resources mean increased vulnerability. The growing volume of user data and resources attracts threats aiming to disrupt services, steal sensitive data, or cause reputational harm. In this environment, the platform’s success is determined by its ability to strike a delicate balance between serving legitimate customers and thwarting malicious actors, both of which grow in ever-increasing proportions.

    Early on, the platform prioritised rapid development and deployment of new features; however, in their haste to innovate, the technical team accumulated debt by taking shortcuts and deferring critical maintenance tasks. The result is a platform that is increasingly fragile and inflexible, leaving it vulnerable to disruptive attacks and more agile competitors. Meanwhile, reasonably, the platform’s financial team kept allocating capital to marketing campaigns, product launches, and strategic acquisitions, under pressure to maximise profitability and shareholder value; however, they neglected to allocate sufficient resources to cybersecurity initiatives, viewing them as discretionary expenses rather than critical investments in risk mitigation and resilience.

    Technical currencies

    If we’re talking about debt, and drawing a parallel with financial terms, let’s complete the parallel. By establishing the concept of currencies, we can build quantifiable metrics of value that reflect the health and resilience of digital assets. Code coverage, for instance, measures the proportion of the codebase exercised by automated tests, providing insight into the potential presence of untested or under-tested code paths. In this vein, tests and documentation are the two assets that repay the most technical debt.

    See, for example, how coverage for MongooseIM has been continuously trending higher.

    Similarly, Continuous Integration and Continuous Deployment (CI/CD) pipelines automate the process of integrating code changes, running automated tests, verifying engineering work, and deploying applications to diverse environments, enabling teams to deliver software updates frequently and with confidence. By streamlining the development workflow and reducing manual intervention, CI/CD pipelines enhance productivity, accelerate time-to-market, and minimise the risk of human error. Humans have bad days and sleepless nights; well-developed automation doesn’t.

    Additionally, valuations on code quality that are diligently tracked on the organisation’s ticketing system provide valuable insights into the evolution of software assets and the effectiveness of ongoing efforts to address technical debt and improve code maintainability. These valuations enable organisations to prioritise repayment efforts, allocating resources effectively.

    Repaying Technical Debt

    The longer any debt remains unpaid, the greater its impact on the organisation — (technical) debt accrues “interest” over time. But, much like in finances, a debt is paid with available capital, and choosing a payment strategy can make a difference in whether capital is wasted or successfully (re)invested:

    1. Priorities and Plans: Identify and prioritise areas of technical debt based on their impact on the system’s performance, stability, and maintainability. Develop a plan that outlines the steps needed to address each aspect of technical debt systematically.
    2. Refactoring: Allocate time and resources to refactor code and systems to improve their structure, readability, and maintainability. Break down large, complex components into smaller, more manageable units, and eliminate duplicate or unnecessary code. See for example how we battled technical debt in MongooseIM.
    3. Automated Testing: Invest in automated testing frameworks and practices to increase test coverage and identify regression issues early in the development process. Implement continuous integration and continuous deployment (CI/CD) pipelines to automate the testing and deployment of code changes. Establishing this pipeline is always the first step in any new project we join, and we’ve become familiar with diverse CI technologies like GitHub Actions, CircleCI, GitLab CI, and Jenkins.
    4. Documentation: Enhance documentation efforts to improve understanding and reduce ambiguity in the codebase. Document design decisions, architectural patterns, and coding conventions to facilitate collaboration and knowledge sharing among team members. Choose technologies that facilitate and enhance documentation work.

    Repayment assets

    Repayment assets are resources or strategies that can be leveraged to make debt repayment financially viable. Here are some key repayment assets to consider:

    1. Training and Education: Provide training and education opportunities for developers to enhance their skills and knowledge in areas such as software design principles, coding best practices, and emerging technologies. Encourage continuous learning and professional development to empower developers to make informed decisions and implement effective solutions.
    2. Technical Debt Reviews: Conduct regular technical debt reviews to assess the current state of the codebase, identify areas of concern, and track progress in addressing technical debt over time. Use metrics and KPIs to measure the impact of technical debt reduction efforts and inform decision-making.
    3. Collaboration and Communication: Foster a culture of collaboration and communication among development teams, stakeholders, and other relevant parties. Encourage open discussions about technical debt, its implications, and potential strategies for repayment, and involve stakeholders in decision-making processes.
    4. Incremental Improvement: Break down technical debt repayment efforts into smaller, manageable tasks and tackle them incrementally. Focus on making gradual improvements over time rather than attempting to address all technical debt issues at once, prioritising high-impact and low-effort tasks to maximise efficiency and effectiveness.

    Don’t acquire more debt than you have to

    While debt is a quintessential aspect of entrepreneurship, acquiring it unwisely is shooting yourself in the foot. You will have to make many decisions and weigh many trade-offs, so be well informed before pressing any red buttons.

    Your service will require infrastructure

    Whether you choose one vendor over another or decide to go self-hosted, use containerised technologies so that future moves to better infrastructure remain possible. Containers also provide a consistent environment across development, testing, and production. Choose technologies that are good citizens in containerised environments.

    Your service will require hardware resources

    Whether you choose one hardware architecture or another, or any given amount of memory, use runtimes that can efficiently use and adapt to whatever hardware they are given, so that future upgrades to better hardware pay off. For example, Erlang's concurrency model is famous for automatically taking advantage of any number of cores, and with technologies like Elixir's Nx you can harness specialised GPU and TPU hardware for your machine-learning tasks.

    Your service will require agility

    The market will push your offerings to their limit, in a never-ending stream of requests for new functionality and changes to your service. Your code will need to change, and to respond to change. From Elixir's metaprogramming and language extensibility to Gleam's strong type safety, prioritise tools that help your developers change things safely and powerfully.

    Your service will require resiliency

    There are two philosophies in the culture of error handling: either it is mathematically proven that errors cannot happen (Haskell's approach), or it is assumed that they cannot always be avoided and must therefore be handled (Erlang's approach). Wise technologies take one of these as their a priori foundation and deal with the other end a posteriori. Choose your point on this scale carefully, and be wary of technologies that take no safe stance at all. Errors do happen: electricity goes down, cables get cut, and attackers attack. Programmers have sleepless nights or get sick. Take a stance before errors bite your service.

    Your service will require availability

    No fancy unique idea will sell if it cannot be bought, and no service will be used if it is not there to begin with. Unavailability takes an exponential toll on your revenue, so prioritise availability. Choose technologies that can handle not just failures but even upgrades without downtime. And real availability always requires at least two computers, in case one dies: choose technologies that let many independent computers cooperate easily and take over one another's work transparently.

    A Case Study: A Balancing Act in Traffic Management

    A chat system, like many web services, handles an effectively unbounded number of independent users. It is a heavily network-based application that needs to respond to independent requests in a timely and fair manner. It is an embarrassingly parallel problem, since messages can be processed independently of each other, but it also has soft real-time requirements: messages must be processed soon enough for a human to have a good user experience. It also faces the challenge of bad actors, which makes request blacklisting and throttling necessary.

    MongooseIM is one such system. It is written in Erlang, and in its architecture, every user is handled by one actor.

    It is containerised, and it uses all available resources efficiently and smoothly, adapting to any change of hardware, from small embedded systems to massive mainframes. Its architecture makes heavy use of the publish-subscribe pattern: because Erlang is a functional language, functions are first-class citizens, so handler functions are installed for all sorts of events, since we never know what new functionality we will need to implement in the future.

    One important event is a new session starting. Mechanisms for blacklisting are plentiful, whether based on specific identifiers, IP regions, or even modern AI-based behaviour analysis. We cannot predict the future, so we simply publish the "session opened" event and leave it to our future selves to install the right handler when it is needed.
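    The idea of publishing events now and installing handlers later can be sketched as a tiny publish-subscribe dispatcher. This is a hypothetical illustration in Python, not MongooseIM's actual mechanism (which is Erlang's hooks-and-handlers system); all names here are invented for the example.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish-subscribe dispatcher (illustrative sketch only)."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        """Install a handler for an event, possibly long after launch."""
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: dict) -> None:
        """Fire an event; events with no handlers yet are simply no-ops."""
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
blocked = {"attacker@example.net"}

# A blacklisting handler installed after the fact, once the need arises:
def deny_blacklisted(payload: dict) -> None:
    if payload["jid"] in blocked:
        print(f"rejecting session for {payload['jid']}")

bus.subscribe("session_opened", deny_blacklisted)
bus.publish("session_opened", {"jid": "attacker@example.net"})
# prints: rejecting session for attacker@example.net
```

    The key property is that `publish("session_opened", ...)` costs almost nothing when no handler is installed, so the event can be emitted from day one and the blacklisting policy bolted on whenever requirements demand it.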

    Another important event is a simple message being sent. What if bad actors have successfully opened sessions and start flooding the system, needlessly consuming CPU and database resources? Changing requirements might also dictate that the system give some users preferential treatment. One default option is to slow all message processing down to some reasonable rate, for which we use a traffic-shaping mechanism called the token bucket algorithm, implemented in our library Opuntia, named after the cactus because if you touch it too fast, it stings you.
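    The token bucket algorithm itself is simple: tokens drip into a bucket at a fixed rate up to a capacity, and each message spends one. Below is a minimal Python sketch of the general algorithm, not Opuntia's Erlang implementation; the class and parameter names are invented for illustration.

```python
import time

class TokenBucket:
    """Minimal token-bucket traffic shaper (illustrative sketch only)."""

    def __init__(self, rate: float, capacity: float) -> None:
        self.rate = rate              # tokens added per second
        self.capacity = capacity      # maximum burst size
        self.tokens = capacity        # start with a full bucket
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        """Refill tokens for the elapsed time, then try to spend `cost`."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True   # message may be processed now
        return False      # over the rate limit: delay or drop

bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]
print(results.count(True))  # the initial burst drains the 10-token capacity
```

    The capacity tolerates legitimate bursts while the refill rate caps sustained throughput, which is exactly the property needed to blunt a flood without punishing normal chatter.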

    You can read more about how scalable MongooseIM is in this article, where we pushed it to its limit. And while we continuously load-test our server, we haven't done another round of limit-pushing since then; stay tuned for a future blog post when we do!

    Lessons Learned

    Technical Debt has an inherent value akin to Financial Debt. Choosing the right tool for the job means acquiring the right Technical Debt when needed: leveraging strategies, partnerships, and solutions that prioritise resilience, agility, and long-term sustainability.

    The post Balancing Innovation and Technical Debt appeared first on Erlang Solutions.


      www.erlang-solutions.com /blog/balancing-innovation-and-technical-debt/

    • chevron_right

      JMP: Newsletter: SMS Routes, RCS, and more!

      news.movim.eu / PlanetJabber · Tuesday, 21 May - 19:22 · 3 minutes

    Hi everyone!

    Welcome to the latest edition of your pseudo-monthly JMP update!

    In case it’s been a while since you checked out JMP, here’s a refresher: JMP lets you send and receive text and picture messages (and calls) through a real phone number right from your computer, tablet, phone, or anything else that has a Jabber client. Among other things, JMP has these features:

    • Your phone number on every device
    • Multiple phone numbers, one app
    • Free as in Freedom
    • Share one number with multiple people

    SMS Censorship, New Routes

    We have written before about the increasing levels of censorship across the SMS network. When we published that article, we had no idea just how bad things were about to get. At the beginning of April, our main SMS route decided to begin censoring, in both directions, all messages containing many common profanities. There was quite some back and forth about this, but in the end the carrier declared that the SMS network is not meant for person-to-person communication and that they do not believe in allowing any profanity to cross their network.

    This obviously caused us to dramatically step up the priority of integrating with other SMS routes, work which is now nearing completion. We expect very soon to be offering long-term customers new options which will not only dramatically reduce the censorship issue, but in some cases also remove the ten-recipient group text limit, dramatically improve acceptance by online services, and more.

    RCS

    We often receive requests asking when JMP will add support for RCS to complement our existing SMS and MMS offerings. We are happy to announce that we now have RCS access in internal testing. The access currently possible is better suited to business use than personal use, though a mix of both is certainly possible. We are assured that better access is coming later in the year, and we will keep you all posted on how that progresses. For now, if you are interested in testing this, especially if you are a business user, please do let us know, and we will tell you when we are ready to start testing.

    One thing to note is that “RCS” means different things to different people. The main RCS features we currently have access to are typing notifications, displayed/read notifications, and higher-quality media transmission.

    Cheogram Android

    Cheogram Android 2.15.3-1 was released this month, with bug fixes and new features including:

    • Major visual refresh, including optional Material You
    • Better audio routing for calls
    • More customizable custom colour themes
    • Conversation read-status sync with other supporting apps
    • Don’t compress animated images
    • Do not default to the network country when there is no SIM (for phone number format)
    • Delayed-send messages
    • Message loading performance improvements

    New GeoApp Experiment

    We love OpenStreetMap, but some of us have found existing geocoder/search options lacking when it comes to searching by business name, street address, etc. As an experimental way to temporarily bridge that gap, we have produced a prototype Android app (source code) that searches Google Maps and lets you open search results in any mapping app you have installed. If people like this, we may also extend it with a server-side component that hides all PII, including IP addresses, from Google, for a small monthly fee. For now, the prototype is free to test and will install as “Maps+” in your launcher until we come up with a better name (suggestions welcome!).

    To learn what’s happening with JMP between newsletters, here are some ways you can find out:

    Thanks for reading and have a wonderful rest of your week!


      blog.jmp.chat /b/may-newsletter-2024