Is a private 5G network the right choice for your business?

Twelve months ago, Frost & Sullivan made the case for why private 5G networks will be game-changing for some companies. And Troy Morley, an Industry Principal at the business consulting firm, believes that over the next decade private 5G networks will evolve to support the needs of smaller businesses in almost all industries.

What makes a private 5G network the right choice for firms?

Installing a private 5G network addresses a series of pain points that businesses may face. The first is network coverage. Cell towers in public mobile networks are primarily located based on demand – for example, across urban sites with large populations and alongside transport routes.

And this arrangement is fine for companies based in metropolitan areas or near the highway. But what if firms have business interests in remote locations that need to be connected locally and to headquarters? Extreme examples are underground operations such as mining and activities out at sea.

Telecoms trend: 5G antennas being installed on a building in South Korea. Image credit: Ericsson.

It may also be the case that firms near a public mast have great coverage outside, but experience dead spots when trying to use the cell network inside. Morley notes that factories and warehouses can have problems in this area, with buildings and their contents acting as potential sources of signal interference.

Given these network coverage concerns, it’s easy to understand why early adopters of private 5G networks have been operators in mining, energy, and manufacturing industries. There’s also data security to consider, which is another reason for firms to opt for a private setup rather than build solutions using public mobile network infrastructure.

5G performance gains

5G brings faster download speeds and low-latency performance to devices. And high-definition video and mobile gaming are well-advertised as reasons for mobile customers to ditch their old smartphones and buy new 5G handsets. But this is just scratching the surface.

“While most consumers think that 5G is all about them, the truth is 5G is ideal for addressing the networking needs of business and enterprise,” writes Morley.

For example, 5G brings significant edge capabilities. Private 5G networks don’t just connect staff, they enable industrial IoT communications too. And wireless networks provide flexibility to make businesses more agile and composable – in other words, tools and teams are easier to reconfigure for different projects.

Cellnex’s Catherine Gull lists automation, worker safety, and situational awareness as the top three benefits that private networks can bring to operations. Enterprises can use 5G systems to automate indoors and outdoors, from self-driving vehicles to factory robots.

“They do it to increase safety, and they do it to increase reliability,” she told UPTIME attendees in June 2023. “The more of these robots that you put in one single space, the more other mechanisms of connectivity fall down and become unreliable or unsafe.”

Gull makes a strong case that systems such as private 5G networks give users reliable bandwidth and, for firms, can be ‘where they want them, when they want them, and how they want them’. And companies are no longer held back by the downsides of basing their operations on a public mobile network.

Adding to the appeal of being in full control, firms may find that they are able to stack multiple use cases on private 5G networks. Beyond automation, systems can also enable asset tracking, help with training, streamline maintenance, and provide ERP integration – to give just a few additional applications.

Once businesses have the fat bandwidth that 5G offers, there’s a lot that they can do. And low latency (plus video over wireless) opens the door to accurate remote control, which has broad appeal across a wide range of industries – from logistics to healthcare.

“Most enterprises start with something that is really key to them and that’s oftentimes connectivity availability,” Gull points out. “And once that’s resolved, you can build on that.” Cellnex, headquartered in Spain, has more than 138,000 sites on which mobile network operators (MNOs) put their infrastructure. And it has cell towers located in 12 countries.

What private 5G network architecture do you need?

The multiple antennas associated with 5G infrastructure enable powerful beamforming capabilities. Signals from multiple 5G radios can be purposefully overlapped and grouped together. And regions of constructive interference in the emissions can even be steered toward devices by adjusting the phase of each of those broadcasts.

As the name suggests, mobile networks are ideal for maintaining connections on the move, and beamforming adds further precision to the technology. Using beamforming methods, signals can be tuned to follow devices. Buildings can be utilized too, as reflective surfaces to bounce mobile signals to recipients.

One of the trade-offs of using much higher frequencies, which offer more bandwidth, is that these shorter wavelength mobile signals don’t travel as far. But beamforming has been shown to compensate for this, putting suitably configured 5G systems on par with longer wavelength 4G networks, in terms of coverage – at least at the lower end of the 5G spectrum.
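
To make the phase-steering idea concrete, here is a minimal numpy sketch of a uniform linear array: delaying each element’s signal by a per-element phase shift makes the emissions add constructively in a chosen direction. The element count, spacing, and 3.5GHz carrier are illustrative assumptions, not a description of any vendor’s radio.

```python
# Minimal phased-array beam-steering sketch (illustrative numbers only).
import numpy as np

c = 3e8                  # speed of light, m/s
freq = 3.5e9             # an illustrative mid-band 5G carrier, Hz
lam = c / freq           # wavelength, roughly 8.6 cm
n_el = 8                 # antenna elements in a uniform linear array
d = lam / 2              # half-wavelength element spacing
steer = np.radians(30)   # direction we want the beam to point

# Per-element phase shift so all emissions align at the steer angle.
n = np.arange(n_el)
phase = -2 * np.pi * n * d * np.sin(steer) / lam

# Array factor: combined signal strength versus direction.
angles = np.radians(np.linspace(-90, 90, 721))
geometry = 2 * np.pi * n[:, None] * d * np.sin(angles) / lam
af = np.abs(np.exp(1j * (geometry + phase[:, None])).sum(axis=0)) / n_el

print(f"beam peaks at {np.degrees(angles[np.argmax(af)]):.1f} degrees")  # 30.0
```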

Technically, to use 3GPP (the standards group for mobile broadband) terminology, a private 5G network is dubbed a ‘5G non-public network’, highlighting the absence of commercial MNO subscribers. And, as mentioned, such networks could be providing industrial control or replacing enterprise Wi-Fi.

Radio access requires physical hardware, but core network elements can be virtualized and made available in the cloud. Also, circling back to the security advantage for businesses running private 5G networks, enterprise systems will only be visible to authorized user equipment.

Devices belonging to the 5G non-public network will look for a standalone non-public network (SNPN) ID. In contrast, consumer devices latch onto mobile services based on a public land mobile network (PLMN) ID – a combination of a mobile country code and a mobile network code – which is one of the details stored on a handset’s SIM.
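
As a toy illustration of that identifier structure – a three-digit mobile country code followed by a two- or three-digit mobile network code – the sketch below splits a PLMN ID string into its parts. The 001/01 value is the test PLMN commonly seen in lab and private deployments; real handsets resolve the MNC length from carrier configuration rather than from a parameter.

```python
# Toy PLMN ID parser: MCC (3 digits) + MNC (2 or 3 digits).
from typing import NamedTuple

class PlmnId(NamedTuple):
    mcc: str  # mobile country code: always 3 digits
    mnc: str  # mobile network code: 2 or 3 digits

def parse_plmn(digits: str, mnc_len: int = 2) -> PlmnId:
    """Split a PLMN ID string into its MCC and MNC parts."""
    if len(digits) != 3 + mnc_len:
        raise ValueError(f"expected {3 + mnc_len} digits, got {digits!r}")
    return PlmnId(mcc=digits[:3], mnc=digits[3:])

# 001/01 is the test PLMN often used in lab and private deployments.
print(parse_plmn("00101"))  # PlmnId(mcc='001', mnc='01')
```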

Private 5G network starter kit

AWS is trying to bridge the knowledge gap for business users who are thinking about experimenting with private 5G services. The cloud giant has a kit that’s priced based on network traffic rather than the number of connected devices. And, based on the AWS demo video, the setup process is straightforward – comparable to configuring a Wi-Fi network.

Taking a kit-based first step gives firms the chance to run small-scale pilot schemes ahead of making larger investments in mobile infrastructure. And AWS is by no means the only vendor offering easy-to-navigate solutions. Firecell’s Orion Private 5G dashboard requires no knowledge of 5G network architecture or configuration.

And the French firm, which aims to democratize private networks and believes in open source as the way forward in telecoms, can supply clients with a rack server, access point, ten pre-configured SIM cards, and one omnidirectional antenna to see how a private 5G network can improve company performance.

If the idea of private 5G networks sounds appealing and you want to run the numbers on whether the investment is worthwhile for your organization, Nokia has made available a 5G business modeling tool. The web application lets users compare the total cost of ownership of Wi-Fi versus 5G wireless and is based on more than 220 customer use cases.
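
For a rough sense of what such a tool automates, the sketch below compares undiscounted five-year totals for the two options. Every figure is a placeholder invented for illustration; this is not Nokia’s model or real pricing.

```python
# Back-of-envelope TCO comparison with made-up figures.
def tco(capex: float, annual_opex: float, years: int) -> float:
    """Simple undiscounted TCO: upfront spend plus recurring costs."""
    return capex + annual_opex * years

wifi = tco(capex=120_000, annual_opex=35_000, years=5)  # hypothetical
p5g = tco(capex=250_000, annual_opex=20_000, years=5)   # hypothetical

print(f"Wi-Fi five-year TCO: ${wifi:,.0f}")       # $295,000
print(f"Private 5G five-year TCO: ${p5g:,.0f}")   # $350,000
```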

Returning to the Frost & Sullivan observations at the top of the story, businesses could end up being the big winners from 5G, as telecoms firms will be highly motivated to tailor their solutions to industrial clients.

“Communications Service Providers (CSPs) have invested significantly in 5G,” emphasizes Morley. “The stark truth is those CSPs depending just on the consumer market for a return on investment will fail.”

The need for a biotech industrial revolution

• Technological development could create a “biotech industrial revolution.”
• That would lead to faster processing of therapies and vaccines.
• The Covid pandemic taught lessons on the value of collaborative tech and collaborative science.

Biotech and bioscience are industries to which humanity looks with hope. They help us cure diseases and conditions that otherwise impact people’s lives negatively long-term, or even, with appalling regularity, kill them.

But biotech and bioscience are being stymied by a lack of the kind of technological development that most tech businesses don’t think twice about. Things like a solid collaborative cloud infrastructure, or the ability to use AI and machine learning in a structured way, are absurdly missing from most biotech development firms.

And, according to Bob Burke, EMEA GM at biotech cloud platform Benchling, and Jason C. Foster, CEO of cell and gene therapy manufacturing technology company Ori Biotech, that has resulted in low-throughput manual processes that are very expensive.

What does that mean in non-scientist?

It means horrendously expensive cures and treatments that are scientifically brilliant but economically non-viable. It means people suffering and dying because the treatments that exist can’t be successfully, economically marketed or sold to them.

That’s the market gone mad.

But it’s also indicative of a chronic structural failure, in which biotech companies never have the time, the money, or the explicit motivation to go “back” and deliver the technological development needed not only to fast-track the development of cures and treatments, but – by turning a low-throughput industry into a high-throughput one – to change the likelihood that therapies will be available to all, or at least more, of the people, more of the time.

The importance of technological developments for the biotech revolution.

In Parts 1 and 2 of this article, Bob and Jason explained the need for technology-focused CIOs in the biotech sector to not only implement technological developments that would underpin faster, more economically viable treatments, but also to champion the value of that renaissance approach – looking backward to the tech stack that could make those technological developments a reality, and upgrading it to something that would be recognizably 21st century.

We asked them about that shift in focus.

THQ:

Is it fair to liken biotech without the tech layer in place to a catwalk fashion house, rather than an off-the-peg clothing manufacturer, which it could be if it had that tech layer in place?

JCF:

I mean, we’re dealing with biology, but I don’t think fundamentally there’s any difference. As you say, it’s like needlepointing by hand, in a manual, expensive way, as opposed to doing it on an automated assembly line, which we know improves quality and consistency, and we also know it reduces costs and makes products more widely available.

It’s exactly the same challenge, yes. The data layer, the cloud layer that underpins the necessary changes makes the automation technology smarter and more efficient.

We can learn faster about what works, what doesn’t work, all those lessons that have been applied to cars and clothes and computers and all the other things that we make at a big scale. We need to take those learnings and apply them to biotech and to this next generation of therapies.

THQ:

A biotech version of an industrial revolution?

JCF:

People are calling it that, yeah.

Technological development will power an industrial revolution for biotech.

The steam engine, the industrial robot…the gene therapy? Industrial revolutions have many faces.

THQ:

So how do we get there?

We know a tech layer can be helpful and potentially revolutionary, but that too often in biotech, it isn’t there. How do we go from here to there, to a place where that tech layer is universal in biotech?

JCF:

This isn’t just a government problem, this isn’t just a clinician problem. This isn’t just a pharma company problem, either. This is an everyone problem. One of the reasons we partnered with Benchling is that they do what they do really well, we’re trying to do what we do really well, and the ability to actually put those pieces together felt like the way forward.

Previously, the technology industry has been very siloed – we have a solution, Bob’s got a solution, someone else has got a solution, and we throw them over the wall. A biotech company then has to knit them together so they work, but they were never really designed to work together.

By creating standards, by using standard cloud language, we can knit them together ahead of time, and the result is we’ll get a better solution for biotech, to help it with that turbo boost effect we mentioned in Part 2.

Gene therapies are a significant hope for curing a whole range of conditions.

Good for almost whatever ails you – gene therapies are the future of medicine.

BB:

It’s incumbent upon us as service providers or technology providers to make sure that our products work, that they play nicely with those that are also trying to solve similar problems in the ecosystem.

It’s been very hard for biotech to adopt technology, because up until now, the tools haven’t been very good. It’s not the scientists’ fault or the biotech companies’ fault. We haven’t done our job as technology providers to say, well, actually, let us make this easy for you – we’ll make pre-integrated systems that all talk to one another, so you can pick them off the shelf to solve your problem.

That’s part of what we’re trying to do together – trying to create something that solves the problem in an integrated way, in a holistic way, because we fundamentally understand the needs of the customer.

THQ:

That’s a similar journey to one that tech companies have already had to make. They’re out there doing their thing, but then “What happens if we do this and connect with other companies?” So it becomes more than the sum of its parts at that point?

Collaborative technological developments helped deliver Covid vaccines.

JCF:

Yeah. I think the other important thing we learned from the pandemic was that timelines get shorter in radical ways through collaborative technology. Usually, vaccine development takes 10 years, right? But when we come together, and we collaborate, and we use best in class tools and solutions, we can deliver something in a tenth of the time – the Covid vaccines were developed in less than 12 months.

Covid vaccines are an example of how scientific collaboration, underpinned by technological development, can do great things.

Covid vaccines. Developed quickly through significant collaboration. Now, about that climate change thing…

Now, of course, that’s an extreme example, because we had the entire world focused on one thing, but I think it’s important because it does tell the tale of what we can do when we’ve got data, when we’re able to collaborate on it, when we can bring people from across the globe that are best in class at what they do, and let them work alongside technology that lets them stop spending 50 to 60% of their week on data management, and lets them get on with science.

The opportunities really are limitless.

THQ:

Was the pandemic a kind of lightbulb moment as far as that sort of thing was concerned?

JCF:

Absolutely. Yeah, for sure.

We’ve proven to ourselves that we can do this, right? So let’s keep doing it. And, you know, we’ve got big problems that we need to solve out there. Whether it’s cancer, whether it’s climate change, you know, whether it’s distribution of food, there’s lots of big global challenges that if we work together, and we use the right tools, and free up people’s time to do what they do best, we can make real change here in a positive way.

THQ:

*Lights joss sticks* “All we are saying is give solid data, technological development, a collaborative tech stack, a ton of money and a worldwide community of geniuses a chaaaaance…”

JCF:

Ha. Yeah. I mean, what an incredible case study of the possibilities for human ingenuity, if we’re all focused on trying to accomplish a goal. We developed not only one vaccine in a year, we developed four or five in a year. So let’s take those learnings and not lose them.

Covid was a catastrophe for so many people. But we saw unprecedented regulatory innovation, market access innovation, and of course, scientific and technical innovation in action. Covid was a global problem that impacted everyone, no matter where they were, so it crystallized our attention on that problem. But cancer kills way more people every year than Covid has killed in its whole existence (so far).

If you think about things that way, we just had a cell therapy approved for Type 1 diabetes. They’re studying cell therapies for cardiovascular disease too, massive diseases that impact tens of millions of patients every single year, not just in one concentrated burst. Let’s not lose those Covid learnings.

And that ability to bring regulatory innovation, technical innovation, and scientific innovation together and streamline them is critical to keeping those learnings alive and making use of them. This technical layer, this foundational layer underneath the science, will enable us to get much better at doing those things more quickly.

Bob Burke and Jason C Foster, both advocates of technological development.

Bob Burke and Jason C Foster.

As with Covid vaccines, so with gene therapies?

What happens after Oracle Java licensing changes?

• Gartner finds that Java licensing changes by Oracle are two to five times more expensive for most organizations.
• One in five Java users can expect an Oracle audit in the next three years.
• Eclipse, Azul, and other providers may see increased Java downloads from new Oracle pricing.

Oracle has once again changed the licensing rules for its widely used Java product. On January 23, 2023, the company introduced a new license metric, the SE Universal Subscription, and the controversial new Java pricing plan is based on the customer’s total number of employees, rather than the number of employees using the software.

What this means for an organization is that regardless of its number of Java users or its server footprint, it must count every employee, contractor, consultant, and agent to determine its Java subscription bill. In short, an organization is potentially on the hook for a massive subscription fee increase that may have minimal benefit to the operation.

Big Red – which acquired Java with its buyout of Sun Microsystems in 2009 – said the new Java SE Universal Subscription is “a simple, low-cost monthly subscription that includes Java SE Licensing and Support for use on desktops, servers or cloud deployments.”

According to Oracle, the pricing starts at US$15 per employee per month for as many as 999 employees and drops as low as US$5.25 per employee per month for 40,000 to 49,999 employees.

Oracle cited an example in which a company with a total employee count of 28,000, including full-time and part-time employees and agents, consultants, and contractors, would be charged US$2.268 million annually.
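
That figure follows directly from the tiered per-employee metric. The sketch below reproduces it using only the rates cited in this article (US$15 for up to 999 employees, US$5.25 for 40,000 to 49,999) plus the US$6.75 rate implied by Oracle’s own 28,000-employee example; the band boundaries for that middle tier are an assumption, and Oracle’s full price list contains further tiers omitted here.

```python
# Sketch of the per-employee metric, limited to rates cited above.
TIERS = [
    # (band_min, band_max, US$ per employee per month)
    (1, 999, 15.00),
    (20_000, 29_999, 6.75),   # rate implied by Oracle's 28,000-employee example
    (40_000, 49_999, 5.25),
]

def annual_java_bill(total_headcount: int) -> float:
    """Every employee, contractor, consultant, and agent counts."""
    for band_min, band_max, monthly_rate in TIERS:
        if band_min <= total_headcount <= band_max:
            return total_headcount * monthly_rate * 12
    raise ValueError("band not covered by the rates cited in this article")

print(f"US${annual_java_bill(28_000):,.0f}")  # US$2,268,000 - matches Oracle's example
```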

However, the changes in the Java pricing model present opportunities for rival Java runtime providers such as the Eclipse Foundation and Azul. The Eclipse Foundation immediately seized on the opportunity to pitch its alternative. “Stumbled across Oracle’s latest Java price list,” tweeted Eclipse Executive Director Mike Milinkovich on January 27.

“Wow, I had no idea that Java was so expensive! Fortunately, you can download the fully compatible, community-supported, quality-certified Temurin OpenJDK distribution for free!” Meanwhile, Azul said that it had seen a massive increase in inquiries about Java licensing since Oracle’s Universal pricing plan debuted.

“This is a major shock to the Java ecosystem,” Azul CEO Scott Sellers reportedly said. He described Oracle’s plan as one of the few instances he could think of in which pricing was decoupled from the value derived from the software. Azul’s Java pricing, Sellers noted, is based on how many people are using it.

New Oracle Java licensing, now what?

The changes primarily affect large companies with many employees, but will also significantly impact medium-sized businesses. Although Oracle promises to allow legacy users to renew under their current terms and conditions, the company will likely pressure users to adopt the new Java licensing model over time.

Changes to Oracle Java licensing mean the cost will skyrocket.

“Welcome to your latest Oracle bill, how would you like to pay today?”

Gartner recently estimated that most organizations adapting to Oracle’s new licensing terms for Java should expect the per-employee subscription model to be two to five times more expensive than the legacy model. The research firm has spoken to clients since the new model was introduced in January.

Oracle Java SE Universal Subscription Global Price List. Source: Oracle

Having spoken to many clients, Gartner concluded that the steep increase in Oracle licensing costs for most Java users would mean that by 2026, more than 80% of Java applications will be deployed on third-party Java runtimes, up from 65% in 2023. Gartner also warned that Oracle is ready to test whether users comply with Java licensing terms as it sees them. 

“One in five Java users can expect an Oracle audit in the next three years,” Gartner said. Nitish Tyagi, co-author of the new Gartner research note, told The Register that for large organizations, he expects the increase to be two to five times, depending on the number of employees an organization has.

“Please remember, Oracle defines employees as part-time, full-time, temporary, agents, contractors as whosoever supports internal business operations has to be licensed as per the new Java Universal SE Subscription model,” he added. To top it off, Gartner also estimated that by 2026, one in five organizations using Java applications will be audited by Oracle, leading to “unbudgeted noncompliance fees.” 

By the same year, more than 30% of organizations using Java applications won’t comply with their Oracle contracts, Gartner predicts.

Tyagi noted that clients have been moving towards third-party Java runtimes such as Azul, Amazon Corretto, Eclipse Temurin, and IBM Semeru since the hefty new pricing model was announced. “Surveys also indicate a decline in Oracle JDK usage and an increase in the use of other Java runtimes,” Tyagi added.

How to deliver technological development for faster bioscience

• The speed of bioscience depends on technological development.
• AI and machine learning could improve diagnostic efficiency.
• Bioscience needs tech-forward CIOs to steer its destiny.

Biotech and bioscience are the kind of industries of which great, optimistic, utopian science fiction is made.

Which is a problem.

It’s a problem because, according to lots of people working in the field, they should be the kind of industries of which great, optimistic, utopian present fact is made.

Scientists are frequently brilliant – it more or less goes with the territory – but, especially in bioscience, they’re often let down by their access to technological developments to which most tech organizations give no mind, but simply accept as the structural wallpaper of their work.

In Part 1 of this article, we spoke with Bob Burke, EMEA GM at biotech cloud platform Benchling, and Jason C. Foster, CEO of cell and gene therapy manufacturing technology company Ori Biotech, both of whom are advocating for a pathway for bioscience that embraces 21st century technological developments like a cloud-based collaborative network.

Technological development and scientific progress.

Such a network, they said, could cut years off the process of getting new gene therapies to patients who desperately need them.

The miraculous insanity of the situation in which many in the bioscience industry find themselves, they said, is that many scientists have no ability to organize or share their data, let alone easily collaborate on it. Hence the need for significant investment in technological development, to speed up collaboration, scientific development, and the achievement of governmental goals (particularly in the UK), like delivering cutting-edge bioscience.

Towards the end of Part 1, Jason casually said:

“We have cures for cancers that patients can’t get access to, because they can’t be made, they can’t be manufactured, they’re too expensive. And a lot of that comes back to the processes that we start with in the lab when we develop these products.”

We boggled at that for a moment. Cures for cancers, as well as gene therapies for other dread conditions the likes of which only gene therapies can combat, being held up by the lack of a consistent collaborative cloud?

THQ:

So what you’re saying is that modern technological developments like the cloud and AI would act as a kind of turbo boost button for life-saving or life-altering cures?

JCF:

Yeah, and the speed would, as you say, be genuinely life-saving, because the reality is that the areas where we are seeing AI and machine learning working today are on those edges. You see AI and ML a lot in the world of radiology, right? Something like 20% of all breast cancer goes undetected, because it’s just missed on the mammogram.

But there are places now, countries like Hungary, that are very tech-forward when it comes to cancer detection, where they’re bringing in machine learning to do a second round of evaluation on those scans, and they’re finding things that very hardworking and well-meaning radiologists unfortunately sometimes miss. Those can be life-saving breakthroughs.

It’s critical for the industry. And without the investment in technological development, lives are unfortunately at risk.

Technological developments like machine learning can improve mammogram effectiveness.

The machine learning mammogram? Yes – think of it as a technological safety net.

Technological development delivering data insights.

THQ:

That’s the thing, isn’t it? For all the screaming headlines out there about how they’re going to blow up the world, AI and ML are extremely good at a certain group of activities. And one of those things is the detection of patterns and the detection of anomalies. Which is exactly what you need in things like a mammogram.

JCF:

Yeah, and they’re getting better at more things. The whole concept behind machine learning is that the machine gets smarter as you feed it more structured data. We use similar tools internally at Ori to try and create and understand patterns in data. So I often say the controversial thing – data isn’t valuable.

THQ:

Heathen! Burn the heretic!

JCF:

The information and insights that you derive from data are what’s valuable. I mean, I have tons of unstructured datasets in my filing cabinet at home, on pieces of paper that will never see the light of day, because I can’t make any sense of them anyway.

But the ability to capture that data in ways like Bob was talking about in Part 1, to structure it – that’s crucial. When you have clean datasets to look at and analyze, then you can productively ask, “Well, what does this data mean?”

And “What does this data mean?” is effectively the question that machine learning and artificial intelligence tools are trying to answer – the answer that gives experts like radiologists the information they need to make a clinical decision.

“Do I call that person? Do I do a biopsy?” Understanding what the data means is the enabler of better (and faster) clinical decisions. That’s the power of these new tools.

THQ:

Okay – that’s a fairly solid answer to our question of why we need to build technological development and why we need to build a tech culture in order to foster a bioscience culture.

The next obvious question is how we build the kind of tech culture that will support that bioscience culture.

What are we actually aiming to achieve? When are we looking to achieve it by? And how do we get there?

The how of technological development.

BB:

Life sciences organizations have got to prioritize bringing in the right level of leadership at the right stage of business development to establish this proper technology layer. That’s historically not happened, albeit for understandable reasons.

Biotech organizations have traditionally had limited resources, and as you’d expect, they have to invest in the science first and foremost, so they haven’t spent money on CIOs and information-focused executives, who can then help deliver that really clear technology layer alongside the science to help it do everything that we just talked about it needing to do in order to scale and be innovative and help find life-saving breakthroughs.

So it starts with the people and the leadership of those organizations. And we’re seeing that more now – that’s a trend that’s happening more and more – biotech is bringing in technology leadership at an early stage, to help implement the systems, put the tools and the processes in place, as well as attract the talent that is required to manage those systems, work with vendors, and help drive that innovation at ground level.

It starts organizationally, inside the business, and then goes from there.

Only ever forward.

JCF:

I’d argue, actually, that it starts above the management layer, at the board and investor level. Because when you’re a scientific company, if you’re developing therapies, what your board and your investors have historically cared about is how quickly you can get into the clinical phase, and how quickly you can get clinical data, because that’s an inflection point for the value of that business.

The way that’s manifested itself in cell therapy is that we start off very focused on science, so we’re developing these products in highly manual lab scale processes. We’ve got very experienced people micro-pipetting fluids from one vial to the next. And we’re really focused on hitting that scientific endpoint.

But if we do hit that scientific endpoint, we get into a clinical trial, and we get a good clinical result. But it’s then too late to go back and fix our infrastructure (that is now paper-based, and very slow and not very intelligent), in order to scale it up.

Technological development – the way out of a catch-22?

So when I said that we have cures for cancer that patients can’t get access to, the reason for that is because we haven’t planned on scaling from the very beginning. Because scaling requires a technology infrastructure underneath it to be able to make products with high quality, high throughput and low cost.

What we end up with is low throughput manual processes that are very expensive. So these products are half a million dollars to $4 million per patient – these are expensive products, because they’re very expensive to make, because we haven’t spent the time to put this onto a technology platform that can scale, or to digitize our data from the beginning.

Because ultimately, my investors aren’t telling me to do that. My investors are saying give me clinical data as fast as you can, because that’s what the market cares about. But ultimately, what we end up with is the dead-end situation. We have products that get approved, that ultimately only reach a very, very small percentage of the patient population.

Technological developments could bring down the cost drivers of bioscience.

When money and medicine meet, money often talks louder. How can tech soften that influence?

In Europe, something like eight of the 24 gene therapies that have been clinically approved have been removed from the market – not for political reasons, but for commercial ones: they couldn’t actually make an ROI case work.

We’ve recently seen one of the big players say they’re not going to bring their product to the UK, because they can’t manufacture it enough, even for the US, and because the reimbursement landscape is a little uncertain, they’re just going to focus on that market.

My fear is that if we don’t fix this problem of technological development, more and more of this could happen, where these products exist, but patients in Europe and the UK can’t access them, because of all of these infrastructure challenges.

Technological development can help bioscience scale at speed.

Well yes, but can you scale?

We have cures for diseases like multiple myeloma, leukemia, and lymphoma – we have incredibly effective therapies, but many patients can’t get access to them for these reasons.

Bob Burke and Jason C Foster, both advocates of technological development.

Bob Burke and Jason C Foster.

In Part 3 of this article, we’ll take a holistic look at how bioscience, with its as-yet-notional technological development and its tech-forward CIOs, could transform itself into the kind of superpower that, for instance, the UK is depending on.

Don’t get left behind: unlock competitive advantage in banking and payments

From technological advancements to changing consumer preferences, the banking and payment industry is evolving rapidly. While exciting, its dynamic nature can make it challenging for banks and non-banking financial institutions of all sizes to keep up.

It becomes crucial for institutions to compare their strategic priorities, product roadmaps, and plans for future innovation against their competitors. These valuable insights into market trends, customer demands, and emerging opportunities allow them to make informed strategic decisions.

For example, a 2022 report from Bottomline revealed that 66% of banks and financial institutions (FIs) think that compliance and RegTech will become more important over the next year, and 87% thought it would be somewhat or very challenging to remain compliant.

This knowledge allows executives to make the right decisions regarding resource allocation and investment in technology, helping them meet regulatory requirements effectively. If this isn’t prioritised, companies face significant challenges with compliance, potentially exposing themselves to regulatory penalties, reputational damage, and loss of customer trust.

Benchmarking against peers enables banks to identify areas for improvement and best practice. Analysing successful institutions can help them spot gaps in their operations and implement necessary changes to enhance efficiency, customer experience, and overall performance. The process fosters innovation in the industry and boosts the institution’s growth.

The report from Bottomline, which surveyed more than 500 banking and FI players across Treasury, Fraud, Operations, and Product at C-suite level in 34 countries, revealed that 64% view digital transformation as their biggest focus, with 27% of banks and FIs specifically identifying the need to address legacy payments infrastructure as a priority. Those not prioritising technological updates – like integrating a SaaS communication platform to reduce data silos and improve customer experience – may fall behind in the future.

In most industries, surprise outcomes are a hindrance to a competitive business. Keeping abreast of the latest developments, and having a healthy insight into what is around the corner, is essential to remaining a key player. Being unaware of the latest shift in customer expectations regarding fraud protection, for example, can mean competitors meet that demand first, creating a drop in your company’s revenue.

To alleviate this danger, leverage benchmarking activities like taking the annual survey from Bottomline. After answering just 12 questions (it takes only around five minutes), you will receive a personalised comparison of how your strategy and pain points compare to those of your peers in banking and payments.

Once the survey is completed by what’s expected to be more than double last year’s participant numbers, you will receive a comprehensive report highlighting insights from the full data picture.

The 2022 report showed that the top priority of most banks and FIs was real-time payments, an increase of 15% from 2021.

Zhenya Winter, the Head of Financial Messaging Marketing at Bottomline, said the increased interest in real-time payments can be linked to schemes and mandates such as SEPA Inst and SIC IP, which provide more accessible models for banks and FIs. Good examples would be the UK’s New Payments Architecture and Europe’s Target2, which champion how real-time payments can provide new revenue streams by using digital overlays and, of course, meeting customer demand.

Winter said: “The business case for real-time is no longer in doubt, whereas previously, banks were concerned about having enough volume and value to justify the spend on implementation.”

Riccardo Colnaghi, the Head of Business Development at Solarisbank AG, agreed. “Not adopting real-time payments would mean ruling out the upfront opportunity to join a wider set of longer-term payment trends that are likely to be key for future competitiveness.” Indeed, 54% of survey respondents said they have either completed planning and want to start implementation or are already ‘live.’

The report also highlighted that many banks and FIs consider prioritisation in a busy roadmap the biggest barrier to adopting real-time payments. In 2022, financial institutions were caught up with maintaining regulatory and industry compliance, and educating customers on new digital platforms implemented since the pandemic. With access to this kind of insight, you can take advantage of the shortcomings of your competitors to stay ahead.

Can you afford to fall behind your peers’ strategic priorities, technological advancements and innovation? Ultimately, having access to industry insights provides an essential competitive advantage and the opportunity to stay at the forefront of the banking and financial landscape.

If you want to take advantage of a personalised comparison of your business against its competitors and get access to the full 2023 report, take the Bottomline survey today.

The public sector’s software vacuum

• Software publishers should adopt open-source for publicly-funded code.
• European pressure group’s campaign gathers momentum.

Many of the benefits our taxes pay for are for everyone, including ourselves, to use. Roads, street lighting, police forces, public transport, and a thousand more everyday necessities are provided by governments and funded by the direct and indirect taxes we all pay every day.

So why should software and software publishers be different? That’s the question posed by the Free Software Foundation Europe in its Public Money? Public Code! campaign that it’s running right across Europe.

The thought behind the campaign is very simple: if taxpayers pay for software commissioned by governmental bodies, that software should be open for everyone to use.

At a time when public budgets are tight and contracting, public institutions are positively encouraged to share resources, minimize expenditure and not duplicate existing infrastructure.

It seems odd, therefore, that one local authority can commission software from the private sector, for example, but should a neighboring authority want similar services, it too has to spend its citizens’ money to get a similar, if not identical, outcome.

The FSFE’s campaign is pushing for software developed on the taxpayer’s account to be open to all to copy, redistribute, and re-use. When public bodies procure software that is released under open-source licenses, all taxpayers benefit.

Logo of the FSFE (Free Software Foundation Europe)

Source: FSFE

Why open code, ask software publishers

Open-sourcing public software projects massively lowers the costs to the taxpayer. Instead of five municipalities each paying for the development of, for example, a smart public transport timetabling platform, it could be commissioned once and then shared. Individual municipalities could, if they wanted, pay a little more for the code to be right-sized for their specific use cases, but essentially, software procurement costs fall for everyone.
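
The arithmetic behind that argument is easy to sketch with made-up figures – the point is the shape of the saving, not the specific numbers.

```python
# Toy cost comparison: five separate bespoke builds versus one shared
# open-source build plus per-council adaptations. All figures hypothetical.
municipalities = 5
build_cost = 400_000        # hypothetical cost of one bespoke platform
adaptation_cost = 50_000    # hypothetical per-council customization

separate = municipalities * build_cost                  # 2,000,000
shared = build_cost + municipalities * adaptation_cost  # 650,000

print(f"separate builds: {separate:,}")
print(f"one shared open-source build: {shared:,}")
```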

Currently, the ethos of resource-sharing and cost consciousness seems to stop at the point of procurement, where multiple public sector bodies all have to pay the private sector piper.

In the public sector’s IT space, many look to the cloud to operate more efficient and cost-conscious services for the citizen. SaaS provision from software publishers, it is said, provides ready-made solutions to many of the needs of the public sector. Using these services is cheaper than developing from scratch, and cheaper than paying in-house developers and hardware costs for on-premise solutions. But taxpayers’ money ends up paying large private sector companies multiple times. The cloud may offer some savings over legacy solutions, but such savings are pitiful compared to those available if the FSFE’s campaign were to gain more traction.

The arguments against open-source in this context tend to focus on job creation and the positive financial effects of the public sector commissioning software that delivers services. However, there are no local positives for a local government or town council: money flows to large municipalities where the big tech companies have offices, or worse, overseas.

Free Software Foundation illustrating its Public Money, Public Code! campaign

Source: FSFE via Mastodon.

Civil society organizations, developers, technologists, and activists across the continent have signed an Open Letter to attempt to convince local and national lawmakers of the savings and benefits of public code. As is often the case with FOSS and technology above a certain level, the FSFE’s role is partly educative in this campaign. It’s easy to equate something “free” with something of no value.

Defining open-source

But “free” in this context, and the FSFE’s title, refers to “free” as in “freedom to see, redistribute, copy and improve.” Once that distinction is made, the economic arguments for Public Money? Public Code! are unassailable.

The public sector is one of the big purchasers of IT and software today from hardware vendors and software publishers, with up to 27% of software companies’ revenue coming from the sector. That type of financial weight can change practices locally and nationally, delivering huge savings for citizens who, like their governments at all levels, are experiencing tough economic times.

This campaign makes sense to any technologist aware of the transformative nature of open-source software. It must make sense to accountants in your local city hall, too.

Why bioscience desperately needs technological development

• Technological development in life sciences is vital to unlock the power of the science.
• Historically, the level of technology in the life sciences has been primitive.
• With better collaborative technology, scientists could cut years off treatment development.

Bioscience and biotech are some of the most genuinely mind-boggling and potentially life-changing industries it’s possible to be in right now. So much so that the UK government is determined to make the country not only a cyber-superpower, through projects like the Cyber Runway, but a “life sciences superpower” into the bargain – which is to say, a biotech superpower.

But it turns out there’s an issue in getting from A to B in terms of such superpower status. Before you can turbo-charge the development of biotech and bioscience, you need to invest significantly in modern technological development to empower that growth.

We sat down with Bob Burke, EMEA GM at biotech cloud platform Benchling, and Jason C. Foster, CEO of cell and gene therapy manufacturing technology company Ori Biotech, both of whom are advocating for a pathway based on technological development, to find out why the expansion of biotech could be great news for trad-tech too.

THQ:

So, the UK government has set the industry a goal of turning post-Brexit Britain into a life sciences superpower. And it’s said it’s investing £650 million to achieve that.

That’s nowhere near enough money, is it?

JCF:

Ouch. Starting off with the softballs, right?

BB:

The thing is, the reality of the overall amount of funding that has been going into the R&D space in the UK in the world of the life sciences over the past 10 years is that it’s down 20% from where it was a decade ago.

So, as far as we’re concerned, any amount of investment is positive. And the hope, of course, is that if the government takes the lead and starts putting money into the sector, then private investment will follow. So, it’s positive news that we’re moving in that direction.

Added to that, pension funds are going to be allowed to put up to 5% of their assets under management into alternatives, which means private equity and venture capital. That’s a great start. Sure, it’s been a long time coming, but that 5% represents £75 billion of new funding that will go into the innovation ecosystem. So those are positive steps forward.

THQ:

So it’s a starter fund, rather than, say, a college fund?

JCF:

Yeah, I think so. It’s a good first step, and I don’t think the government said that that’s all it’ll put in, ever. This is where we start, and hopefully the ecosystem builds around it. And to Bob’s point, often following public capital is private capital, which we need more of in the ecosystem as well. So it will definitely help to prime the pump for the next generation of innovative companies coming out of the UK.

Why we need technological development.

THQ:

You’re both advocating for a significant amount of technological development to empower biotech and bioscience.

Hit us with the logic. Why do we need a highly developed technology culture in order to build a highly developed science culture?

Technological development allows breakthroughs to reach patients faster.

Curing the sick and healing the blind. Sounds vaguely familiar…

BB:

The thing that is really important to understand about this is that the level of technology that’s been invested in the world of science, historically, has been woefully low. The reality is that the tools and the technology that sit beside the level of innovation that’s happening from a scientific perspective are woefully out of balance.

THQ:

So we’re not just getting ChatGPT and telling it to cure bowel cancer or the like?

BB:

Ha. Before we can start thinking about how organizations can really start unlocking the potential of something like AI or machine learning, there are a lot of fundamentals that they’ve got to start getting right, which historically, they haven’t done a great job of.

If you think about the way that scientists in a laboratory have gone about their jobs over the past 10, 20, 30 years, it’s often been with a pencil and a piece of paper.

That means the data they’re collecting has historically been incredibly siloed.

THQ:

It is difficult to think of a more siloed database than a scientifically notated Post-It note, to be sure.

Primitive data tools need replacing through technological development.

Note to self: cure horrible diseases.

BB:

They can’t collaborate on their data, or share that data, or even gather all the data together, let alone gain any meaningful insights from it. So, establishing a baseline platform of data and having clear datasets across an organization is really key. Also, having it in a structured form that then sits in a logical database that you can unleash some AI and machine learning on is critical.

Those are all things that historically have not happened in the world of research and development. Those are the sort of challenges we’re facing here.

Technological development to at least business standard.

THQ:

So when you say we need to invest in technological development, that’s not some super-duper hyper-tech demand, that’s to get scientific research for the most part up to the level that multinational companies use as routine, day in and day out?

BB:

Yep. We’d call it delivering modern software for modern science with a true scientific cloud environment. That would be 100% cloud-based, helping organizations overcome these historic challenges.

JCF:

I can give you an example of the difference that would make if you like.

THQ:

Please – we’re all about the cogent examples.

JCF:

At Ori, we work in cell and gene therapy, which is a whole new modality that’s bringing cures for cancer and rare diseases to patients. So, it’s an incredibly important and interesting area to work in.

But we’re having major access problems. That means effectively, we have cures for cancers that patients can’t get access to, because they can’t be made, they can’t be manufactured, they’re too expensive. And a lot of that comes back to the processes that we start with in the lab when we develop these products.

THQ:

We’re sorry. You have cures for what-now?

JCF:

It’s often on paper, as Bob said. We can’t learn as quickly as we otherwise might about what works, what doesn’t work, we can’t get products into clinical trials fast enough, we can’t get them to the patients fast enough.

We think if we had a fully digital system from research all the way through the clinical process, we could shave three years off the drug development timeline for a new cell therapy.

Curing conditions is all down to the time-to-patient. Technological development would speed up the timer.

What would YOU do to get the cure for what was killing you 30% faster?

So, something that now takes maybe 10 or even 12 years to reach patients could do so in 30% less time. So that’s a very obvious use case for us to say there’s a massive benefit to going digital and using these data from the very beginning, rather than working in paper batch records or lab notebooks, which is where much of the industry still is.

And advanced therapies are one of the cornerstones of the UK’s industrial strategy, so this is a major area of focus for us.

But we’re still working as we have done for the last 50 years. So part of the answer is that we have new science that requires new tools and new infrastructure to underpin it. That demands significant technological development to help the science serve the people.

THQ:

Right. You understand we’ll be coming back to that whole “We have cures for cancer that people can’t get access to” thing… right?

Bob Burke and Jason C Foster, both advocates of technological development.

Bob Burke and Jason C Foster.

In Part 2 of this article, you’d better believe we’ll come back to that whole cures for cancer thing, but we’ll also dig deeper into the kind of technological development that could unlock the true speed of biotech and bioscience.

Meta Pixel “scandal” surprises too many taxpayers

The revelation that Meta’s tracking technology, Pixel, was deployed on three US tax return preparation websites has caused shockwaves that have spread as far as the Senate. A report by Democrats urges further investigation to see exactly what information Meta had access to.

In a letter to the IRS and several other peri-governmental organizations, the seven signatories say there has been “a shocking breach of taxpayer privacy by tax prep companies and by Big Tech firms.”

The tax preparation companies installed Pixel code snippets on their websites, allowing them to monitor users' activities while on site. The data is sent to Meta, which correlates it and helps the companies refine their marketing and improve their sites.

It's worth noting that the tax-preparation companies also ran similar code snippets from Google, which has denied tracking users.
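
For readers who have never seen one of these snippets, the sketch below shows their general shape. It is illustrative only – not Meta's actual Pixel code – and the endpoint, identifier, and payload fields are hypothetical, but the pattern of gathering page context and beaconing it to a third-party server is typical of this class of tracker.

```typescript
// Illustrative sketch only – NOT Meta's actual Pixel code. The endpoint,
// pixel ID, and payload fields are hypothetical, but the overall pattern
// (collect page context, beacon it to a third-party server) is typical.
const PIXEL_ID = "1234567890"; // hypothetical site identifier

function trackPageView(): void {
  const payload = {
    pixelId: PIXEL_ID,
    url: window.location.href,   // which page the user is viewing
    referrer: document.referrer, // where they came from
    title: document.title,
    timestamp: Date.now(),
  };
  // sendBeacon delivers data even as the page unloads, which is why
  // trackers favor it over ordinary fetch/XHR calls
  navigator.sendBeacon(
    "https://tracker.example.com/collect", // hypothetical collection endpoint
    JSON.stringify(payload)
  );
}

trackPageView();
```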

Meta Pixel’s tracking

What's most surprising about the “revelations” is that so many people are surprised. Tracking end-users is commonplace, bordering on ubiquitous, on the modern ‘web. Whether using a browser or a mobile app, internet users are constantly tracked through Meta's Pixel, Google Analytics cookies, or any of the thousands of other tracking methods in circulation.

On this author’s smartphone, for example, there have been 29,313 tracking attempts recorded in the last week. A tracking attempt typically comprises third-party software installed in an app (or website) attempting to “phone home” with data such as location, network, phone ID, ZIP code, email address, contacts lists, and many more juicy digital tidbits.

Mastodon reactions to Meta Pixel scandals in the NHS

Source: Fosstodon.org

That situation has led to the emergence of many ad-blocking, anti-tracking, and anti-fingerprinting tools, including browser add-ons such as uBlock Origin, Privacy Badger, and NoScript. A constant game of cat-and-mouse plays out between digital advertisers and anti-tracker software developers, with new methods of fingerprinting users springing up as quickly as countermeasures can be deployed.
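
At their simplest, those tools work by matching outgoing request URLs against blocklists of known tracker domains. The sketch below is a bare-bones version of that idea; real extensions such as uBlock Origin use a far richer filter syntax, and the two domains listed are merely examples.

```typescript
// Minimal sketch of blocklist-based request filtering, the core idea behind
// many anti-tracking tools. Real blocklists contain tens of thousands of
// entries and support wildcard and cosmetic rules; this is simplified.
const BLOCKLIST = ["connect.facebook.net", "google-analytics.com"];

function shouldBlock(requestUrl: string): boolean {
  const host = new URL(requestUrl).hostname;
  // Block if the request host is a listed tracker domain or a subdomain of one
  return BLOCKLIST.some((d) => host === d || host.endsWith("." + d));
}

console.log(shouldBlock("https://connect.facebook.net/en_US/fbevents.js")); // true
console.log(shouldBlock("https://example.com/app.js"));                     // false
```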

Tracking technology is deliberately simple to deploy on an organization’s internet real estate (websites and apps) and is often free to use. Data is collected by the third party and can be used for its own purposes. In addition to Meta, there’s Google, Adobe, OneSignal, Microsoft, Urban Airship, Criteo, Amazon, Index Exchange, Bing, Improve Digital, Adform, Yahoo, Twitter, Zemanta, Yieldlab, et al. ad nauseam – there are literally thousands of companies offering tracking methods.

Meta Pixel in black and white

The small print of the Pixel documentation does note that some deployers may need to be wary of GDPR legislation, and that those wishing to collect data from iOS devices may struggle, since Apple has shut down default tracking capabilities in apps distributed through its App Store.

Small print from Meta Pixel documentation.

Source: Meta
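
One common response to that small print is to gate trackers behind explicit user consent, as GDPR demands. The snippet below is a minimal, hypothetical sketch of the pattern; production sites typically rely on a consent-management platform rather than hand-rolled code, and the storage key and tracker URL shown are invented for illustration.

```typescript
// Hypothetical consent-gating sketch: only inject a third-party tracker
// once the user has opted in. The "consent" storage key and the tracker
// URL are invented; real sites use a consent-management platform (CMP).
type Consent = { analytics: boolean; marketing: boolean };

function getStoredConsent(): Consent {
  // In practice a CMP banner would have written this value
  const raw = localStorage.getItem("consent");
  return raw ? (JSON.parse(raw) as Consent) : { analytics: false, marketing: false };
}

function loadTracker(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  document.head.appendChild(script);
}

if (getStoredConsent().marketing) {
  loadTracker("https://tracker.example.com/pixel.js"); // hypothetical tracker
}
```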

The horror in the tone of the Democratic lawmakers' letter to the IRS and its watchdog exhibits the kind of naivete that is all too prevalent. Similar tones of outrage surface whenever journalists “discover” that TikTok (owned by ByteDance) allows Chinese employees of a Chinese company to access American and Australian citizens' data.

News item on TikTok saying it allows Chinese people to see Americans' data

Source: BuzzFeed

Australian users' data sent to China.

Source: The Guardian

The truth is that any company deploying tracking technology for whatever reason on its website or in its apps is sending data to the company that supplies the tracking code. If an organization works in any area where privacy is important to its users, it must know that its real estate is handing information to a third party.

Although companies may only be interested in their customers' journeys around their websites, the data collected by the tracking technology company may not be so limited. Similarly, those signing up for the “free” tiers of user tracking may get only limited metrics (until they start to pay, of course). But the tracking company – you can be sure – will absorb all the information it can.

That the third party receiving the data may be in Beijing or San Francisco is irrelevant. Companies need to know that using off-the-shelf tracking technology supplied by a third party spills their information to that third party. Whether that's a fair trade for internal marketing insights is highly debatable.

The post Meta Pixel “scandal” surprises too many tax-payers appeared first on TechHQ.

]]>
IBM to use in-house AI chip to reduce costs of Watsonx https://techhq.com/2023/07/why-ibm-using-in-house-ai-chip-for-watsonx/ Wed, 12 Jul 2023 16:48:46 +0000 https://techhq.com/?p=226212

• IBM to use its own in-house AI chip for Watsonx. • The company hopes to tap a market in the age of generative AI. • Using an in-house AI chip could save IBM crucial expenditure on the project. It’s been a while since IBM has actively touted Watson — the tech giant’s early splash... Read more »

The post IBM to use in-house AI chip to reduce costs of Watsonx appeared first on TechHQ.

]]>

• IBM to use its own in-house AI chip for Watsonx.
• The company hopes to tap a market in the age of generative AI.
• Using an in-house AI chip could save IBM crucial expenditure on the project.

It's been a while since IBM actively touted Watson — the tech giant's early splash in artificial intelligence, which never amounted to a profitable offering.

Today, Watson has given way to Watsonx, IBM's new enterprise-ready AI and data platform, indicating that Big Blue is back in the game, this time trying to ride the latest boom in AI. But to avoid past mistakes in cost management, IBM plans to utilize its own in-house AI chip.

IBM is billing the platform, Watsonx, as a development studio for companies to “train, tune and deploy” machine-learning models. The platform includes a feature for AI-generated code, an AI governance toolkit, and a library of thousands of large-scale AI models trained on language, geospatial data, IT events, and code, according to a release.

The tech giant first unveiled the platform in May this year, and on July 11 IBM revealed that it had been shaped by more than 150 users across industries participating in its beta and tech preview programs.

Clients using Watsonx. Source: IBM

“Previewed at IBM THINK in May, Watsonx comprises three products to help organizations accelerate and scale AI – the watsonx.ai studio for new foundation models, generative AI, and machine learning; the watsonx.data fit-for-purpose data store, built on an open lakehouse architecture; and the watsonx.governance toolkit to help enable AI workflows to be built with responsibility, transparency, and explainability (coming later this year),” IBM said in a statement.

Simply put, Watsonx allows clients and partners to specialize and deploy models for various enterprise use cases, or build their own. The models, according to the tech giant, are pre-trained to support a range of natural language processing (NLP) tasks, including question-answering, content generation and summarization, text classification, and extraction.

Déjà vu?

In February 2011, the world was introduced to Watson, IBM’s cognitive computing system that defeated Ken Jennings and Brad Rutter in a game show called Jeopardy! It was the first widely seen demonstration of cognitive computing, and Watson’s ability to answer subtle, complex, pun-laden questions made clear that a new era of computing was at hand. 

On the back of that very public success, in 2011, IBM turned Watson toward one of the most lucrative but untapped industries for AI: healthcare. Over the next decade, what followed was a series of ups and downs – but primarily downs – that exemplified both the promise and the numerous shortcomings of applying AI to healthcare. The Watson Health odyssey finally ended in 2022, when the business was sold off “for parts.”

In retrospect, IBM described Watson as a learning journey for the company. “There have been wrong turns and setbacks,” IBM says, “but that comes with trying to commercialize pioneering technology.”

Fast forward to today: more than a decade after Watson failed to gain market traction, IBM is hoping to take advantage of the boom in generative AI technologies that can write human-like text. Mukesh Khare, general manager of IBM Semiconductors, told Reuters recently that one of the old Watson system's barriers was high cost – something IBM hopes to address this time around.

Khare said using its own chips could lower cloud service costs because they are power efficient. 

On the AI chip

IBM Research AI Hardware Center created a specialized computer chip for AI – calling it an Artificial Intelligence Unit, or AIU. Source: IBM

IBM announced the chip’s existence in October 2022, but did not disclose the manufacturer or how it would be used. “It’s our first complete system-on-chip designed to run and train deep learning models faster and more efficiently than a general-purpose CPU,” IBM said in a release dated October 13, 2022. 

IBM also shared that, for the last decade, it had run deep learning models on CPUs and GPUs — graphics processors designed to render images for video games — when what it really needed was a chip optimized for the types of matrix and vector multiplication operations used in deep learning. “At IBM, we’ve spent the last five years figuring out how to design a chip customized for the statistics of modern AI,” it said.

IBM's point was that AI models are growing exponentially, but the hardware to train these behemoths and run them on servers in the cloud or on edge devices like smartphones and sensors hasn't advanced as quickly. “That’s why the IBM Research AI Hardware Center created a specialized computer chip for AI. We’re calling it an Artificial Intelligence Unit, or AIU,” the tech giant said.

The workhorses of traditional computing — standard chips known as CPUs — were designed before the revolution in deep learning, a form of machine learning that makes predictions based on statistical patterns in big data sets. “CPUs’ flexibility and high precision suit general-purpose software applications. But those winning qualities put them at a disadvantage when training and running deep learning models, which require massively parallel AI operations,” IBM added.
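
A toy example makes the parallelism point concrete. In the sketch below – plain TypeScript, purely illustrative and unrelated to IBM's actual chip design – every output element of a matrix-vector product is an independent dot product: work that a CPU performs largely one element at a time, but that specialized hardware can spread across many parallel compute units.

```typescript
// Each row's dot product depends on no other row, so all of them could in
// principle be computed simultaneously on parallel hardware. A CPU, by
// contrast, works through them (and their multiply-adds) mostly serially.
function matVec(m: number[][], v: number[]): number[] {
  return m.map((row) => row.reduce((acc, x, i) => acc + x * v[i], 0));
}

console.log(matVec([[1, 2], [3, 4]], [5, 6])); // [17, 39]
```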

IBM is hoping to use its own AI chip.

Can the in-house AI chip gamble pay dividends for IBM?

The tech giant's AI chip uses a range of smaller bit formats, including both floating-point and integer representations, to make running an AI model far less memory-intensive. “We leverage key IBM breakthroughs from the last five years to find the best trade-off between speed and accuracy,” the company said.
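
Back-of-the-envelope arithmetic shows why smaller bit formats matter. The sketch below uses an arbitrary 7-billion-parameter model – an assumption for illustration, not an IBM figure – to show how the memory needed for model weights scales directly with bytes per parameter.

```typescript
// Rough memory math for model weights at different precisions. The model
// size is an arbitrary example, not a figure from IBM.
const PARAMS = 7e9; // hypothetical 7-billion-parameter model

const bytesPerParam: Record<string, number> = {
  fp32: 4, // standard 32-bit floating point
  fp16: 2, // 16-bit floating point
  int8: 1, // 8-bit integer (quantized)
};

for (const [format, bytes] of Object.entries(bytesPerParam)) {
  const gib = (PARAMS * bytes) / 1024 ** 3;
  console.log(`${format}: ${gib.toFixed(1)} GiB of weights`);
}
// fp32: 26.1 GiB, fp16: 13.0 GiB, int8: 6.5 GiB – each halving of precision
// halves the data the chip has to store and move.
```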

So, by leveraging its own chips, IBM aims to improve cost efficiency, which could make its cloud service more competitive in the market. Khare also told Reuters that IBM has collaborated with Samsung Electronics on semiconductor research and has selected it to manufacture those AI chips.

The approach is similar to that taken by other tech giants like Google and Amazon: by developing proprietary chips, IBM can differentiate its cloud computing service. But Khare said IBM was not trying to design a direct replacement for semiconductors from Nvidia, whose chips lead the market in training AI systems with vast amounts of data.

The post IBM to use in-house AI chip to reduce costs of Watsonx appeared first on TechHQ.

]]>
Business intelligence enters the age of the smart summary https://techhq.com/2023/07/business-intelligence-enters-the-age-of-the-smart-summary/ Tue, 11 Jul 2023 17:17:00 +0000 https://techhq.com/?p=226190

Interest in artificial intelligence (AI) is rocketing. And, as business editors at the New York Times pointed out recently, it’s not just tech companies that are talking about AI. Large language models (LLMs) fed with industry-specific data provide incredible search powers for companies looking to move ahead of their competitors – including new smart summary... Read more »

The post Business intelligence enters the age of the smart summary appeared first on TechHQ.

]]>

Interest in artificial intelligence (AI) is rocketing. And, as business editors at the New York Times pointed out recently, it’s not just tech companies that are talking about AI. Large language models (LLMs) fed with industry-specific data provide incredible search powers for companies looking to move ahead of their competitors – including new smart summary features.

Imagine a vast library where you can automatically retrieve not just the book you’re looking for, but the exact phrase, together with other supporting facts and figures. And that’s just the tip of the enterprise AI iceberg. Generative AI tools give companies the edge by digesting mind-blowing amounts of data and distilling all of that market intelligence into a smart summary that’s both insightful and time-saving.


Information is gold in investment circles. And a rising star in providing market analysis is AlphaSense. The US-headquartered firm – which has offices in London, Germany, Finland, and India – delivers insights from what it describes as ‘an extensive universe of public and private content—including company filings, event transcripts, news, trade journals, and equity research’.

For example, by analyzing data from more than 9,000 publicly listed firms, which regularly host investor calls, AlphaSense determined that AI was mentioned twice as frequently in the first quarter of 2023 as in the last quarter of 2022. And its enterprise AI tooling is helping the market intelligence provider go head-to-head with business analysis heavyweights such as Bloomberg.

In fact, it's telling that Bloomberg has just announced BloombergGPT – a custom LLM trained on a corpus of more than 700 billion tokens, roughly half of which is curated financial data. The training data is equivalent to hundreds of millions of pages of text, and Google's Bard notes that a dataset of 700 billion tokens would be ‘a very valuable dataset for training LLMs’.

The financial portion of BloombergGPT's training data – dubbed FinPile – consists of a range of English financial documents, including news, filings, press releases, web-scraped financial documents, and social media drawn from the Bloomberg archives.

Company filings – data that AlphaSense and other analysis providers also mine for market insight – represent 14 billion tokens (roughly 10 billion words, using the common rule of thumb of about 0.75 words per token) in BloombergGPT. And it's worth noting that financial statements prepared by public companies, such as annual 10-K filings or quarterly 10-Q reports, are long PDFs that provide rich pickings for smart summary generators, as we'll highlight shortly.
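
To put those token counts into more familiar units, here is a rough conversion. The 0.75-words-per-token and 500-words-per-page ratios are common rules of thumb, not figures from the BloombergGPT paper.

```typescript
// Rough conversion from tokens to words and pages, using common rules of
// thumb (~0.75 words per token, ~500 words per page) – approximations only.
const TOKENS = 14e9; // the company-filings share of BloombergGPT's training data
const WORDS_PER_TOKEN = 0.75;
const WORDS_PER_PAGE = 500;

const words = TOKENS * WORDS_PER_TOKEN; // ~10.5 billion words
const pages = words / WORDS_PER_PAGE;   // ~21 million pages
console.log(`${words.toExponential(2)} words ≈ ${pages.toExponential(2)} pages`);
```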

General LLMs – for example, OpenAI's GPT-4, Google's PaLM, and the open-source Falcon-40B – are trained on data scraped from the web. And while they do include technical content from scientific research repositories and the US Patent and Trademark Office (USPTO), they haven't been built to be domain-specific.

Falcon’s LLM team, based at the Technology Innovation Institute in the UAE, reports that filtering and deduplicating web data at very large scale – a pipeline that it dubs MacroData refinement – can produce LLMs capable of outperforming versions trained on curated corpora. But the power of having an LLM trained using domain-specific data can be seen by viewing the test results of BloombergGPT.

In four out of five tests, the domain-specific LLM came out on top. And on the one occasion when it was ranked second, the performance advantage of the winning LLM (the open-source GPT-NeoX, developed by EleutherAI) was slight. Training generative AI models on a refined diet of industry-specific data opens the door to superior smart summary performance.

In June, AlphaSense introduced AI-generated summarizations of key events in earnings calls to dramatically speed up the workflow for fund managers and other analysts keeping an eye on company performance.

The finance sector has long used AI and machine learning to try to spot patterns that would otherwise remain hidden. Natural language processing has been deployed for years to perform sentiment analysis on CEO statements and other company reports, to determine how optimistic firms are about their future.

But generative AI straps a huge word engine onto the process and races ahead of what's been done before. For example, tools don't just put a number on the positives and the negatives; they can sort the smart summary into bulleted columns depending on whether each item represents good news or bad.
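
As a sketch of how such a tool might work under the hood, the snippet below sends a bucketing prompt to a generic LLM endpoint. The URL, request shape, and response field are hypothetical – vendors such as AlphaSense use their own models and APIs – but the prompt pattern (summarize, then sort items into good and bad news) captures the core idea.

```typescript
// Hypothetical smart-summary call. The endpoint and response shape are
// invented for illustration; only the prompt pattern is the point here.
const PROMPT_TEMPLATE = `Summarize the following earnings-call transcript as
bullet points, sorted into two columns: "Good news" and "Bad news".

Transcript:
{TRANSCRIPT_TEXT}`;

async function smartSummary(transcript: string): Promise<string> {
  const res = await fetch("https://llm.example.com/v1/complete", { // hypothetical
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt: PROMPT_TEMPLATE.replace("{TRANSCRIPT_TEXT}", transcript),
    }),
  });
  const data = await res.json();
  return data.text; // hypothetical response field
}
```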

Putting smart summaries to the test using Google Bard.

Although generative AI algorithms trained with domain-specific market data have the edge over general LLMs, readily available tools are still impressive.

Asking Google Bard for its thoughts on the sentiment of the article you are reading now (including a request to add emojis for extra sparkle – see image below) produces the following output:

Sentiment:

  • Overall, the sentiment of the article is positive. The author is excited about the potential of LLMs to revolutionize the way financial data is analyzed.
  • There are a few instances of negative sentiment, such as when the author mentions that general LLMs are not as good as domain-specific LLMs at generating smart summaries.

Emoji sentiment summary: advanced chatbots continue to dazzle.

Security guardrails and air-gapped LLMs

However, before you rush to ask LLMs for a smart summary, make sure you understand where the data you submit goes. Many companies have restricted employees from using generative AI tools, such as OpenAI’s publicly available ChatGPT service, to ring-fence their business data.

The UK’s National Cyber Security Centre (NCSC) advised earlier this year that users should pay attention to the terms of use and privacy policy of generative AI services before asking advanced chatbots to answer sensitive questions. Text prompts alone can give much away, should anyone be able to view that information.

Recognizing that companies may want to guard their business operations closely, developers such as Yurts.ai – based in San Francisco, US – are offering air-gapped LLMs to provide clients with maximum security.

“We have seen an explosion of interest in generative AI for enterprise use, but most C-suites have genuine and rightful concerns about security and privacy,” said Ben Van Roo, CEO and Co-founder of Yurts.ai. “Our platform can be embedded within an enterprise and give companies private and secure access to generative AI-based assistants for writing, chat, and search.”

There are other options too. For example, on TechHQ we’ve written about how machine learning can work on data sets in a highly secure sandbox thanks to solutions such as BlindAI cloud.

Benefits beyond finance

The ability to generate a smart summary of vast amounts of data, automatically, in seconds, benefits not just the financial sector, but organizations of all kinds. Governments are taking a keen interest in measuring happiness to better allocate funding – rather than relying solely on conventional indicators that may not tell the whole story.

Back in 2020, before the current boom in LLMs, researchers showed that AI could be useful in understanding what makes us angry, happy, or sad – as reported by the World Economic Forum. And this is just one example of how valuable smart summaries could turn out to be, not just to firms, but more broadly.

The post Business intelligence enters the age of the smart summary appeared first on TechHQ.

]]>