
Infrastructure Laundering: Blending in with the Cloud

30 January 2025 at 12:10

Image: Shutterstock, ArtHead.

In an effort to blend in and make their malicious traffic tougher to block, hosting firms catering to cybercriminals in China and Russia increasingly are funneling their operations through major U.S. cloud providers. Research published this week on one such outfit — a sprawling network tied to Chinese organized crime gangs and aptly named “Funnull” — highlights a persistent whac-a-mole problem facing cloud services.

In October 2024, the security firm Silent Push published a lengthy analysis of how Amazon AWS and Microsoft Azure were providing services to Funnull, a two-year-old Chinese content delivery network that hosts a wide variety of fake trading apps, pig butchering scams, gambling websites, and retail phishing pages.

Funnull made headlines last summer after it acquired the domain name polyfill[.]io, previously the home of a widely-used open source code library that allowed older browsers to handle advanced functions that weren’t natively supported. There were still tens of thousands of legitimate domains linking to the Polyfill domain at the time of its acquisition, and Funnull soon after conducted a supply-chain attack that redirected visitors to malicious sites.

Silent Push’s October 2024 report found a vast number of domains hosted via Funnull promoting gambling sites that bear the logo of the Suncity Group, a Chinese entity named in a 2024 UN report (PDF) for laundering millions of dollars for the North Korean Lazarus Group.

In 2023, Suncity’s CEO was sentenced to 18 years in prison on charges of fraud, illegal gambling, and “triad offenses,” i.e. working with Chinese transnational organized crime syndicates. Suncity is alleged to have built an underground banking system that laundered billions of dollars for criminals.

It is likely the gambling sites coming through Funnull are abusing top casino brands as part of their money laundering schemes. In reporting on Silent Push’s October report, TechCrunch obtained a comment from Bwin, one of the casinos being advertised en masse through Funnull, and Bwin said those websites did not belong to them.

Gambling is illegal in China except in Macau, a special administrative region of China. Silent Push researchers say Funnull may be helping online gamblers in China evade the Communist party’s “Great Firewall,” which blocks access to gambling destinations.

Silent Push’s Zach Edwards said that upon revisiting Funnull’s infrastructure this month, they found dozens of the same Amazon and Microsoft cloud Internet addresses still forwarding Funnull traffic through a dizzying chain of auto-generated domain names before redirecting to malicious or phishing websites.
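
Spotting those throwaway hostnames is tractable in principle, because auto-generated labels tend to look statistically random. The following sketch is a generic illustration of that idea, not Silent Push's actual methodology; the entropy threshold and example domains are invented for demonstration.

```python
import math
from collections import Counter

def label_entropy(label: str) -> float:
    """Shannon entropy (bits per character) of a hostname label."""
    counts = Counter(label)
    total = len(label)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_auto_generated(hostname: str, threshold: float = 3.5) -> bool:
    # Auto-generated labels tend toward near-uniform character
    # distributions (high entropy); human-chosen names score lower.
    label = hostname.split(".")[0]
    return len(label) >= 10 and label_entropy(label) >= threshold

# Hypothetical examples:
print(looks_auto_generated("www.example.com"))                  # human-chosen
print(looks_auto_generated("x7k2q9fmz81vbt4.example-cdn.net"))  # DGA-like
```

Real detection pipelines combine a signal like this with registration dates, hosting churn, and passive DNS data rather than relying on entropy alone.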

Edwards said Funnull is a textbook example of an increasing trend Silent Push calls “infrastructure laundering,” wherein crooks selling cybercrime services will relay some or all of their malicious traffic through U.S. cloud providers.

“It’s crucial for global hosting companies based in the West to wake up to the fact that extremely low quality and suspicious web hosts based out of China are deliberately renting IP space from multiple companies and then mapping those IPs to their criminal client websites,” Edwards told KrebsOnSecurity. “We need these major hosts to create internal policies so that if they are renting IP space to one entity, who further rents it to host numerous criminal websites, all of those IPs should be reclaimed and the CDN who purchased them should be banned from future IP rentals or purchases.”

A Suncity gambling site promoted via Funnull. The sites feature a prompt for a Tether/USDT deposit program.

Reached for comment, Amazon referred this reporter to a statement Silent Push included in a report released today. Amazon said AWS was already aware of the Funnull addresses tracked by Silent Push, and that it had suspended all known accounts linked to the activity.

Amazon said that contrary to implications in the Silent Push report, it has every reason to aggressively police its network against this activity, noting the accounts tied to Funnull used “fraudulent methods to temporarily acquire infrastructure, for which it never pays. Thus, AWS incurs damages as a result of the abusive activity.”

“When AWS’s automated or manual systems detect potential abuse, or when we receive reports of potential abuse, we act quickly to investigate and take action to stop any prohibited activity,” Amazon’s statement continues. “In the event anyone suspects that AWS resources are being used for abusive activity, we encourage them to report it to AWS Trust & Safety using the report abuse form. In this case, the authors of the report never notified AWS of the findings of their research via our easy-to-find security and abuse reporting channels. Instead, AWS first learned of their research from a journalist to whom the researchers had provided a draft.”

Microsoft likewise said it takes such abuse seriously, and encouraged others to report suspicious activity found on its network.

“We are committed to protecting our customers against this kind of activity and actively enforce acceptable use policies when violations are detected,” Microsoft said in a written statement. “We encourage reporting suspicious activity to Microsoft so we can investigate and take appropriate actions.”

Richard Hummel is threat intelligence lead at NETSCOUT. Hummel said it used to be that “noisy” and frequently disruptive malicious traffic — such as automated application layer attacks, and “brute force” efforts to crack passwords or find vulnerabilities in websites — came mostly from botnets, or large collections of hacked devices.

But he said the vast majority of the infrastructure used to funnel this type of traffic is now proxied through major cloud providers, which can make it difficult for organizations to block at the network level.

“From a defender’s point of view, you can’t wholesale block cloud providers, because a single IP can host thousands or tens of thousands of domains,” Hummel said.

In May 2024, KrebsOnSecurity published a deep dive on Stark Industries Solutions, an ISP that materialized at the start of Russia’s invasion of Ukraine and has been used as a global proxy network that conceals the true source of cyberattacks and disinformation campaigns against enemies of Russia. Experts said much of the malicious traffic traversing Stark’s network (e.g. vulnerability scanning and password brute force attacks) was being bounced through U.S.-based cloud providers.

Stark’s network has been a favorite of the Russian hacktivist group called NoName057(16), which frequently launches huge distributed denial-of-service (DDoS) attacks against a variety of targets seen as opposed to Moscow. Hummel said NoName’s history suggests they are adept at cycling through new cloud provider accounts, making anti-abuse efforts into a game of whac-a-mole.

“It almost doesn’t matter if the cloud provider is on point and takes it down because the bad guys will just spin up a new one,” he said. “Even if they’re only able to use it for an hour, they’ve already done their damage. It’s a really difficult problem.”

Edwards said Amazon declined to specify whether the banned Funnull users were operating using compromised accounts or stolen payment card data, or something else.

“I’m surprised they wanted to lean into ‘We’ve caught this 1,200+ times and have taken these down!’ and yet didn’t connect that each of those IPs was mapped to [the same] Chinese CDN,” he said. “We’re just thankful Amazon confirmed that account mules are being used for this and it isn’t some front-door relationship. We haven’t heard the same thing from Microsoft but it’s very likely that the same thing is happening.”

Funnull wasn’t always a bulletproof hosting network for scam sites. Prior to 2022, the network was known as Anjie CDN, based in the Philippines. One of Anjie’s properties was a website called funnull[.]app. Loading that domain reveals a pop-up message by the original Anjie CDN owner, who said their operations had been seized by an entity known as Fangneng CDN and ACB Group, the parent company of Funnull.

A machine-translated message from the former owner of Anjie CDN, a Chinese content delivery network that is now Funnull.

“After I got into trouble, the company was managed by my family,” the message explains. “Because my family was isolated and helpless, they were persuaded by villains to sell the company. Recently, many companies have contacted my family and threatened them, believing that Fangneng CDN used penetration and mirroring technology through customer domain names to steal member information and financial transactions, and stole customer programs by renting and selling servers. This matter has nothing to do with me and my family. Please contact Fangneng CDN to resolve it.”

In January 2024, the U.S. Department of Commerce issued a proposed rule that would require cloud providers to create a “Customer Identification Program” that includes procedures to collect data sufficient to determine whether each potential customer is a foreign or U.S. person.

According to the law firm Crowell & Moring LLP, the Commerce rule also would require “infrastructure as a service” (IaaS) providers to report knowledge of any transactions with foreign persons that might allow the foreign entity to train a large AI model with potential capabilities that could be used in malicious cyber-enabled activity.

“The proposed rulemaking has garnered global attention, as its cross-border data collection requirements are unprecedented in the cloud computing space,” Crowell wrote. “To the extent the U.S. alone imposes these requirements, there is concern that U.S. IaaS providers could face a competitive disadvantage, as U.S. allies have not yet announced similar foreign customer identification requirements.”

It remains unclear if the new White House administration will push forward with the requirements. The Commerce action was mandated as part of an executive order President Trump issued a day before leaving office in January 2021.

MasterCard DNS Error Went Unnoticed for Years

22 January 2025 at 10:24

The payment card giant MasterCard just fixed a glaring error in its domain name server settings that could have allowed anyone to intercept or divert Internet traffic for the company by registering an unused domain name. The misconfiguration persisted for nearly five years until a security researcher spent $300 to register the domain and prevent it from being grabbed by cybercriminals.

A DNS lookup on the domain az.mastercard.com on Jan. 14, 2025 shows the mistyped domain name a22-65.akam.ne.

From June 30, 2020 until January 14, 2025, one of the core Internet servers that MasterCard uses to direct traffic for portions of the mastercard.com network was misnamed. MasterCard.com relies on five shared Domain Name System (DNS) servers at the Internet infrastructure provider Akamai [DNS acts as a kind of Internet phone book, by translating website names to numeric Internet addresses that are easier for computers to manage].

All of the Akamai DNS server names that MasterCard uses are supposed to end in “akam.net” but one of them was misconfigured to rely on the domain “akam.ne.”
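
A misconfiguration like this is easy to catch mechanically: compare each delegated NS hostname against the provider suffixes you actually expect. A minimal sketch (illustrative, not MasterCard's tooling — the first two server names follow Akamai's naming pattern but are hypothetical; the third is the mistyped entry from the article):

```python
# Flag delegated NS hostnames that don't end in an expected provider
# suffix. "a22-65.akam.ne" is the typo from the article; the other
# two names are hypothetical examples of the correct pattern.
EXPECTED_SUFFIXES = (".akam.net",)

def suspicious_ns(servers, suffixes=EXPECTED_SUFFIXES):
    """Return the NS names that don't match an approved suffix."""
    return [s for s in servers if not s.endswith(suffixes)]

nameservers = ["a1-35.akam.net", "a11-67.akam.net", "a22-65.akam.ne"]
print(suspicious_ns(nameservers))  # flags only the .ne typo
```

Running a check like this against your zone's delegation records on a schedule would have surfaced the bad suffix immediately.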

This tiny but potentially critical typo was discovered recently by Philippe Caturegli, founder of the security consultancy Seralys. Caturegli said he guessed that nobody had yet registered the domain akam.ne, which is under the purview of the top-level domain authority for the West African nation of Niger.

Caturegli said it took $300 and nearly three months of waiting to secure the domain with the registry in Niger. After enabling a DNS server on akam.ne, he noticed hundreds of thousands of DNS requests hitting his server each day from locations around the globe. Apparently, MasterCard wasn’t the only organization that had fat-fingered a DNS entry to include “akam.ne,” but they were by far the largest.

Had he enabled an email server on his new domain akam.ne, Caturegli likely would have received wayward emails directed toward mastercard.com or other affected domains. If he’d abused his access, he probably could have obtained website encryption certificates (SSL/TLS certs) that were authorized to accept and relay web traffic for affected websites. He may even have been able to passively receive Microsoft Windows authentication credentials from employee computers at affected companies.

But the researcher said he didn’t attempt to do any of that. Instead, he alerted MasterCard that the domain was theirs if they wanted it, copying this author on his notifications. A few hours later, MasterCard acknowledged the mistake, but said there was never any real threat to the security of its operations.

“We have looked into the matter and there was not a risk to our systems,” a MasterCard spokesperson wrote. “This typo has now been corrected.”

Meanwhile, Caturegli received a request submitted through Bugcrowd, a program that offers financial rewards and recognition to security researchers who find flaws and work privately with the affected vendor to fix them. The message suggested his public disclosure of the MasterCard DNS error via a post on LinkedIn (after he’d secured the akam.ne domain) was not aligned with ethical security practices, and passed on a request from MasterCard to have the post removed.

MasterCard’s request to Caturegli, a.k.a. “Titon” on infosec.exchange.

Caturegli said while he does have an account on Bugcrowd, he has never submitted anything through the Bugcrowd program, and that he reported this issue directly to MasterCard.

“I did not disclose this issue through Bugcrowd,” Caturegli wrote in reply. “Before making any public disclosure, I ensured that the affected domain was registered to prevent exploitation, mitigating any risk to MasterCard or its customers. This action, which we took at our own expense, demonstrates our commitment to ethical security practices and responsible disclosure.”

Most organizations have at least two authoritative domain name servers, but some handle so many DNS requests that they need to spread the load over additional DNS server domains. In MasterCard’s case, that number is five, so it stands to reason that if an attacker managed to seize control over just one of those domains they would only be able to see about one-fifth of the overall DNS requests coming in.

But Caturegli said the reality is that many Internet users are relying at least to some degree on public traffic forwarders or DNS resolvers like Cloudflare and Google.

“So all we need is for one of these resolvers to query our name server and cache the result,” Caturegli said. By setting their DNS records with a long TTL or “Time To Live” — the value that tells resolvers how long they may cache an answer — an attacker’s poisoned instructions for the target domain can be propagated by large cloud providers.

“With a long TTL, we may reroute a LOT more than just 1/5 of the traffic,” he said.
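
Caturegli's point can be illustrated with a toy simulation: a shared resolver picks one of the five NS names on each cache miss, and a long TTL means a single lucky hit on the attacker's server poisons thousands of subsequent lookups. All numbers here are invented for illustration.

```python
import random

random.seed(1)  # deterministic toy run

NS_COUNT = 5      # five shared NS domains, as in MasterCard's setup
ATTACKER = 0      # attacker controls exactly one of them
LOOKUPS = 10_000  # client queries arriving via one shared resolver

def poisoned_fraction(ttl: int) -> float:
    """Fraction of lookups served a poisoned answer (one lookup per tick)."""
    cache_expiry, cached_poisoned, poisoned = -1, False, 0
    for t in range(LOOKUPS):
        if t >= cache_expiry:  # cache miss: resolver asks a random NS
            cached_poisoned = random.randrange(NS_COUNT) == ATTACKER
            cache_expiry = t + ttl
        poisoned += cached_poisoned
    return poisoned / LOOKUPS

# No caching: the attacker answers about 1/5 of queries, as expected.
print(f"TTL=1:    {poisoned_fraction(1):.0%}")
# Long TTL: one lucky cache fill poisons thousands of later lookups.
print(f"TTL=3600: {poisoned_fraction(3600):.0%}")
```

Conditional on a single poisoned cache fill, far more than one-fifth of the traffic behind that resolver gets rerouted for the lifetime of the cached record.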

The researcher said he’d hoped that the credit card giant might thank him, or at least offer to cover the cost of buying the domain.

“We obviously disagree with this assessment,” Caturegli wrote in a follow-up post on LinkedIn regarding MasterCard’s public statement. “But we’ll let you judge — here are some of the DNS lookups we recorded before reporting the issue.”

Caturegli posted this screenshot of MasterCard domains that were potentially at risk from the misconfigured domain.

As the screenshot above shows, the misconfigured DNS server Caturegli found involved the MasterCard subdomain az.mastercard.com. It is not clear exactly how MasterCard uses this subdomain, but the naming convention suggests it corresponds to production servers at Microsoft’s Azure cloud service. Caturegli said the domains all resolve to Internet addresses at Microsoft.

“Don’t be like Mastercard,” Caturegli concluded in his LinkedIn post. “Don’t dismiss risk, and don’t let your marketing team handle security disclosures.”

One final note: The domain akam.ne has been registered previously — in December 2016 by someone using the email address um-i-delo@yandex.ru. The Russian search giant Yandex reports this user account belongs to an “Ivan I.” from Moscow. Passive DNS records from DomainTools.com show that between 2016 and 2018 the domain was connected to an Internet server in Germany, and that the domain was left to expire in 2018.

This is interesting given a comment on Caturegli’s LinkedIn post from an ex-Cloudflare employee who linked to a report he co-authored on a similar typo domain apparently registered in 2017 for organizations that may have mistyped their AWS DNS server as “awsdns-06.ne” instead of “awsdns-06.net.” DomainTools reports that this typo domain also was registered to a Yandex user (playlotto@yandex.ru), and was hosted at the same German ISP — Team Internet (AS61969).

Top Prescriptive Analytics Tools & Software (2024)

29 May 2024 at 13:40

Unlike descriptive and predictive analytics, which focus on understanding past data and predicting future trends, prescriptive analytics provides actionable recommendations on what steps to take next. Below are nine of the best prescriptive analytics tools to help you forecast the business weather and prepare for the storms ahead.
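
The difference is easiest to see in code. Below is a minimal, self-contained sketch (all demand probabilities and prices are invented): the predictive step is the demand forecast, and the prescriptive step searches over candidate actions and recommends the one that maximizes expected profit.

```python
# Toy prescriptive-analytics example with hypothetical numbers:
# rather than merely reporting past sales (descriptive) or a demand
# forecast (predictive), recommend the stock level to order.
UNIT_COST, UNIT_PRICE = 6.0, 10.0

# Predictive step: forecast of daily demand -> probability
demand_forecast = {8: 0.2, 10: 0.5, 12: 0.3}

def expected_profit(order_qty: int) -> float:
    profit = 0.0
    for demand, p in demand_forecast.items():
        sold = min(order_qty, demand)
        profit += p * (sold * UNIT_PRICE - order_qty * UNIT_COST)
    return profit

# Prescriptive step: evaluate candidate actions, recommend the best.
best = max(range(0, 15), key=expected_profit)
print(f"recommended order quantity: {best}")  # prints 10
```

The tools reviewed below wrap this same idea — optimization over a model of the business — in far richer solvers and interfaces.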

TechnologyAdvice is able to offer our services for free because some vendors may pay us for web traffic or other sales opportunities. Our mission is to help technology buyers make better purchasing decisions, so we provide you with information for all vendors — even those that don’t pay us.

Featured partners

| Product | Score | Best For | Key Differentiator | Pricing | Free Trial/Free Plan |
|---|---|---|---|---|---|
| Alteryx | 4.46 | Best end-user experience | User-friendly interface with drag-and-drop functionality | Starts at $4,950/year | Yes/No |
| Azure Machine Learning | 4.40 | Best data privacy | Advanced security features and integration with Azure services | Pay-as-you-go pricing | Yes/No |
| SAP Integrated Business Planning | 4.32 | Best for supply chain optimization | Real-time supply chain analytics and planning | Subscription-based pricing | Yes/No |
| Looker | 4.30 | Best for data modeling | Strong data modeling capabilities and integration with Google Cloud | Custom pricing | Yes/No |
| Tableau | 4.23 | Best for data visualization | Industry-leading data visualization tools | Starts at $75/user/month | Yes/No |
| Oracle Autonomous Data Warehouse | 4.18 | Best for scalable data management | Elastic scaling and built-in machine learning capabilities | Based on Oracle Cloud services | Yes/No |
| RapidMiner Studio | 4.18 | Best data mining and aggregation | Comprehensive data mining and machine learning tools | Starts at $2,500/year | Yes/Yes |
| IBM Decision Optimization | 4.15 | Best machine learning | Powerful optimization solvers and integration with Watson Studio | Starts at $199/user/month | Yes/No |
| KNIME | 4.11 | Best data science flexibility on a budget | Open-source platform with extensive data integration capabilities | Free for basic version | Yes/Yes |


Alteryx: Best for end-user experience

Overall Score: 4.46/5

  • Pricing: 2.7/5
  • General features and interface: 4.7/5
  • Core features: 4.8/5
  • Advanced features: 5/5
  • Integration and compatibility: 5/5
  • UX: 4.3/5

Pros

  • Intuitive workflow
  • Data blending capabilities
  • Advanced analytics
  • Data visualization

Cons

  • Complex for beginners
  • Limited collaboration features
  • Cost

Why we chose Alteryx

Alteryx’s simple interface helps break down complex data workflows, making data analysis accessible even to non-coders. This feature, coupled with a comprehensive suite of pre-built analytic models and an extensive library of connectors, allows you to derive actionable insights seamlessly. Its Alteryx Academy further enhances its usability and facilitates speedy adoption. The availability of Alteryx Community, a platform for peer support and learning, underlines why it is our top choice for the best end-user experience.

KNIME, another strong contender known for its flexibility and budget-friendly options, still falls short in user experience compared to Alteryx. While KNIME offers powerful data analytics capabilities, its interface can be less intuitive, requiring more technical knowledge to navigate. Alteryx, on the other hand, prioritizes maintaining a user-friendly design, making it easier for users at all technical levels to perform complex analytics tasks without extensive training. 

Alteryx is your go-to platform for simplifying complex data workflows. Its intuitive drag-and-drop interface makes tasks like data blending, cleansing, and analysis accessible to both technical and non-technical users. 

For small businesses up to large enterprises, Alteryx empowers your analysts with advanced analytics capabilities without requiring extensive coding knowledge. You can take advantage of a wide range of data sources and benefit from extensive training resources through Alteryx Academy. For automating repetitive tasks and boosting your data analysis capabilities, Alteryx offers a powerful and user-friendly solution.

Self-service data analytics: Quick and precise insights delivery with an end-to-end platform for data discovery, blending, and analysis.

Drag-and-drop workflow: Easy creation and alteration of analytical workflows through an intuitive user interface.

Predictive analytics: With more than 60 pre-built tools, Alteryx allows the harnessing of advanced analytics for spatial and statistical analysis and predictive modeling without any coding required.

Data connectors: Native data connectors to numerous sources such as SQL, Oracle, Excel, and Access, with support for cloud-based data from AWS, Google Analytics, Salesforce, and more.


Free trial available

Designer Cloud: Starting at $4,950/user/year

Designer Desktop: $5,195/user/year


Azure Machine Learning: Best for data privacy

Overall Score: 4.40/5

  • Pricing: 2.5/5
  • General features and interface: 4.5/5
  • Core features: 5/5
  • Advanced features: 5/5
  • Integration and compatibility: 5/5
  • UX: 4.5/5

Pros

  • Top-notch security
  • Built-in privacy features
  • Enterprise-level control

Cons

  • Dependency on Microsoft ecosystem
  • Limitations in free tier

Why we chose Azure Machine Learning

As part of the Azure environment, Azure Machine Learning benefits from all the security features used to protect the cloud service at large. Similar to how Office 365 enables increased controls regarding access privileges, data storage and sharing, and identity management, Azure Machine Learning ensures the safeguarding of connected data pipelines and workflows. Its built-in security measures include advanced threat protection, encryption at rest and in transit, and comprehensive compliance certifications, providing a robust framework for data privacy.

When compared to Oracle Autonomous Data Warehouse, another strong contender known for its security features, Azure Machine Learning stands out particularly in the realm of integrated data privacy. Oracle provides excellent data security and compliance, but Azure’s extensive suite of security tools and seamless integration with other Microsoft services offer a more comprehensive approach to data privacy. Azure’s identity management and access controls, along with its ability to monitor and respond to threats in real-time, give users a higher level of confidence in the protection of their data. 

Azure Machine Learning, part of the Microsoft Azure ecosystem, offers a secure and scalable platform for developing and deploying machine learning models.

It integrates with various Azure services and supports multiple development environments, providing you with flexibility in model building and deployment. Advanced tools for automated machine learning, data labeling, and model interpretability make Azure Machine Learning comprehensive for your AI projects. If your enterprise prioritizes data privacy and needs a reliable environment for developing sophisticated machine learning applications, this platform is an excellent choice.

Enterprise-grade MLOps: Build, deploy, and manage machine learning models efficiently at scale, fostering robust operationalization and lifecycle management of your models.

Automated machine learning: Makes the selection and tuning of machine learning models hassle-free, increasing productivity and reducing the possibility of errors.

On-premises, multi-cloud, and at-the-edge deployment: Flexibility to deploy your machine learning models wherever you need them.

Explainability and fairness of models: Includes built-in features for model interpretability and fairness.

Security and compliance: Provides advanced security controls and privacy-preserving features, including differential privacy and confidential computing.

Integrated notebooks: Offers Jupyter Notebooks as part of the service.

Studio

  • Free plan: Available, no Azure subscription required.
  • Standard: $9.99/user/month plus $1 per studio experimentation hour; Azure subscription required. Unlimited modules and storage, experiments can last up to 7 days, with a maximum of 24 hours per module.

Production Web API

  • Web API Dev/Test: Free. Includes 1000 transactions per month, 2 compute hours, and 2 associated web services.
  • Web API Standard S1: $100.13/user/month, includes 100,000 transactions and 25 compute hours per month. Overage rates are $0.50 per 1,000 transactions and $2 per API compute hour.
  • Web API Standard S2: $1,000.06/user/month; includes 2,000,000 transactions and 500 compute hours per month. Overage rates are $0.25 per 1,000 transactions and $1.50 per API compute hour.
  • Web API Standard S3: $9,999.98/user/month; includes 50,000,000 transactions and 12,500 compute hours per month. Overage rates are $0.10 per 1,000 transactions and $1 per API compute hour.

SAP Integrated Business Planning: Best for supply chain optimization

Overall Score: 4.32/5

  • Pricing: 2.9/5
  • General features and interface: 4.5/5
  • Core features: 5/5
  • Advanced features: 5/5
  • Integration and compatibility: 4.3/5
  • UX: 4.2/5

Pros

  • Immediate insights from live data integration
  • Scenario planning
  • Short-term demand sensing for accuracy
  • Single unified data model
  • Supply chain control tower
  • Strong ERP integration

Cons

  • High implementation cost
  • Complex integration

Why we chose SAP Integrated Business Planning

With the full might of SAP’s suite behind it, you can ensure seamless data flow and consistency across business processes. This makes SAP IBP particularly effective for organizations looking to optimize their supply chain operations comprehensively and efficiently.

SAP IBP integrates key planning processes, including demand sensing, inventory optimization, and sales and operations planning, into a single unified platform. 

SAP IBP provides end-to-end supply chain visibility and advanced predictive analytics tailored specifically for supply chain management. While Oracle focuses on data management and processing, SAP IBP offers specialized modules for supply chain operations, including demand-driven replenishment and supply chain control tower capabilities, which are not as deeply embedded in Oracle’s offering.

SAP Integrated Business Planning (IBP) offers a comprehensive solution for managing your supply chain, providing advanced tools for demand planning, inventory optimization, and sales and operations planning. 

It processes real-time data and uses predictive analytics to deliver accurate forecasts and scenario planning. SAP IBP’s collaboration features facilitate coordination across your business units, improving overall supply chain efficiency. If you are seeking to optimize your supply chain operations with extensive customization options and scalability, SAP IBP meets the needs of businesses of all sizes, enhancing your supply chain performance through data-driven insights.


Scenario planning: Enables users to perform ‘what-if’ analysis to predict the impact of different scenarios.

Demand sensing: Utilizes short-term historical data to improve forecast accuracy.

Supply chain control tower: Offers end-to-end visibility and monitoring of the entire supply chain.

Unified data model: Maintains a single data model for enhanced collaboration and consistency across functions.

Integrated with SAP ERP: Seamlessly connects with SAP ERP and S/4HANA for comprehensive planning and execution.

Inventory optimization: Helps in optimizing inventory levels to balance service levels and costs.

Order-based planning: Supports planning at the order level for precise supply chain management.

SAP Integrated Business Planning dashboard screenshot.

SAP IBP Starter Edition:

  • Contract Duration: 3 months
  • Pricing: $31,260.00
  • Includes:
    • SAP Cloud Integration Services
    • SAP Cloud Identity Services
    • SAP IBP modules for demand, inventory, response and supply, sales and operations, and Supply Chain Control Tower
  • User Limit: Up to 10 concurrent users
  • Data Limit: Up to 50 million total planning points

More Specific Versions of SAP IBP:

  • Different tiers are available depending on specific needs
  • Contract Duration: Customizable based on business needs
  • Pricing: Subscription-based, with detailed pricing available upon request
  • Includes: All features of the Starter edition plus additional functionalities and higher data limits
  • User Limit: Scalable based on subscription tier
  • Support: Premium consulting and integration services available

Looker by Google: Best for data modeling

Overall Score: 4.30/5

  • Pricing: 3.3/5
  • General features and interface: 3.9/5
  • Core features: 3.5/5
  • Advanced features: 5/5
  • Integration and compatibility: 5/5
  • UX: 3.5/5

Pros

  • Built-in IDE for data modeling
  • Versatile data access
  • Enhanced collaboration
  • Integration with R

Cons

  • Dependency on LookML
  • Limited pre-built visualization types
  • Performance scaling issues reported

Why we chose Looker by Google

Looker’s secret weapon is its ability to create powerful, scalable data models using its LookML language. It allows teams to curate and centralize business metrics, fostering better data governance. Plus, its in-database architecture means models can handle large datasets without performance trade-offs. Looker’s versatility and adaptability, including its integration capabilities with SQL and other data sources, make it ideal for businesses that need an intuitive data modeling platform.

The platform’s most natural competitor, Tableau, still leaves something to be desired when it comes to data modeling. Tableau’s strengths lie in its visual analytics, but it falls short in its data modeling capabilities. Looker allows for more sophisticated and reusable data models through LookML, ensuring centralized management and consistency across the organization. Looker’s ability to integrate with SQL databases without data extraction enhances its performance, making it more efficient.

Looker, part of Google Cloud, specializes in data modeling capabilities using its proprietary LookML language. 

This platform is ideal if your team needs scalable, centralized business metrics to enhance data governance. Looker processes data within the database itself, maintaining high performance even with large datasets. If you require comprehensive data modeling, Looker integrates smoothly with various SQL databases and other data sources. Its ability to create detailed visualizations and dashboards supports your organization in making strategic, informed decisions.

LookML data modeling: Looker’s proprietary language, LookML, offers a code-based approach to defining business logic and data relationships, providing granular control over how data is queried and visualized.

Data blocks: Pre-modeled pieces of business logic or whole datasets from third-party sources that can be natively integrated into your existing models.

Looker actions: Allows users to take meaningful actions on insights directly from within Looker, like changing data in your database, sending an email, or creating a task in project management software.

Embedded analytics: Looker’s Powered by Looker platform enables you to embed real-time analytics and data visualizations directly into your workflows, applications, or portals.


Viewer User: $30/user/month

Standard User: $60/user/month

Developer User: $125/user/month


Tableau: Best for data visualization

Overall Score

4.23/5

Pricing

2.1/5

General features and interface

4.3/5

Core features

4.8/5

Advanced features

5/5

Integration and compatibility

5/5

UX

4/5

Pros

  • User-friendly interface
  • Wide range of visualization options
  • Powerful data handling
  • Strong community and resources

Cons

  • Data connectivity issues
  • Limited data preparation
  • Costly for large teams

Why we chose Tableau

Tableau is well-known for its ability to turn complex data into comprehensible visual narratives. Its intuitive, drag-and-drop interface makes it accessible for non-technical users while still offering depth for data experts. The large array of visualization options, from simple bar graphs to intricate geographical maps, allows for highly customized presentations of data. With top-notch real-time analytics, mobile-ready dashboards, and secure collaboration tools, Tableau proves to be an invaluable asset for quick, accurate decision-making.

When compared to Microsoft Power BI, another platform known for its data visualization, Tableau excels in providing more sophisticated and customizable visualization options. While Power BI integrates well with other Microsoft products and offers competitive pricing, its visualization capabilities are not as advanced or flexible as Tableau’s. Tableau’s ability to handle large datasets and perform real-time analytics without compromising performance sets it apart. Additionally, its extensive community support and continuous updates ensure that it remains at the forefront of data visualization technology. 

Tableau transforms complex data into clear, comprehensible visual narratives. Its drag-and-drop interface is designed for users of all technical levels, making it easy to create a wide array of visualizations, from simple charts to intricate maps. 

If you need to present data visually in an engaging and understandable way, Tableau should be at the top of your list. The platform supports real-time analytics and mobile-ready dashboards, providing you with immediate access to insights. Collaboration tools make it easier for your teams to work together on data projects, improving overall efficiency and understanding.

  • Data blending: Enables users to blend data from multiple sources, providing a unified view across datasets.
  • Drag-and-drop interface: Users can create complex visualizations using a simple drag-and-drop mechanism.
  • Real-time data analysis: Real-time data analysis allows for up-to-the-minute business insights and decision making.
  • Interactive dashboards: Lets users drill down into charts and graphs for more detail.
  • Tableau Public: A free service that allows users to publish data visualizations to the web. These can be embedded into webpages and blogs, shared via social media or email, and made available for download for other users.
  • Mobile-ready dashboards: Dashboards are optimized for tablets and smartphones, enabling users to access their data anytime, anywhere.
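Tableau performs data blending through its drag-and-drop UI, but the underlying idea — joining sources on a shared key and deriving a unified metric — can be sketched in a few lines of pandas. All figures below are illustrative, not from any Tableau dataset.

```python
import pandas as pd

# Sales figures from one source (e.g., a CRM export)...
sales = pd.DataFrame({
    "region": ["North", "South", "East"],
    "revenue": [120_000, 95_000, 87_000],
})

# ...and targets from another (e.g., a finance spreadsheet).
targets = pd.DataFrame({
    "region": ["North", "South", "East"],
    "target": [100_000, 110_000, 80_000],
})

# Blend the two sources on their shared key to get a unified view,
# then derive a metric that neither source holds on its own.
blended = sales.merge(targets, on="region")
blended["attainment"] = blended["revenue"] / blended["target"]

print(blended)
```

The blended table is what a Tableau workbook ultimately visualizes: one row per region, with columns drawn from both sources.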


Free plan available

Tableau Creator: $75/user/month

Add-ons:

  • Tableau Viewer: $15/user/month
  • Tableau Explorer: $42/user/month

Oracle Autonomous Data Warehouse: Best for scalable data management

Overall Score

4.18/5

Pricing

2.5/5

General features and interface

4.2/5

Core features

5/5

Advanced features

5/5

Integration and compatibility

4/5

UX

4/5

Pros

  • Optimized query and workload handling
  • Integrates with Oracle services
  • Elastic scaling
  • Self-managing automation

Cons

  • Dependency on Oracle ecosystem
  • Complex auto-scaling management

Why we chose Oracle Autonomous Data Warehouse

Oracle Autonomous Data Warehouse is designed to take the heavy lifting out of database operations while delivering impressive performance and adaptability. Imagine a system that grows with your business, automatically adjusting its resources based on your needs. 

While IBM brings strong machine learning capabilities to the table, it can’t match Oracle’s seamless scalability and automated management. Oracle goes a step further by baking machine learning right into the system, helping to fine-tune performance, bolster security, and streamline backups.

But it’s not just about handling more data. Oracle’s system plays well with others, integrating smoothly with its cloud ecosystem and a variety of enterprise tools. 

Perhaps most impressively, Oracle allows you to perform sophisticated data analysis and predictive modeling right within the warehouse. This in-database machine learning feature is a game-changer for efficiency and insights.

Oracle Autonomous Data Warehouse is designed to streamline data management processes, providing an efficient and automated platform for your analytics needs. As the industry’s first self-driving database, it runs natively on Oracle Cloud Infrastructure (OCI), automating tasks such as patching, provisioning, tuning, and scaling without the need for human intervention. This platform is particularly suited for enterprises looking to manage vast amounts of data with minimal manual effort, offering high performance and scalability.

You can benefit from its ability to integrate with various cloud environments, including AWS, Azure, and Google Cloud, providing expansive multicloud functionality. The platform supports real-time analytics and advanced machine learning models through its built-in Oracle Machine Learning services, which accelerate model creation and deployment. Additionally, Oracle Autonomous Data Warehouse’s Exadata infrastructure offers high-performance storage at reduced costs, making it a cost-effective solution for large-scale data operations.

Ideal for businesses that need to consolidate data from multiple sources into a single, query-optimized data store, Oracle Autonomous Data Warehouse provides robust support for data integration and analysis. With features like automatic data preparation, AutoML for automated model development, and graph analytics for managing complex data relationships, this platform enhances your ability to derive meaningful insights from your data. For organizations looking to modernize their data architecture and improve data accessibility and performance, Oracle Autonomous Data Warehouse is a powerful choice.

In-database machine learning: Offers in-database machine learning capabilities, allowing users to build and deploy models without moving data.

Natural language queries: Enables natural language querying with AI, letting users interact with data without needing SQL knowledge.

Vector search: Supports vector search for identifying similar data across documents, images, and other unstructured data types.

Graph analytics: Includes advanced graph analytics features for uncovering relationships within complex data sets.

Spatial features: Provides comprehensive spatial data processing for large-scale location intelligence and geospatial applications.

Automated threat detection: Uses AI-driven automated threat detection and remediation to enhance data security.
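The vector search feature mentioned above ranks records by embedding similarity. Oracle handles this inside the database at scale; the core idea can be sketched in plain Python with cosine similarity over toy vectors (the embeddings and document names here are invented for illustration — real systems use learned, high-dimensional vectors).

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" for three documents.
documents = {
    "invoice_q1": [0.9, 0.1, 0.0],
    "invoice_q2": [0.8, 0.2, 0.1],
    "holiday_photo": [0.0, 0.1, 0.9],
}

query = [0.85, 0.15, 0.05]  # embedding of the search query

# Rank documents by similarity to the query — the essence of vector search.
ranked = sorted(documents,
                key=lambda d: cosine_similarity(query, documents[d]),
                reverse=True)
print(ranked)  # most similar document first
```

In Oracle's implementation the ranking happens in SQL against indexed vector columns, so unstructured data (documents, images) can be searched alongside relational data without leaving the warehouse.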


  1. Compute Costs (ECPU Billing Model):
    • Serverless:
      • ECPU per hour: Pricing starts at approximately $0.1125 per ECPU hour.
    • Dedicated Infrastructure:
      • Exadata Storage per ECPU: Costs vary based on the specific configuration and usage, typically higher than serverless options.
  2. Storage Costs:
    • Serverless:
      • Database Storage: Charged per terabyte (TB) per month.
      • Backup Storage: Charged separately per terabyte (TB) per month.
    • Dedicated Infrastructure:
      • Database Storage and Backup Storage: Provisioned in TB increments, with specific pricing based on configuration.
  3. Minimum Term:
    • For dedicated infrastructure deployments, the minimum subscription term is 48 hours.

Additional Notes:

  • BYOL (Bring Your Own License): Users with existing Oracle licenses can benefit from reduced pricing under the BYOL model.
  • Cost Estimator Tool: Oracle provides an online cost estimator tool to help users calculate their expected monthly expenses based on their specific usage requirements.

RapidMiner Studio: Best for data mining and aggregation

Overall Score

4.18/5

Pricing

2.5/5

General features and interface

3.9/5

Core features

5/5

Advanced features

4.5/5

Integration and compatibility

5/5

UX

4/5

Pros

  • Excellent data processing capabilities
  • Model validation mechanisms
  • Parallel processing support

Cons

  • Scripting limitations
  • Memory consumption
  • Complex advanced features may be overwhelming for learners

Why we chose RapidMiner Studio

The most compelling attribute of RapidMiner Studio is the level of nuance it provides during data discovery. ETL processes can be defined with numerous granular modifications, making the process of importing and scrubbing data a lot easier. Even messy, unstructured, or poorly organized data can be quickly parsed and processed once the correct automations are in place.

Data almost always has value, but for humans to leverage it meaningfully, it needs to be formatted in a comprehensible way for both users and AI tools. This is RapidMiner’s strong suit: transforming convoluted piles of information into visualizations, dashboards, and prescriptive insights.

KNIME also offers powerful data integration and manipulation capabilities but often requires more manual configuration and coding knowledge. RapidMiner provides a more user-friendly interface and automation features that streamline the ETL process, making it accessible to users with varying levels of technical expertise. Additionally, RapidMiner’s support for handling unstructured data and its ability to produce actionable insights swiftly make it the preferred choice for organizations focused on efficient data mining and aggregation. 

RapidMiner Studio is a premier platform for data mining and predictive analytics. It is suitable for both data scientists and business users, offering extensive tools for data preparation, model building, and validation. 

If your organization needs to perform advanced data analysis and predictive modeling, RapidMiner’s integration capabilities with various data sources and third-party applications enhance its versatility. The platform’s collaborative features allow your teams to share workflows and insights effectively, driving better business outcomes.

Automated data science: Simplifies complex data transformation, model selection, and validation tasks.

Multi-threaded execution: Capitalizing on your machine’s computational capabilities, RapidMiner offers multi-threaded execution for faster data processing and model building.

Rich data preprocessing tools: Provides a vast range of preprocessing operators, allowing users to clean, transform, and enrich their data efficiently.

Predictive modeling: Supports numerous machine learning algorithms, enabling users to create advanced predictive models.

Visual workflow designer: Drag-and-drop visual interface lets users design complex data workflows with ease, minimizing the need for code.
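RapidMiner builds its preprocessing pipelines visually, but the kind of ETL scrubbing described above — dropping incomplete rows, normalizing inconsistent values, removing duplicates, casting types — maps directly onto a chained pandas pipeline. The messy sample data below is invented for illustration.

```python
import pandas as pd

# Messy input: duplicate rows, inconsistent casing, missing values.
raw = pd.DataFrame({
    "customer": ["Acme", "acme", "Beta Corp", "Beta Corp", None],
    "amount": ["100", "100", "250", None, "75"],
})

clean = (
    raw
    .dropna(subset=["customer"])                              # drop rows with no customer
    .assign(customer=lambda df: df["customer"].str.title())   # normalize casing
    .drop_duplicates()                                        # remove exact duplicates
    .assign(amount=lambda df: pd.to_numeric(df["amount"]))    # cast strings to numbers
)

print(clean)
```

Each step corresponds to one operator in a visual workflow; RapidMiner's value is letting non-programmers assemble and reuse this chain without writing the code.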


Professional: $7,500/user/month

Enterprise: $15,000/user/month

AI Hub: $54,000/user/month

IBM Decision Optimization: Best for machine learning

Overall Score

4.1/5

Pricing

2.1/5

General features and interface

4.5/5

Core features

5/5

Advanced features

5/5

Integration and compatibility

4.4/5

UX

3.8/5

Pros

  • Advanced optimization algorithms
  • Integration with machine learning
  • Scalability
  • Customizable models

Cons

  • Limited documentation
  • Inflexible licensing
  • Requires expertise

Why we chose IBM Decision Optimization

IBM has been a major player in computing for decades, having transitioned from producing hardware to developing cutting-edge machine learning systems. That expertise has placed it at the forefront of business intelligence and prescriptive analytics. While IBM Watson often receives the most attention, IBM Decision Optimization is an equally impressive part of IBM's extensive business intelligence suite, enabling large-scale enterprises to transform their operational data into powerful optimization solutions.

Alteryx is a very similar competitor, also offering strong data preparation and predictive analytics but lacking the sophisticated optimization capabilities that IBM provides. A key differentiator is IBM’s use of CPLEX solvers, which allow for complex, large-scale optimization problems to be solved efficiently—a feature Alteryx does not offer. 

IBM also has the advantage of offering seamless integration with Watson Studio. This gives you direct utilization of machine learning models within optimization workflows, providing a streamlined, high-performance solution for real-time data processing and scenario planning. Alteryx, while strong in its domain, requires more manual effort to combine predictive and prescriptive analytics, limiting its efficiency in handling complex optimization scenarios. 

With IBM Decision Optimization, you can tackle complex operational challenges across various sectors, from supply chain management to resource allocation. 

Leveraging advanced algorithms and CPLEX solvers integrated with IBM Cloud Pak for Data, this platform turns intricate data sets into actionable insights. If you run a large enterprise that requires sophisticated scenario analysis and what-if modeling to optimize your operations, IBM Decision Optimization is especially beneficial. By integrating with IBM Watson Studio, you can merge machine learning models with optimization techniques, enhancing your operational efficiency and accuracy.

Also read: Reporting Tools Software Guide 2024

Prescriptive analytics: Uses mathematical and computational sciences to suggest decision options that benefit businesses.

Mixed-integer programming (MIP): Enables users to model and solve problems where the decision variables are a mix of continuous and integer variables.

Constraint programming: Helps solve complex combinatorial problems by specifying the constraints that need to be satisfied. 

Heuristic methods: For complex problems where exact methods might be too slow, IBM Decision Optimization provides fast, high-quality heuristic solutions.

Scenario analysis: Allows businesses to consider a range of outcomes and conditions for multiple scenarios to better manage risks and uncertainties. 
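Mixed-integer programming, as described above, optimizes over decisions that must take whole-number (often 0/1) values. CPLEX solves such problems at scale with branch-and-bound; the brute-force sketch below only illustrates the shape of the problem, using an invented project-funding example, and is not IBM's API.

```python
from itertools import product

# A tiny resource-allocation problem of the kind a MIP solver handles:
# choose which projects to fund (0/1 integer decisions) to maximize
# value without exceeding a budget constraint.
values = {"A": 60, "B": 100, "C": 120}
costs = {"A": 10, "B": 20, "C": 30}
budget = 50

best_value, best_plan = 0, {}
# Enumerate every 0/1 assignment — feasible only at toy sizes;
# real solvers like CPLEX prune this search tree instead.
for choice in product([0, 1], repeat=len(values)):
    plan = dict(zip(values, choice))
    cost = sum(costs[p] * x for p, x in plan.items())
    value = sum(values[p] * x for p, x in plan.items())
    if cost <= budget and value > best_value:
        best_value, best_plan = value, plan

print(best_plan, best_value)
```

With three projects there are only eight assignments to check; a production supply-chain model can have millions of integer variables, which is why solver quality is the differentiator here.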

IBM ILOG CPLEX Optimization Studio:

IBM Decision Optimization for Watson Studio:

  • Contact for pricing: Custom pricing is available based on enterprise requirements and deployment scale.

IBM Cloud Pak for Data:

  • On-Demand Pricing: Starts at $0.56 per capacity unit hour, with additional costs for storage and data transfer.
  • Subscription Pricing: Annual subscriptions are available at a custom quoted price.

KNIME: Best for data science flexibility on a budget

Overall Score

4.11/5

Pricing

3.3/5

General features and interface

3.9/5

Core features

3.5/5

Advanced features

5/5

Integration and compatibility

5/5

UX

3.5/5

Pros

  • Open source
  • Extensive integration options
  • Extensive analytics capabilities
  • Strong community support

Cons

  • Resource-intensive workflows
  • Limited in-built visualizations
  • Complex deployment

Why we chose KNIME

While KNIME lacks the sleek, push-button UIs that most other BI tools present, this isn’t necessarily a drawback, depending on the use case. For those in need of high levels of customization and the ability to shape the models and learning algorithms to their data pipelines, workflows, and native environments, KNIME has a lot to offer.

Additionally, KNIME is free to use for individual users, and its more DIY structure facilitates lower costs than other solutions when adding to the user base. KNIME’s “data sandbox” is perfect for data teams that want to supercharge their efforts but don’t need to offer widespread end-user access to the tools themselves.

When compared to RapidMiner Studio, another competitor known for its strong data mining and aggregation capabilities, KNIME wins in the categories of flexibility and cost-effectiveness. RapidMiner offers a more guided experience with its automation features, but this comes at a higher price point and with less customization. KNIME, in contrast, provides a more open environment where data scientists can build highly tailored workflows without being constrained by pre-built processes.

KNIME (Konstanz Information Miner) provides a highly customizable environment for your data analytics needs, catering to data scientists and analysts who require granular control over their workflows.

Its modular design allows you to build data processes using a variety of nodes for tasks like data preprocessing and machine learning. KNIME’s open-source nature makes it accessible to individual users at no cost, with additional enterprise features available for larger teams. If you prioritize flexibility and innovation in your data science projects, KNIME offers a sandbox environment perfect for experimenting with different models and algorithms.

Also read: Best Data Analysis Software & Tools for 2024

Visual workflow editor: Provides an intuitive, drag-and-drop style visual interface for building data workflows. This makes the process of data manipulation, analysis, and visualization easy to understand and execute.

Extensive integration capabilities: Supports a wide range of data formats and systems, including SQL, NoSQL, Hadoop, and various cloud storage options, enabling seamless data integration from diverse sources.

Open source and customizable: Offers the flexibility to customize the platform according to specific needs. Users can contribute new functionalities via KNIME’s node extension system.

Rich analytics tools: Houses a comprehensive set of tools for data mining and machine learning algorithms, statistical functions, and data visualization, serving as a robust platform for data-driven decision-making.

Contact KNIME for a customized quote.

Prescriptive analytics

A quick breakdown of the four common functions of business intelligence:

  • Descriptive analytics (the “What”): Used to organize data, parse it, and visualize it to identify trends.
  • Diagnostic analytics (the “Why”): Used to analyze trends, examine their progress over time, and establish causality.
  • Predictive analytics (the “When”): Used to compile trend and causality data and extrapolate upcoming changes to anticipate outcomes.
  • Prescriptive analytics (the “How”): Used to predict possible scenarios, test possible strategies for ROI or loss potential, and recommend actions.

Prescriptive analytics is among the most advanced business applications for machine learning and data science. It requires a significant amount of AI processing and depends on large volumes of reliable data. More importantly, like a human employee, it can be trained to respond to inputs and scenarios over time, improving the recommendations it outputs.

Recent studies, such as this one published in the International Journal of Management Information Systems and Data Science, highlight the transformative impact of integrating machine learning with prescriptive analytics to enhance business decision-making processes and competitive advantage.

For a deeper dive on prescriptive analytics and where it fits into the data analysis ecosystem, check out this article on data analysis software.

Read more: What is Diagnostic Analytics?

“Always tell me the odds”: Why prescriptive analytics matters

Prescriptive analytics isn’t a crystal ball. What it is might be closer in analogy to an independent consultant or a military tactician. It surveys the battlefield and considers numerous scenarios based on likelihood, parameters and circumstantial constraints, intensity of effects on final outcomes, and the options or resources available to the organization.

Then, after simulating the possibilities and comparing current plans to potential alternatives, it makes recommendations to promote the most positive results. 

In short, it doesn’t remove the uncertainty from business planning; it reduces the level of disruption caused by unanticipated events or a lack of forethought.
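The simulate-and-compare loop described above — enumerate scenarios, weigh them by likelihood, score each candidate plan, recommend the best expected outcome — can be sketched in a few lines. All figures here (prices, demand levels, probabilities) are invented for illustration.

```python
# Candidate plans and demand scenarios with assumed probabilities.
plans = {"conservative": 100, "aggressive": 200}               # units produced
scenarios = {"low_demand": (80, 0.3), "high_demand": (220, 0.7)}  # (demand, probability)

PRICE, UNIT_COST = 15, 10

def expected_profit(produced):
    """Average profit for a production plan across all weighted scenarios."""
    total = 0.0
    for demand, prob in scenarios.values():
        sold = min(produced, demand)           # can't sell more than demand
        total += prob * (sold * PRICE - produced * UNIT_COST)
    return total

# Simulate each plan across the scenarios and recommend the best expected outcome.
recommendation = max(plans, key=lambda p: expected_profit(plans[p]))
print(recommendation, expected_profit(plans[recommendation]))
```

Real prescriptive platforms run far richer simulations with learned demand models, but the decision logic — weigh every scenario, then rank the plans — is the same.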

Forecasting outcomes like this can be used to achieve a number of important business goals:

  • Preventing or mitigating loss
  • Minimizing or avoiding risk factors
  • Optimizing processes, schedules, and routes
  • Improving resource utilization and limiting downtime
  • Anticipating opportunities

With prescriptive analytics, businesses can work proactively, instead of reactively. It’s reassurance and validation when things go according to plan, and it’s a safety net when things take a turn for the catastrophic. Either way, you’ve explored the possibilities via numerous scenarios and simulations, and you’re as prepared as possible for what the future brings.

Choosing the best prescriptive analytics software

Remember, “crazy prepared” is only a negative until everyone needs what you’ve prepared in advance. Hopefully, this list of prescriptive analytics tools will help you find the solution that positions your business as the Batman of your industry. If not, check out our in-depth embedded analytics guide for more insight on how to choose a provider for your use case.

Looking for the latest in Business Intelligence solutions? Check out our Business Intelligence Software Buyer’s Guide.

Frequently Asked Questions (FAQ)

What is prescriptive analytics?

Prescriptive analytics is a branch of data analytics that uses machine learning and computational modeling to suggest actions for optimal outcomes based on given parameters.

How do I choose the best prescriptive analytics platform for my business?

To choose the best prescriptive analytics platform for your business, assess your specific needs such as data volume, type of analytics required, scalability, user-friendliness, and budget, and review the features, integrations, support, and customer reviews of potential platforms.

What are the main techniques of prescriptive analytics?

Techniques of prescriptive analytics include optimization, simulation, decision analysis, machine learning, and heuristics. These methods help in recommending actions, predicting outcomes, and finding the best course of action based on data-driven insights.

What are some examples of prescriptive analytics?

Examples of prescriptive analytics include supply chain optimization, personalized marketing, financial portfolio management, and healthcare treatment plans. These applications use data to recommend specific actions that achieve desired outcomes.

What are the four types of data analytics tools?

The four types of data analytics tools are descriptive (e.g., dashboards, reports), diagnostic (e.g., root cause analysis), predictive (e.g., forecasting, regression analysis), and prescriptive (e.g., optimization models, recommendation systems).

What algorithms are used in prescriptive analytics?

Algorithms used in prescriptive analytics include linear programming, mixed-integer programming, constraint satisfaction, genetic algorithms, and machine learning algorithms like reinforcement learning. These algorithms help in determining optimal decisions and actions.

The post Top Prescriptive Analytics Tools & Software (2024) appeared first on TechnologyAdvice.
