Nintendo Switch 2 pre-orders open ahead of schedule at one optimistic Dutch retailer

Nintendo has hardly revealed anything about the Switch 2 yet. Nonetheless, one Dutch retailer has already opened pre-orders for the console and is charging people €499 for the privilege of being first in line.

Nintendo should reveal more information about the Switch 2 in a little over a week’s time. (Image source: Nintendo)

Numerous unknowns remain about the Switch 2, even two months after Nintendo officially unveiled the device by way of a short YouTube video. Officially, the company has outlined that it will be revealing more information on April 2 during its next Nintendo Direct event. There are suggestions that a separate event focusing on launch titles could occur between now and April 2, though.

Similarly, Nintendo has not confirmed how much the Switch 2 will cost yet. Nonetheless, evidence emerged last month that the console could cost upwards of CAD$499.99, which currently converts to around $349.99. With that being said, various analysts anticipate the Switch 2 costing $400 or more.

To that end, a Dutch retailer has opened pre-orders for Nintendo’s long-awaited console. While the original Switch (currently $299 on Amazon) sells for just under €300 in the Netherlands, Game-stock has priced the Switch 2 at €499 (~$539). However, the disclaimer on its listing clarifies that this pricing is an estimate and not final. Game-stock also expects the Switch 2 to reach at least Belgium and the Netherlands ‘in mid-June’, although this is speculative too. As a result, it appears this early pre-sale is more optimistic than anything else. The full quote from Game-stock is as follows:

Please note: This is a Nintendo Switch 2 pre-order. The exact delivery date, price and how many will be delivered is not yet known. Our current estimate is that the Nintendo Switch 2 will be available in the Netherlands and Belgium in mid-June this year.

Delivery is done in order of order: whoever orders first will receive his console first on the day of release. As soon as there is more information, we will update our page immediately.

If we cannot deliver all orders or if you want to cancel the order, you will immediately receive a refund. The price for the pre-order is 499 euros, if the console is cheaper in the end, you will of course be refunded the difference.

(Image source: Game-stock)

Source(s)

Game-stock

AirPods Max is getting a big update with lossless audio and ultra-low latency - here’s how it works

Higher quality listening on AirPods Max

AirPods Max in various colors

(Image credit: Apple)

The most recent pair of AirPods Max with USB-C lacked something important for audio fans — a wired listening mode. The cable in the box was only there to charge the cans rather than listen to music.

Now, thanks to an upcoming update from Apple, the USB-C equipped AirPods Max aren’t just getting cabled listening; you’ll also find a couple of different listening modes out of the box. Lossless audio (up to 24-bit, 48kHz) and a low latency mode could make them a much tougher contender for the best wireless headphones.

Big AirPods updates

You’ll be able to use the cable that comes in the box with your AirPods Max to listen to music, and there’s a new cable you can buy that will let you plug your headphones into a 3.5mm headphone jack — a big bonus for music producers.

Once plugged in, you can listen to music at a much higher, lossless bitrate — 24-bit, 48kHz. That’s not quite as high as some of the best music streaming services go, like Qobuz, but it’s a big step up from the wireless limitations that the Max has been subjected to before.

That means you’ll be able to get more detail out of your Apple Music listening. It’s also going to apply to much higher-quality Spatial Audio mixes. Apple also thinks it will be a big bonus to music producers, who can now get much better sound out of their AirPods Max and connect to their mixing hardware physically.

Apple says, “Using the USB-C cable, AirPods Max will become the only headphones that enable musicians to both create and mix in Personalised Spatial Audio with head tracking.”

Also coming to the AirPods Max and their new wired listening is a low latency mode. That will reduce the time it takes for audio to reach the headphones, perfect for gaming and live-streaming applications.

The update is coming in April for the USB-C model AirPods Max only. The optional USB-C to 3.5mm cable costs $39 and is available today.

Amazon just gave Surface Laptop 7 a ‘frequently returned’ label — here’s what’s going on

Is ARM to blame?

Surface Laptop 7 from the front

(Image credit: Future)

Buyer beware. The Qualcomm Snapdragon X Elite-powered Microsoft Surface Laptop 7 now features a “frequently returned” label on Amazon.

It’s a surprising label for a product with over 400 user reviews, most of them in the 4- or 5-star range. However, Amazon claims the laptop is returned often. The disclaimer tells prospective buyers to “check the product details and customer reviews to learn more about this item.”

Amazon doesn’t specify why the laptops are being returned, but we can hazard some guesses based on the user reviews.

What the Surface Laptop 7 Amazon reviews are saying

(Image credit: Amazon)

In our own 4-star Microsoft Surface Laptop 7 review, Dave LeClair said that “Microsoft got it right with the latest version of the Surface Laptop 7,” though he did ding the laptop for underbaked AI features and a lack of ports.

Amazon customers had different complaints.

We read through every review of 3 stars and under, as well as several of the 5-star ratings.

The biggest complaint we kept seeing regarded compatibility. The 5-star reviews were concerned with software compatibility, though many noted that you could find replacement programs in the Microsoft Store. Many of the 3-star and under reviewers, meanwhile, were upset that the laptop refused to connect to some of the best VPN services.

3 stars and lower

The lower rating issues for the Surface Laptop 7 focus more on external peripherals, especially printers and scanners, which don’t seem to connect to the Qualcomm-powered system. Some also claimed that the USB-C architecture and how it connects to the Snapdragon X Elite was limiting what products could be attached to the laptop.

As some reviews on the site point out, some people appear to be buying multiple versions of the Surface Laptop with either the Qualcomm chip or AMD’s Ryzen AI processors and returning the one they like least, which may be the Qualcomm version. Plus, the Intel versions are starting to make their way into the wild.

What configurations have the ‘Frequently returned’ label?

Not every Qualcomm Surface laptop has the ‘Frequently returned item’ label. Here are the ones that are affected.

As far as we can tell, the Snapdragon X Plus variant, which has 10 cores, does not have this label on any of its configurations available from Amazon.

Microsoft Surface Laptop (2024)

Chipset | Screen size | RAM / Storage | Frequently Returned label
Snapdragon X Elite | 15-inch | 16GB / 1TB | Yes
Snapdragon X Elite | 15-inch | 32GB / 1TB | Yes
Snapdragon X Elite | 15-inch | 16GB / 256GB | Yes
Snapdragon X Elite | 13.8-inch | 16GB / 1TB | Yes
Snapdragon X Elite | 13.8-inch | 32GB / 1TB | Yes
Snapdragon X Elite | 13.8-inch | 16GB / 1TB | Yes
Snapdragon X Elite | 13.8-inch | 16GB / 512GB | No
Snapdragon X Elite | 13.8-inch | 32GB / 256GB | Yes

x86 vs ARM: Customers may not know the difference

There are currently three different versions of Microsoft AI PCs available to customers interested in not just the Surface but Copilot+ PCs in general.

You can choose from Snapdragon X Elite (or Plus), Ryzen AI, and Intel Lunar Lake laptops from 2024 or 2025. It’s a wealth of options, and not everyone knows the difference between x86 and ARM.

What they want is to pull the laptop out of the box and know that it works with their programs, especially for laptops that cost close to $2,000.

Reading through the Surface listing on Amazon, this isn’t really mentioned beyond hyping up the power of the Qualcomm chips. There is one mention of apps with a graphic that reads, “With 87% of app usage now with native ARM versions, you’re getting a top-notch, efficient and secure experience.”

You can scroll that graphic to see apps like Photoshop, Excel, Discord and more. However, nowhere does it discuss peripheral connectivity or which apps aren’t available.

GPU compatibility dilemma brewing as more high-end power supplies ditch 8-pin connectors in favor of new 16-pin

MSI’s PSUs appear to have been engineered with the RTX 40- and RTX 50-series GPUs as the primary target.

MSI PSU

(Image credit: MSI)

Since the 16-pin connector was introduced and later revised, melting adapters and PSU/GPU-side connectors on some of Nvidia’s mainstream GeForce gaming GPUs have been rampant. Two of MSI’s recently launched high-wattage PSUs interestingly feature two 12V-2x6 connectors while providing only one standard 8-pin connector (via OC3D). Unless you get a 12V-2x6 to 8-pin adapter, these PSUs are incompatible with most modern-day GPUs from AMD, Intel, and even Nvidia (RTX 30 and prior), apart from a few specific variants.

The industry (or, more specifically, Nvidia) made the shift towards 16-pin connectors to reduce cable clutter and meet the ever-increasing power demand of new GPUs. After the first GPU meltdown wave hit, the 16-pin standard was revised with pin length changes and is now known as 12V-2x6. Reports of melting RTX 50-series GPU connectors and adapters have been making the rounds on the internet, even with the improved standard. This time, it is believed Nvidia’s power delivery design might be partially responsible.

It seems the industry is catching on to the 16-pin standard, even at the cost of reduced compatibility with other GPUs. MSI’s MPG A1000GS and MPG A1250GS power supplies, targeted at high-wattage GPUs, include two native 16-pin connectors and only a single 8-pin (6+2) connector. Even if you buy an additional 6+2 connector separately, that leaves only a single 8-pin for the CPU, potentially hamstringing CPU performance.

Theoretically, a GPU with a 225W TGP or lower, like the Intel Arc B580, can easily be powered by one 8-pin connector. However, AMD’s Radeon RX 9070 series cannot, as these GPUs typically require two or three 8-pin connectors.
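
The arithmetic behind that 225W figure comes from what each connector is rated to deliver: up to 75W from the PCIe slot, 150W per 8-pin PCIe connector, and 600W per 16-pin 12V-2x6. Here is a minimal Python sketch of that power-budget math (the per-connector limits are the spec values; the example configurations are illustrative):

```python
# Spec power budget for a GPU: 75 W from the PCIe slot, 150 W per 8-pin
# PCIe connector, and up to 600 W per 16-pin 12V-2x6 connector.
SLOT_W, EIGHT_PIN_W, SIXTEEN_PIN_W = 75, 150, 600

def max_board_power(eight_pin: int = 0, sixteen_pin: int = 0) -> int:
    return SLOT_W + eight_pin * EIGHT_PIN_W + sixteen_pin * SIXTEEN_PIN_W

print(max_board_power(eight_pin=1))    # 225 W -- enough for Arc B580-class cards
print(max_board_power(eight_pin=2))    # 375 W -- e.g., dual 8-pin RX 9070 designs
print(max_board_power(sixteen_pin=1))  # 675 W -- 12V-2x6-equipped cards
```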

The Shift From 8-pin To 16-pin Power Connectors

To be clear, this isn’t the first time we’ve seen such an implementation. Galax’s GH1300 also shipped with a dual 16-pin design, but that PSU was a one-off, unleashed to tame the RTX 4090 HOF, which could quickly chug up to 1kW of power. As of writing, there isn’t any RTX 50-series GPU with multiple 16-pin connectors, so the full potential of these PSUs remains untapped. Nvidia recommends a 1,000W (minimum) PSU for the RTX 5090. This raises concerns about the MPG A1000GS, which is borderline on recommended specs yet ships with two 12V-2x6 connectors.

All things considered, equipping PSUs with multiple 16-pin connectors is counterintuitive for most consumers, especially given the standard’s original goal to simplify cabling. For maximum user flexibility, a PSU design should include one 12V-2×6 connector alongside multiple 8-pin connectors to accommodate high-power GPUs from either brand.

Gmail’s new search results prioritize relevant emails over recent ones

Search results won’t appear in chronological order anymore.

Gmail app on Pixel Tablet

What you need to know

  • Gmail is rolling out a new, upgraded version of mail search that can prioritize “most relevant” results.
  • Users can still manually sort “most recent” results to get the list in chronological order.
  • The feature is rolling out now for personal Google accounts and will come to business accounts in the future.

Google is upgrading Gmail’s search smarts with AI to prioritize relevant results over recent keyword hits, it announced in a blog post on March 20. While the old version of search in Gmail showed the most recent keyword results in emails, the upgraded version will “now factor in elements like recency, most-clicked emails and frequent contacts.”

The AI-powered search feature is arriving in Gmail as “Most relevant” results. Users can now toggle between “Most relevant” and “Most recent,” depending on the situation. Previously, Gmail defaulted to showing “Most recent” search results, and that was the only option available.

“With this update, the emails you’re looking for are far more likely to be at the top of your search results — saving you valuable time and helping you find important information more easily,” the company explains in the blog post.

You can see an example of how Gmail search results are changing in the graphic below, which shows the old results on the left and the new ones on the right.

(Image credit: Google)

The new version of search is rolling out worldwide in Gmail’s web client, as well as the iOS and Android mobile apps. The feature will initially be exclusive to personal Google accounts, though the company says it will expand upgraded Gmail search to business accounts “in the future.”

Google has been steadily upgrading Gmail with artificial intelligence to make it easier to use, and more secure. At the end of last year, Gmail became equipped with a new AI model designed to thwart phishing scams in emails. This addition followed similar scam-stopping AI features being added to Google Messages and Google Phone.

Time will tell whether users prefer the new version of Gmail search over the old one. For people who constantly have to scroll to find old emails, “Most relevant” results may be more helpful. For others, switching back to “Most recent” could be a better option.

Toshiba has opened a new lab for advancing HDD tech

Toshiba HDD Innovation Lab

(Image credit: Toshiba)


  • Toshiba’s new European HDD Innovation Lab can improve storage tech
  • Lab offers architecture testing, proof-of-concept setups, and benchmarking
  • Toshiba claims combining dozens of HDDs can boost overall performance

Toshiba Electronics Europe has opened a new HDD Innovation Lab at its Düsseldorf site, expanding its storage evaluation services across Europe and the Middle East.

The new facility (it already has a smaller one in Dubai) is designed to support customers and partners in optimizing hard disk drive configurations for a range of applications, including cloud storage, surveillance systems, and NAS environments.

Toshiba’s lab will focus on assessing HDD setups for broader IT systems such as storage area networks (SAN), providing a space where hardware configurations can be tested and refined. It will be able to evaluate customer-specific architectures and offer a platform for proof-of-concept testing and performance benchmarking.

Combining hard drives

“This new HDD Innovation Lab represents a significant leap forward in providing bespoke solutions and advancing HDD technology,” said Rainer Kaese, senior manager for HDD business development.

“It demonstrates Toshiba’s commitment to drive the industry forward and support customers and partners with technical expertise and resources. We look forward to strengthening existing collaborations and exploring the future business opportunities the new facility will bring.”

To carry out these evaluations, the lab brings together servers, JBoDs, chassis, controllers, cables, and a variety of software tools. It also includes equipment to accurately measure energy consumption.

While SSDs have a clear speed advantage over HDDs, they are expensive and, according to Kaese via Blocks & Files, “The flash industry is not able to manufacture enough capacity to satisfy the growing demand, and still will not be for a significant while.”

The solution to that problem, Kaese suggested, is to bunch HDDs together.

“We have demonstrated that 60 HDDs in ZFS software defined storage can fill the entire speed of a 100GbE network,” he said, adding, “[We] found that a typical configuration of four HDDs (ie. in small Soho NAS) can fill the 10GbE networks. 12 HDDs match the 25GbE of Enterprise networks, and 60 HDDs would require high end 100GbE network speed to unleash the full performance of the many combined HDDs.”
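
Kaese’s figures line up with simple back-of-the-envelope math. Assuming roughly 250 MB/s of sustained sequential throughput per modern HDD (our assumption, not a Toshiba figure) and ideal scaling across the array, a quick sketch:

```python
import math

# Back-of-the-envelope: drives needed to saturate a network link, assuming
# ~250 MB/s sustained sequential throughput per HDD (our assumption, not a
# Toshiba figure) and ideal scaling across the array.
HDD_MB_PER_S = 250

def drives_to_saturate(link_gbit_per_s: float) -> int:
    link_mb_per_s = link_gbit_per_s * 1000 / 8  # Gbit/s -> MB/s
    return math.ceil(link_mb_per_s / HDD_MB_PER_S)

for link in (10, 25, 100):
    print(f"{link}GbE: ~{drives_to_saturate(link)} drives")
# 10GbE: ~5, 25GbE: ~13, 100GbE: ~50 -- close to Kaese's 4 / 12 / 60 once
# real-world per-drive rates and protocol overheads are factored in.
```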

Beyond technical testing, the HDD Innovation Lab aims to support knowledge sharing. Insights from evaluations will be passed directly to customers, and Toshiba says it will conduct its own internal assessments of its HDD product lines, publishing the findings as whitepapers and lab reports.

DisplayPort vs. HDMI: Which is better for gaming?

We look at bandwidth, resolution, refresh rate and more and discuss the differences between DisplayPort and HDMI connections.

DisplayPort vs HDMI

(Image credit: Amazon)

The best gaming monitors and best graphics cards are packed with features, but one aspect that often gets overlooked is the number and type of DisplayPort and HDMI connections. What are the differences between the two ports, and is one definitively better for connecting your display to your system?

You might think it’s a simple matter of hooking up whatever cable comes with your monitor to your PC and calling it a day, but there are differences that can often mean a loss of refresh rate, color quality, or both if you’re not careful. Here’s what you need to know about DisplayPort vs. HDMI connections.

If you’re looking to buy a new PC monitor or buy a new graphics card, you’ll want to consider the capabilities of both sides of the connection — the video output of your graphics card and the video input on your display — before making any purchases. Our GPU benchmarks hierarchy will tell you how the various graphics cards rank in terms of performance, but it doesn’t dig into the connectivity options, which is something we’ll cover here.

The Major Display Connection Types 

From left to right: Composite, VGA, DVI, HDMI, and DisplayPort. (Image credit: Shutterstock)

The latest display connectivity standards are DisplayPort and HDMI (High-Definition Multimedia Interface). DisplayPort first appeared in 2006, while HDMI came out in 2002. Both are digital standards, meaning all the data about the pixels on your screen is represented as 0s and 1s as it zips across your cable, and it’s up to the display to convert that digital information into an image on your screen.

Earlier digital monitors used DVI (Digital Visual Interface) connectors, and going back even further we had analog VGA (Video Graphics Array) — along with component RGB, S-Video, composite video, EGA, and CGA. You don’t want to use VGA or any of those others in the 2020s. They’re old, meaning any new GPU likely won’t even support the connector, and even if it did, you’d be using an analog signal that’s prone to interference. Yuck.

DVI is the bare minimum you want to use today, and even that has limitations. It has a lot in common with early HDMI, just without audio support. It works fine for gaming at 1080p, or 1440p resolution if you have a dual-link connection. Dual-link DVI-D is basically double the bandwidth of single-link DVI-D via extra pins and wires, and most modern GPUs with a DVI port support dual-link. But truly modern graphics cards like Nvidia’s Blackwell RTX 50-series and Ada Lovelace RTX 40-series, AMD’s RDNA 4 RX 9000-series and RDNA 3 RX 7000-series, and Intel’s Battlemage Arc B-series and Arc Alchemist A-series GPUs almost never include DVI connectors these days. Basically, DVI-D has been deprecated since the pre-Covid days, so if you have an older monitor that needs a DVI-D connection, it’s time to start thinking about an upgrade.

If you’re wondering about Thunderbolt 2/3/4, it routes DisplayPort over the Thunderbolt connection. Thunderbolt 2 supports DisplayPort 1.2, and Thunderbolt 3 supports DisplayPort 1.4 video. Thunderbolt 4 also uses DisplayPort 1.4, with the requirement that devices support up to two simultaneous 4K60 signals. It’s also possible to route HDMI 2.0 over Thunderbolt 3 with the right hardware.

Some newer displays also have the option of using a USB Type-C connector for video. Supported bandwidths and resolutions depend on the particular monitor, but while the connector might be easier to insert, it can also be inadvertently pulled out if you’re not careful. We’re not going to dig into the Type-C options here, though if they begin to catch on we may revisit the subject. (Note: Several years later, the Type-C option still hasn’t caught on except with portable displays.)

For newer displays, it’s best to go with DisplayPort or HDMI. But is there a clear winner between the two? Let’s dig into the details.

Modern GPU with 2x DP and 2x HDMI ports.

DisplayPort vs. HDMI: Specs and Resolutions 

Not all DisplayPort and HDMI ports are created equal. The DisplayPort and HDMI standards are backward compatible, meaning you can plug in an HDTV from the mid-00s and it should still work with a brand-new RTX 50-series or RX 9000-series graphics card. However, the connection between your display and graphics card will end up using the best option supported by both the sending and receiving ends of the connection. That could mean the best 4K gaming monitor with 240 Hz and HDR support will end up running at 4K and 24 Hz on an older graphics card!

Here’s a quick overview of the major DisplayPort and HDMI revisions, their maximum signal rates and the GPU families that first added support for the standard.

DisplayPort vs. HDMI Specs

Version | Max Transmission Rate | Max Data Rate | Uncompressed Resolution/Refresh Rate Support (24 bpp) | Max DSC Resolution/Refresh Rate | GPU Introduction

DisplayPort Versions

1.0-1.1a | 10.8 Gbps | 8.64 Gbps | 1080p @ 144 Hz, 4K @ 30 Hz | N/A | AMD HD 3000 (R600), Nvidia GeForce 9 (Tesla)
1.2-1.2a | 21.6 Gbps | 17.28 Gbps | 1080p @ 240 Hz, 4K @ 75 Hz, 5K @ 30 Hz | N/A | AMD HD 6000 (Northern Islands), Nvidia GK100 (Kepler)
1.3 | 32.4 Gbps | 25.92 Gbps | 1080p @ 360 Hz, 4K @ 98 Hz, 5K @ 60 Hz, 8K @ 30 Hz | N/A | AMD RX 400 (Polaris), Nvidia GM100 (Maxwell 1)
1.4-1.4a | 32.4 Gbps | 25.92 Gbps | 4K @ 98 Hz, 8K @ 30 Hz | 4K @ 240 Hz, 8K @ 60 Hz | AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2)
2.0-2.1 | 80.0 Gbps | 77.37 Gbps | 4K @ 240 Hz, 8K @ 85 Hz | 4K @ 500+ Hz, 8K @ 240 Hz | AMD RX 7000 (54 Gbps), Intel Arc A-series (40 Gbps), Nvidia RTX 50 (Blackwell)

HDMI Versions

1.0-1.2a | 4.95 Gbps | 3.96 Gbps | 1080p @ 60 Hz | N/A | AMD HD 2000 (R600), Nvidia GeForce 9 (Tesla)
1.3-1.4b | 10.2 Gbps | 8.16 Gbps | 1080p @ 144 Hz, 1440p @ 75 Hz, 4K @ 30 Hz, 4K 4:2:0 @ 60 Hz | N/A | AMD HD 5000, Nvidia GK100 (Kepler)
2.0-2.0b | 18.0 Gbps | 14.4 Gbps | 1080p @ 240 Hz, 4K @ 60 Hz, 8K 4:2:0 @ 30 Hz | N/A | AMD RX 400 (Polaris), Nvidia GM200 (Maxwell 2)
2.1 | 48.0 Gbps | 42.6 Gbps | 4K @ 144 Hz, 8K @ 30 Hz | 4K @ 240 Hz, 8K @ 120 Hz | Nvidia RTX 30 (Ampere), AMD RX 5000 (RDNA); partial 2.1 VRR on Nvidia Turing

Note that there are two bandwidth columns: transmission rate and data rate. The DisplayPort and HDMI digital signals use bitrate encoding of some form — 8b/10b for most of the older standards, 16b/18b for HDMI 2.1, and 128b/132b for DisplayPort 2.x. 8b/10b encoding, for example, means that for every 8 bits of data, 10 bits are actually transmitted, with the extra bits used to help maintain signal integrity (e.g., by ensuring zero DC bias).

That means only 80% of the theoretical bandwidth is available for data use with 8b/10b. 16b/18b encoding improves that to 88.9% efficiency, while 128b/132b encoding yields 97% efficiency. There are still other considerations, like the auxiliary channel on HDMI, but that’s not a major factor for PC use.
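
To make the encoding overhead concrete, here is a minimal Python sketch of the ratio math. Note that some published figures (such as DisplayPort 2.1’s 77.37 Gbps) come out slightly lower than the raw ratio suggests because of additional link overhead, like forward error correction, which this simple calculation ignores:

```python
# Usable data rate = raw transmission rate x encoding efficiency.
# Published figures can be slightly lower due to extra link overhead
# (e.g., FEC), which this simple ratio ignores.
ENCODING_EFFICIENCY = {
    "8b/10b": 8 / 10,       # DP 1.x, HDMI 2.0b and earlier: 80%
    "16b/18b": 16 / 18,     # HDMI 2.1: ~88.9%
    "128b/132b": 128 / 132, # DP 2.x: ~97%
}

def data_rate_gbps(transmission_gbps: float, encoding: str) -> float:
    return transmission_gbps * ENCODING_EFFICIENCY[encoding]

print(f"DP 1.4:   {data_rate_gbps(32.4, '8b/10b'):.2f} Gbps")     # 25.92
print(f"HDMI 2.1: {data_rate_gbps(48.0, '16b/18b'):.2f} Gbps")    # 42.67
print(f"DP 2.1:   {data_rate_gbps(80.0, '128b/132b'):.2f} Gbps")  # 77.58
```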

Also note the maximum supported uncompressed and compressed (DSC stands for Display Stream Compression) modes. DSC had some issues with the earliest versions, but GPUs in the post-2018 timeframe seem to work fine with the feature.

Let’s Talk More About Bandwidth

Generic data cable 3D rendering

To understand the above chart, we need to go deeper. What all digital connections — DisplayPort, HDMI and even DVI-D — end up coming down to is the required bandwidth. Every pixel on your display has three components: red, green, and blue (RGB) — alternatively: luma, blue chroma difference, and red chroma difference (YCbCr/YPbPr) can be used. Whatever your GPU renders internally (typically 16-bit floating point RGBA, where A is the alpha/transparency information), that data gets converted into a signal for your display.

The standard in the past has been 24-bit color, or eight bits each for the red, green and blue color components. HDR and high color depth displays have bumped that to 10-bit color, with 12-bit and 16-bit options as well, though the latter two are mostly in the professional space. Generally speaking, display signals use either 24 bits per pixel (bpp) or 30 bpp, with the best HDR monitors opting for 30 bpp. Multiply the color depth by the number of pixels and the screen refresh rate and you get the minimum required bandwidth. We say ‘minimum’ because there are a bunch of other factors as well.

Display timings are relatively complex calculations. The VESA governing body defines the standards, and there’s even a handy spreadsheet that spits out the actual timings for a given resolution. A 1920×1080 monitor at a 60 Hz refresh rate, for example, uses 2,000 pixels per horizontal line and 1,111 lines once all the timing overhead is added. That’s because display blanking intervals need to be factored in. (These blanking intervals are partly a holdover from the analog CRT days, but the standards still include them even with digital displays.)
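
As a sanity check, the 1080p60 example above can be reproduced in a few lines of Python. The 2,000 x 1,111 totals are the timing figures quoted above; other modes need their own totals from the VESA spreadsheet:

```python
# Required video bandwidth = total pixels per frame (including blanking)
# x refresh rate x bits per pixel.
def required_gbps(h_total: int, v_total: int, refresh_hz: float, bpp: int) -> float:
    return h_total * v_total * refresh_hz * bpp / 1e9

# 1920x1080 @ 60 Hz, 24 bpp, using the 2000 x 1111 timing totals quoted above:
print(f"{required_gbps(2000, 1111, 60, 24):.2f} Gbps")  # -> 3.20 Gbps, matching the table below
```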

Using the VESA spreadsheet and running the calculations gives the following bandwidth requirements. Look at the following table and compare it with the first table; if the required data bandwidth is less than the max data rate that a standard supports, then the resolution can be used.

Common Resolution Bandwidth Requirements

Resolution | Color Depth | Refresh Rate (Hz) | Required Data Bandwidth
1920 x 1080 | 8-bit | 60 | 3.20 Gbps
1920 x 1080 | 10-bit | 60 | 4.00 Gbps
1920 x 1080 | 8-bit | 144 | 8.00 Gbps
1920 x 1080 | 10-bit | 144 | 10.00 Gbps
2560 x 1440 | 8-bit | 60 | 5.63 Gbps
2560 x 1440 | 10-bit | 60 | 7.04 Gbps
2560 x 1440 | 8-bit | 144 | 14.08 Gbps
2560 x 1440 | 10-bit | 144 | 17.60 Gbps
3840 x 2160 | 8-bit | 60 | 12.54 Gbps
3840 x 2160 | 10-bit | 60 | 15.68 Gbps
3840 x 2160 | 8-bit | 144 | 31.35 Gbps
3840 x 2160 | 10-bit | 144 | 39.19 Gbps
3840 x 2160 | 8-bit | 240 | 56.45 Gbps (~19 Gbps with DSC)
3840 x 2160 | 10-bit | 240 | 70.56 Gbps (~24 Gbps with DSC)
7680 x 4320 | 8-bit | 60 | 49.99 Gbps (~17 Gbps with DSC)
7680 x 4320 | 10-bit | 60 | 62.49 Gbps (~21 Gbps with DSC)
7680 x 4320 | 8-bit | 120 | 103.62 Gbps (~35 Gbps with DSC)
7680 x 4320 | 10-bit | 120 | 129.53 Gbps (~43 Gbps with DSC)
7680 x 4320 | 8-bit | 240 | 223.48 Gbps (~75 Gbps with DSC)
7680 x 4320 | 10-bit | 240 | 279.35 Gbps (~93 Gbps with DSC)

The above figures are for uncompressed signals, however, and DisplayPort 1.4 added the option of Display Stream Compression 1.2a (DSC), which is also part of HDMI 2.1. In short, DSC helps overcome bandwidth limitations, which are becoming increasingly problematic as resolutions and refresh rates increase. For example, basic 24 bpp at 8K and 60 Hz needs 49.65 Gbps of data bandwidth, or 62.06 Gbps for 10 bpp HDR color — the former could be supported by DisplayPort 2.1 UHBR13.5, while the latter would require UHBR20 (which few monitors currently support, and which is only available on RTX 50-series and one port on AMD’s professional W7900 GPU).

DSC can provide up to a 3:1 compression ratio by converting to YCgCo and using delta PCM encoding. It provides a “visually lossless” (and sometimes even truly lossless, depending on what you’re viewing) result. Using DSC, 8K 120 Hz HDR is suddenly viable, with a bandwidth requirement of ‘only’ 42.58 Gbps. DisplayPort 1.4 can also run 4K at 240 Hz using DSC.
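
Assuming the nominal 3:1 DSC ratio (real-world ratios vary with content and implementation), a rough feasibility check looks like this:

```python
# Rough mode-fit check: compare a mode's required data bandwidth against
# a link's usable data rate, falling back to DSC at a nominal 3:1 ratio.
def mode_fits(required_gbps: float, link_gbps: float, dsc_ratio: float = 3.0) -> str:
    if required_gbps <= link_gbps:
        return "yes, uncompressed"
    if required_gbps / dsc_ratio <= link_gbps:
        return "yes, with DSC"
    return "no"

# 8K @ 60 Hz, 24 bpp (~49.65 Gbps) over DisplayPort 1.4 (25.92 Gbps):
print(mode_fits(49.65, 25.92))  # -> "yes, with DSC" (49.65 / 3 = ~16.55 Gbps)
```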

There’s a catch with DSC, however: Support can be a bit hit and miss, particularly on older GPUs. We’ve tested a bunch of graphics cards using a Samsung Odyssey Neo G8 32, which supports up to 4K at 240 Hz over DisplayPort 1.4 or HDMI 2.1. On DisplayPort connections, most of the latest GPUs are fine, but cards from 2016 and earlier may not even allow the use of 240 Hz. We’ve also seen video signal corruption on occasion, where dropping to 120 Hz (still with DSC) often fixes the problem. In short, cable quality and the DSC hardware implementation still factor into the equation.

Both HDMI and DisplayPort can also carry audio data, which requires bandwidth as well, though it’s a minuscule amount compared to the video data. DisplayPort and HDMI currently use a maximum of 36.86 Mbps for audio, or 0.037 Gbps if we keep things in the same units as video. Earlier versions of each standard can use even less data for audio. One important note is that HDMI supports audio pass through, while DisplayPort does not. If you’re planning to hook up your GPU to an amplifier, HDMI provides a better solution.

That’s a lengthy introduction to a complex subject, but if you’ve ever wondered why the simple math (resolution * refresh rate * color depth) doesn’t match published specs, it’s because of all the timing standards, encoding, audio, and more. Bandwidth isn’t the only factor, but in general, the standard with a higher maximum bandwidth is ‘better.’

DisplayPort: The PC Choice 

DP 8K cable, now supplanted by DP80.

Currently, DisplayPort 2.1 is the most capable version of the DisplayPort standard. The DisplayPort 2.0 spec came out in June 2019, and Intel’s Arc Alchemist GPUs along with AMD’s RDNA 3 GPUs supported the standard. It was later revised to DisplayPort 2.1, but all DP2.0-capable hardware should still be compatible. Nvidia finally added DisplayPort 2.1b with its Blackwell RTX 50-series GPUs.

So, there are now cards with DisplayPort 2.1 support, but they’re still of different levels. Intel’s Arc GPUs support UHBR10 (Ultra-High Bitrate, 10 Gbps per lane), for a 40 Gbps maximum connection speed (not including 128b/132b encoding). AMD opted for the faster UHBR13.5 (54 Gbps total), but neither company supports the potential 20 Gbps per lane variant — except for AMD allowing UHBR20 on a single port for its professional Radeon Pro W7900 graphics card. Nvidia went with full UHBR20 (80 Gbps) support on all outputs for its 50-series and future solutions. But perhaps the bigger issue now isn’t GPU support.

There still aren’t many displays that support DisplayPort 2.1. They are starting to appear, but it’s the old chicken-and-egg scenario: GPUs with DisplayPort 2.1 support have now been around for three years, yet monitors that can use DP2.1 have been lagging behind. That’s probably because DisplayPort 1.4 remains sufficient for up to 4K 240 Hz and 8K 60 Hz with DSC, and HDMI 2.1 support is there for people that need up to 48 Gbps.

There are DP2.1 monitors now, supposedly with UHBR20 support. Curiously, we’ve seen 4K 240 Hz OLED monitors that will still enable DSC if you want to use the full capabilities of the display. That should be possible without DSC, but at least one monitor we’ve tested (MSI MPG272UX OLED) drops the maximum 4K refresh rate to 120 Hz if we disable DSC in the monitor OSD (on-screen display).

One advantage of DisplayPort is that variable refresh rates (VRR) have been part of the standard since DisplayPort 1.2a. We also like the robust DisplayPort connector (but not mini-DisplayPort), which has hooks that latch into place to keep cables secure. It’s a small thing, but we’ve definitely pulled loose more than a few HDMI cables by accident. DisplayPort can also connect multiple screens to a single port via Multi-Stream Transport (MST), and the DisplayPort signal can be piped over a USB Type-C connector that also supports MST.

One area where there has been some confusion is in regards to licensing and royalties. DisplayPort was supposed to be a less expensive standard, but both HDMI and DisplayPort have various associated brands, trademarks, and patents that have to be licensed. With technologies like HDCP (High-bandwidth Digital Content Protection), DSC, and more, companies have to pay a royalty for DP just like HDMI. The current rate appears to be $0.20 per product with a DisplayPort interface, with a cap of $7 million per year. HDMI charges $0.15 per product, or $0.05 if the HDMI logo is used in promotional materials.

Because the standard has evolved over the years, not all DisplayPort cables will work properly at the latest speeds. The original DisplayPort 1.0-1.1a spec allowed for RBR (reduced bit rate) and HBR (high bit rate) cables, capable of 5.18 Gbps and 8.64 Gbps of data bandwidth, respectively. DisplayPort 1.2 introduced HBR2, which doubled the maximum data bit rate to 17.28 Gbps and remained compatible with standard HBR DisplayPort cables. HBR3 with DisplayPort 1.3-1.4a increased things again to 25.92 Gbps and added the requirement of DP8K-certified cables.

Finally, with DisplayPort 2.1 there are three new transmission modes: UHBR10 (ultra high bit rate), UHBR13.5 and UHBR20. The number refers to the bandwidth of each lane, and DisplayPort uses four lanes, so UHBR10 offers up to 40 Gbps of transmission rate, UHBR13.5 can do 54 Gbps and UHBR20 peaks at 80 Gbps. DP2.1 uses 128b/132b encoding, meaning data bit rates of 38.69 Gbps, 52.22 Gbps, and 77.37 Gbps. And now there are new cables to meet those standards.

Officially, the maximum length of a DisplayPort cable is up to 3m (9.8 feet), which is one of the potential drawbacks, particularly for consumer electronics use. Newer versions with higher bandwidths can cut that length even more. As an example, checking the official DisplayPort certification list, the longest DP80 certified cable right now is only 1.2m (3.94 ft) long, and many are only 0.8 or 1.0 meters. Getting a cable that’s only 3 feet or less in length generally means you have to have your PC on top of your desk.

With a maximum data rate of 25.92 Gbps, DisplayPort 1.4 can handle 4K resolution 24-bit color at 98 Hz, and dropping to 4:2:2 YCbCr gets it to 144 Hz with HDR. Alternatively, DSC allows up to 4K and 240 Hz, even with HDR. Keep in mind that 4K HDR monitors running at 144 Hz or more carry premium pricing, so gamers will more likely be looking at something like a 144Hz display at 1440p. That only requires 14.08 Gbps for 24-bit color or 17.60 Gbps for 30-bit HDR, which DP 1.4 can easily handle.

If you’re wondering about 8K content in the future, the reality is that even though it’s doable right now via DSC and DisplayPort 1.4a or HDMI 2.1b, the displays and PC hardware needed to drive such displays aren’t generally within reach of consumer budgets. Top-tier GPUs like the GeForce RTX 4090 and GeForce RTX 5090 sort of overcome that limitation, but 8K pixel densities often outstrip modest human eyesight. By the time 8K becomes a viable resolution, both in price and in the GPU performance required to run it adequately, we’ll likely have gone through another generation or three of GPU hardware.

HDMI: Ubiquitous Consumer Electronics 

(Image credit: HDMI.org)

Updates to HDMI have kept the standard relevant for over 20 years. The earliest versions of HDMI have become outdated, but later versions have increased bandwidth and features.

HDMI 2.0b and earlier are ‘worse’ in some ways compared to DisplayPort 1.4, but if you’re not trying to run at extremely high resolutions or refresh rates, you probably won’t notice the difference. Full 24-bit RGB color at 4K 60 Hz has been available since HDMI 2.0 released in 2013, and higher resolutions and/or refresh rates are possible with 4:2:0 YCbCr output — though you generally don’t want to use that with PC text, as it can make the edges look fuzzy.

For AMD FreeSync users, HDMI has also supported VRR via an AMD extension since 2.0b, but HDMI 2.1 is where VRR became part of the official standard. AMD and Nvidia both support HDMI 2.1 with VRR, starting with Turing and RDNA 2. (Intel has also supported the standard since its first Alchemist GPUs.) Nvidia opted to call its HDMI 2.1 VRR solution “G-Sync Compatible,” and you can find a list of all the officially tested and supported displays on Nvidia’s site.

One major advantage of HDMI is that it’s ubiquitous. Millions of devices with HDMI shipped in 2004 when the standard was young, and it’s now found everywhere. These days, consumer electronics devices like TVs often include support for three or more HDMI ports. TVs and consumer electronics hardware have been shipping HDMI 2.1 devices for a while, before PCs even had support. Finding a TV with a DisplayPort input, by contrast, remains very uncommon.

HDMI cable requirements have changed over time, just like DisplayPort. One of the big advantages is that high quality HDMI cables can be up to 15m (49.2 feet) in length — five times longer than DisplayPort. That may not be important for a display sitting on your desk, but it can definitely matter for home theater use. Originally, HDMI had two categories of cables: category 1 or standard HDMI cables are intended for lower resolutions and/or shorter runs, and category 2 or “High Speed” HDMI cables are capable of 1080p at 60 Hz and 4K at 30 Hz with lengths of up to 15m.

More recently, HDMI 2.0 introduced “Premium High Speed” cables certified to meet the 18 Gbps bit rate, and HDMI 2.1 has created a fourth class of cable, “Ultra High Speed” HDMI that can handle up to 48 Gbps. HDMI also provides for routing Ethernet signals over the HDMI cable, though this is rarely used in the PC space.

We mentioned licensing fees earlier, and while HDMI Technology doesn’t explicitly state the cost, this website details the various HDMI licensing fees as of 2014. The short summary: for a high-volume business making a lot of cables or devices, it’s $10,000 annually, and $0.05 per HDMI port provided HDCP (High-bandwidth Digital Content Protection) is used and the HDMI logo is displayed in marketing material. In other words, the cost to end users is easily absorbed in most cases — unless some bean counter comes down with a case of extreme penny pinching.

Like DisplayPort, HDMI also supports HDCP to protect the content from being copied. That’s a separate licensing fee, naturally (though it reduces the HDMI fee). HDMI has supported HDCP since the beginning, starting at HDCP 1.1 and reaching HDCP 2.2 with HDMI 2.0. HDCP can cause issues with longer cables, and ultimately it appears to annoy consumers more than the pirates. At present, known hacks / workarounds to strip HDCP 2.2 from video signals can be found.

HDMI 2.1 allows for up to 48 Gbps signaling rates, and it also supports DSC. Theoretically, that means resolutions and refresh rates of up to 4K at 480 Hz or 8K at 120 Hz are supported over a single connection and cable. We’re not aware of any 4K 480 Hz displays yet, though there are prototype 8K 120 Hz TVs that have been shown at CES.

DisplayPort vs. HDMI: The Bottom Line for Gamers 

HDMI and DisplayPort connections

We’ve covered the technical details of DisplayPort and HDMI, but which one is actually better for gaming? Some of that will depend on the hardware you already own or intend to purchase. Both standards are capable of delivering a good gaming experience, but if you want a great gaming experience, right now DisplayPort 1.4 is generally better than HDMI 2.0, HDMI 2.1 technically beats DP 1.4, and DisplayPort 2.1 trumps HDMI 2.1 (provided you have UHBR13.5 or higher support). The problem is that you’ll need support for the desired standard from both your graphics card and your display for things to work right.

For most Nvidia gamers, your best option right now is a DisplayPort 1.4 connection to a G-Sync certified (compatible or official) display. Alternatively, HDMI 2.1 with a newer display works as well. Both the RTX 30-series and 40-series cards support the same connection standards, for better or worse. Most graphics cards will come with three DisplayPort connections and a single HDMI output, though you can find models with two HDMI and two (or three) DisplayPort connections as well — only four active outputs at a time are supported. RTX 50-series GPUs meanwhile can benefit from DisplayPort 2.1 monitors, so if you’re planning on picking up a 4K 240 Hz display with an RTX 5080 or 5090, that’s a potent combination.

AMD gamers have a few more options. You can find DisplayPort 2.1 monitors and TVs, if you look hard enough. The Asus ROG Swift PG32UXQR for example supports DisplayPort 2.1. HDMI 2.1 connectivity is also sufficient, and there are more displays available. Keep in mind that maximum bandwidth of both the RDNA 3 and RDNA 4 GPUs is 54 Gbps over DisplayPort 2.1, or 48 Gbps over HDMI 2.1, so it’s not a huge difference. Most AMD RX 7900-series cards that we’ve seen include two DisplayPort 2.1 ports, and either two HDMI 2.1 or a single HDMI 2.1 alongside a USB Type-C connection. The newer RX 9070-series GPUs typically have triple DP2.1 and a single HDMI port.

Intel’s GPUs support DP2.1 UHBR10, with the Battlemage B-series parts adding a single UHBR13.5 port. VRR is also supported, if you have an appropriate Adaptive Sync monitor. Basically, you don’t want to try using a G-Sync display that isn’t Adaptive Sync compatible with either AMD or Intel GPUs.

What if you already have a monitor that isn’t running at higher refresh rates or doesn’t have G-Sync or FreeSync capability, and it has both HDMI and DisplayPort inputs? Assuming your graphics card also supports both connections (and it probably does if it’s a card made in the past eight years), in many instances the choice of connection won’t really matter.

2560×1440 at a fixed 144 Hz refresh rate and 24-bit color works just fine on DisplayPort 1.2 or higher, as well as HDMI 2.0 or higher. Anything lower than that will also work without trouble on either connection type. About the only caveat is that sometimes HDMI connections on a monitor will default to a limited RGB range, but you can correct that in the AMD, Intel, or Nvidia display options. (This is because old TV standards used a limited color range, and some modern displays still think that’s a good idea. News flash: it’s not.)

Other use cases might push you toward DisplayPort as well, like if you want to use MST to have multiple displays daisy chained from a single port. That’s not a very common scenario, but DisplayPort does make it possible. Home theater use on the other hand continues to prefer HDMI, and the auxiliary channel can improve universal remote compatibility. If you’re hooking up your PC to a TV, HDMI is usually required, as there aren’t many TVs that have a DisplayPort input.

You can do 4K at 60 Hz on both standards without DSC, so it’s only 8K or 4K at refresh rates above 60 Hz where you actually run into limitations on recent GPUs. We’ve used AMD, Intel, and Nvidia GPUs at 4K and 98 Hz (8-bit RGB) with most models going back to the 2016 era, and 4:2:2 chroma can push even higher refresh rates if needed. Modern gaming monitors like the Samsung Odyssey Neo G8 32 with 4K and up to 240 Hz are also available, with DisplayPort 1.4 and HDMI 2.1 connectivity.

Ultimately, while there are certain specs advantages to DisplayPort and some features on HDMI that can make it a better choice for consumer electronics use, the two standards end up overlapping in many areas. The VESA standards group in charge of DisplayPort has its eyes on PC adoption growth, whereas HDMI is defined by a consumer electronics consortium and thinks about TVs first. But DisplayPort and HDMI end up with similar capabilities.

Older Pixels just got a big performance boost, while the Pixel 9a is lacking a key feature

Better GPU performance for Tensor and no satellites for the 9a

Google Pixel 9 in green Wintergreen color showing AI features on screen

The Google Pixel 9 (Image credit: Philip Berne / Future)


  • Pixel models from the Pixel 6 onwards are achieving improved benchmark results
  • This is likely down to newer GPU drivers being added in recent software updates
  • In less positive news, the Pixel 9a lacks satellite communication features

There’s both good and bad Pixel news today, but the good news will affect more people than the bad, so let’s start there.

Reddit users are finding that Pixel phones with Tensor chipsets (meaning everything from the Google Pixel 6 onwards) are achieving much higher GPU scores on Geekbench 6 than they did at launch. This is widely being attributed to the Android 16 beta, but Android Authority reports seeing similarly upgraded performance on Android 15.

So chances are you don’t need to grab a beta version of Android to see improvements, but rather that recent stable software updates have massively boosted GPU performance.

The exact boost varies depending on the model, but Android Authority claims its Pixel 6a unit saw a nearly 23% GPU performance increase, while elsewhere there are reports of a 62% improvement for the Pixel 7a, a 31% improvement for the Pixel 8, and even a 32% improvement for the recent Google Pixel 9.

Android Authority speculates that Google achieved this by including newer GPU drivers in recent Android updates, as while all recent Pixels use an Arm Mali GPU, they don’t always ship with the latest available GPU driver version.

How much impact these performance improvements will have in the real world remains to be seen, but they’re nice to see, and could help extend the lifespan of older Pixel models.

No Satellite SOS for the Pixel 9a

The Google Pixel 9a (Image credit: Google)

Now for the bad news, which relates specifically to the new Google Pixel 9a: we’ve learned that it doesn’t support Satellite SOS. Google confirmed as much to Android Authority. This is a feature found on the other Google Pixel 9 models which allows you to contact emergency services in areas without Wi-Fi or cell signal.

So it’s a potentially life-saving tool, and while Google didn’t say why it’s absent here, it’s likely because the Pixel 9a uses an older Exynos Modem 5300, rather than the 5400 used by the rest of the Pixel 9 series.

While this is a feature that you’ll hopefully never need to use, it would be reassuring to have, and this isn’t the only omission in the Pixel 9a, as we also recently learned that it lacks several AI tools offered by the rest of the Pixel 9 line.

In fact, this phone has had a slightly troubled launch, with not just these omissions emerging, but also a delay in sales of the phone while Google investigates a “component quality issue”.

Still, the silver lining there is that this delay allowed time for these omissions to be uncovered, so you might think twice about buying the Google Pixel 9a. Certainly, we’d wait until we’ve had a chance to put it through a full review before purchasing one.

Which is better for astrophotography? Sony, Canon or Nikon?

Three giants in the camera industry — but which one has the best astrophotography features for you?

The milky way galaxy as seen over a mountain range.

The Milky Way is a popular target for astrophotographers. (Image credit: Getty Images)

It can be a complex decision to choose the right camera brand for astrophotography. With new technology and an ever-increasing rate of camera development, the major photographic brands that we see regularly popping up, like Canon, Nikon, and Sony, each bring distinct strengths to the table. Telling these strengths apart and knowing what to pay attention to can sometimes be difficult. In astrophotography, key factors like low-light performance, high ISO capabilities, noise reduction, and any included astro-centric features are paramount. If you are in the market for a camera, it might be worth checking out these brands in our camera deals hub. In this article, we will discuss how the brands compare and answer common tech queries about each system.

Sony

Sony excels at handling high ISO with minimal noise. (Image credit: Getty Images)

Firstly, Sony recently broke ground with its cutting-edge sensor technology. Sony’s mirrorless systems, especially the full-frame options, excel in handling high ISO settings with minimal noise, making them popular with astrophotographers and people shooting in low-light environments. Their backside-illuminated sensors (often referred to as BSI sensors) allow more light to hit the pixel area, improving sensitivity and reducing noise, which is crucial for capturing faint celestial objects. Sony also has excellent real-time star tracking and pixel-shift technology to provide clear, high-detail astro images, but much of this comes at a price, with good full-frame Sony options coming in at thousands of dollars.

Canon

Canon’s exceptional color science makes the brand a favorite among astrophotographers. (Image credit: Getty Images)

Canon has long been a favorite for astrophotography thanks to its exceptional color science and sensor performance at higher ISO values, although its cameras tend to produce slightly more noise than Sony’s comparable models. Canon’s strength lies in its wide range of compatible lenses and dedicated low-light features in select models such as the Canon EOS R5 and EOS R6. The affordability of many of its models means that second-hand Canon options can often be purchased for a fraction of the price of many newer options.

Nikon

Nikon has recently stepped up its game with its mirrorless Z-series cameras, like the Nikon Z6 and Z7. Nikon’s full-frame cameras boast excellent dynamic range and superb low-light performance, making them competitive for astrophotography. Like Sony, Nikon’s backside-illumination sensor technology helps with reducing noise at high ISOs, though Nikon’s high ISO noise reduction isn’t always as aggressive as Sony’s. Nikon also offers specialized features such as ‘star-eater’ suppression, designed to maintain image quality during long exposures, which will come in handy for photographers pointing towards the night sky.

‘Star eater’ suppression is designed to maintain image quality during long exposures. (Image credit: Getty Images)

Best entry-level cameras for astrophotography

If you are just getting started in astrophotography, you will want to look at setups that allow experimentation — a mixture of affordability and a wide range of lenses is important. The Nikon D5600 is a good starting point — plenty of lens combinations are available, and the 24.2 MP sensor has an ISO range of 100-25,600. It is also good at low-light autofocus, which, although you’ll rarely use it as a budding astrophotographer, is a good option to have if you are taking photos of a supermoon, for example.

Low light autofocus is a good feature to have when photographing a supermoon. (Image credit: jamesvancouver via Getty Images)

We would also recommend something like the ultra-portable Sony Alpha a6400, which, although still an APS-C sensor, offers options for interchangeable lenses and features real-time tracking, where it uses AI behavior models to detect and capture objects in the distance automatically.

Camera | Sensor | ISO Range | Megapixels | Price | Astro features
Sony Alpha a6400 | APS-C | 100-32,000 | 24.2 | $900 | Real-time tracking
Canon EOS Rebel T7i (800D) | APS-C | 100-25,600 | 24.2 | $750 | Dual Pixel AF
Nikon D5600 | APS-C | 100-25,600 | 24.2 | $700 | Low-light AF

Best mid-range cameras for astrophotography

Long exposure shooting is unavoidable in astrophotography (Image credit: Getty Images)

If you have learned the ropes and you are looking to take a step up from entry-level gear, mid-range cameras offer enhanced performance, better noise control, and several more advanced features that you are likely to use in other forms of photography, too. They are also more likely to feature full-frame sensors, which means that you will produce larger images and more detail to work with in the edit.

The Sony Alpha A7 III is an excellent choice for serious astrophotographers on a mid-range budget. Its full-frame sensor performs excellently in low-light scenarios, minimizing noise and maximizing detail in dark environments. Nikon’s Z6 is also an option to consider – it strikes a good balance between price and performance and, like the A7 III, it uses a backside-illumination sensor to allow more light to reach each pixel, making it great for night-sky photography. It also features in-body stabilization and excellent star tracking to help with long-exposure shots.

Camera | Sensor | ISO Range | Megapixels | Price | Astro features
Sony Alpha A7 III | Full-frame | 100-51,200 | 24.2 | $2,000 | Excellent ISO handling
Canon EOS R | Full-frame | 100-40,000 | 30.3 | $1,800 | RF lens mount offers lots of choice
Nikon Z6 | Full-frame | 100-51,200 | 24.5 | $1,600 | Star tracking

Should you switch camera brands?

Once you have committed to a camera brand and bought the associated lenses and accessories, it is sometimes quite hard to go back to the drawing board. However, if other brands or cameras have specific attributes that will help you in astrophotography, it is well worth looking at the implications of switching. We will take each option and weigh it up.

Switching to Canon

If you are moving from Sony to Canon, expect to see a difference in image color science, with Canon often praised for its color accuracy. Switching to Canon can be a money saver, thanks to the high availability of lenses and bodies.

Switching away from either Sony or Nikon towards Canon may only be necessary for those looking for a camera for life, full of professional specs, like the Canon EOS R5.

One thing that is worth reiterating is that Canon’s widespread popularity means lens options (new and used) are plentiful — so worth considering for those who want to save money.

Switching to Sony

Sony’s high ISO noise reduction is impressive for capturing stars and deep sky objects. (Image credit: Kimberley Lane)

Although it will hit your wallet a little harder, switching to Sony offers better high ISO noise reduction and sensor technology, making it ideal for those focused on capturing deep-space objects. Switching would require an investment in Sony’s full-frame mirrorless lens system, but the superior noise handling at high ISOs can be a game-changer — Sony’s bodies are some of the best cameras you can buy at each price point.

Switching to Nikon

Nikon offers a comparable dynamic range to its competitors and solid low-light performance in its mirrorless Z-series, but the shift away from Sony or Canon may not bring a significant leap in performance unless you value Nikon’s lens ecosystem or prefer Nikon’s ergonomics and handling.

Some photographers prefer Nikon’s ergonomics and handling.

Nikon’s dynamic range, however, is often praised, and this can benefit landscape astrophotographers who want more flexibility in post-processing. The growing range of Z-mount lenses can make the switch worth it, especially for those prioritizing dynamic range over specialized astro features.

Summary

Deciding on whether to switch depends on your astrophotography goals. (Image credit: Getty Images)

The decision to switch between Sony, Canon, and Nikon largely depends on your specific astrophotography goals. Each brand offers its own set of strengths, whether it’s Sony’s superior ISO handling, Canon’s specialized low-light features, or Nikon’s dynamic range. Cost, lens availability, and compatibility with your existing astrophotography setup (including telescopes) should also factor into the decision. It is worth looking at the used market to ensure you are getting the best setup for your dollar. It will also come down to the ergonomics and usability of each camera; which menu system and which interfaces you feel most comfortable with.

Windows 11 Insider builds offer FAQs based on your PC’s specs

Still not as capable as WinSAT back in the old days.

Windows 11 logo on blue bloom background.

(Image credit: Microsoft)

Microsoft has released a new feature in its latest 26120.3576 and 22635.5090 Insider builds that helps users understand their PC’s capabilities (via phantomofearth on X).

By the looks of it, this feature isn’t cutting-edge by any means, as a simple internet search will likely land you many more detailed and fine-tuned answers. In addition, it also tells you if you have the latest Windows release installed on your system, but oddly asks you to verify that information in the Windows Update section.

Back in the Vista days, Microsoft released WinSAT, a tool designed to assess the capabilities of your hardware through a set of tests. It would report back a score on a scale between 1.0 and either 5.9 or 9.9, depending on the Windows version. While this number wasn’t definitive, it let people know where their PC stood. WinSAT also benchmarked individual components such as the CPU, memory, graphics, and disk, enabling users to identify potential bottlenecks.

The new FAQ feature spans two different Windows Insider releases: build 26120.3576 for Windows Insiders in the Dev and Beta Channels on Windows 11 24H2, and build 22635.5090 for Windows Insiders in the Beta Channel on Windows 11 23H2. Release notes for both releases mention the inclusion of a new “Frequently Asked Questions” list within “Device specifications” under System > About. The FAQs are disabled by default, though they can be activated by enabling the “55305888” Feature ID (typically toggled with a third-party utility such as ViVeTool).

Known FAQs (non-exhaustive) detailing your Windows version, how much RAM you have, and your system’s graphical capabilities have been compiled in a GitHub Gist. The first question allows users to check if their Windows is up to date, but confusingly asks them to verify again within the Windows Update section. That’s most certainly a bug. The second question outlines what applications can run on your system based on its RAM capacity, using a few common ranges such as <4GB, 4GB-8GB, 8GB-16GB, and >16GB. Lastly, the graphics section goes over the graphical capabilities of your system, or lack thereof if you don’t have a dedicated GPU.

This approach is not as quantifiable as WinSAT and omits assessment of key components such as the CPU and storage. Given Microsoft’s strong push for AI, an AI-driven hardware analysis utility would have been a worthy successor to WinSAT. (Maybe I’m just reading too much into what’s supposed to be a supplementary feature.) Hopefully, the official release will include more detailed insights that give users more than just the basics.