US President-elect Donald Trump has claimed a “magnificent victory” over US Vice-President Kamala Harris in the US presidential election, eight years after winning his first term as president.
As world leaders congratulate Trump on his win, Americans across the country are either mourning or celebrating the results – and some even appear to be considering leaving the country.
“Move to New Zealand” is trending on Google Search, with Google Trends data showing interest from US internet users spiked on election night, November 6.
Most of these searches came from users in Oregon on the West Coast, followed by those in Vermont, Hawaii, Colorado and Montana.
The top five most popular related search queries included, “Immigrate to New Zealand”, “New Zealand immigration”, “How can I move to New Zealand”, “How to move to Canada from US”, and “New Zealand citizenship”.
These were followed by searches like “Can an American move to New Zealand” and “Cost of living in New Zealand”.
Google Trends scores search interest on a scale from 0 to 100, with 0 meaning not enough data and 100 indicating peak popularity. At 9.08pm on November 6, search interest in “Move to New Zealand” sat at 100.
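Google doesn’t publish its exact normalization method, but the 0–100 scale can be sketched as relative scaling against the peak of the chosen time window. The function name and sample search volumes below are hypothetical, purely for illustration:

```python
def trends_scale(raw_counts):
    """Scale raw interest values to a Trends-style 0-100 range.

    A simplified sketch: Google doesn't publish its exact method,
    but values are normalized against the peak within the chosen
    time window, so the busiest moment always scores 100.
    """
    peak = max(raw_counts)
    if peak == 0:
        return [0 for _ in raw_counts]  # 0 signals insufficient data
    return [round(100 * value / peak) for value in raw_counts]


# Hypothetical hourly search volumes on election night
hourly = [120, 340, 560, 890]
print(trends_scale(hourly))  # → [13, 38, 63, 100]; the peak hour maps to 100
```

Because the scale is relative, a score of 100 says nothing about absolute search volume, only that the interest hit its peak for that window and region.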
Meanwhile, others have taken to social media to share their desire to leave the US following the election results, with one writing, “Any good places you can recommend I move to in NZ? LOL”.
One Kiwi wrote on social media platform X, “For the US homies. Feel free to move to NZ. I got spare beds, a warm place and plenty of help.”
Another shared a link to the New Zealand Immigration website, simply writing, “Just leaving this here for the real ones”, alongside an American flag emoji.
Search interest in moving from the US to Australia has also risen, according to the search engine – with most searches coming from the state of Maine, followed by Washington, Colorado and Oregon.
Searchers queried, “Is it hard to move to Australia”, “How to immigrate to Australia”, and “Moving to Australia from USA”.
It’s reminiscent of the 2020 US election, when many Americans took to Google following the first “chaotic” debate between then-candidate Joe Biden and Trump.
Apple News lets users follow news with a widget that updates in real time
The feature leverages the power of Dynamic Island on the iPhone 14 Pro and later
To make it easier to follow the results of the US elections, Apple has implemented a very useful measure: the company is displaying the results in real time on the iPhone’s lock screen, so you can see the progress of the count at a glance.
On top of that, users of the most recent iPhones can see the count directly from the Dynamic Island (available on the iPhone 14 Pro and later models) while doing other things with the phone. Of course, to see this information on the iPhone’s screen, you must have opted in beforehand.
The feature takes advantage of the potential of iOS Live Activities
The feature is part of Apple News, a free app available in the United States, Canada, the United Kingdom and Australia that compiles relevant news about the topics that interest you most. As AppleInsider reports, the application is taking advantage of Live Activities to show real-time information about the United States elections.
Live Activities are commonly used in all kinds of apps to surface relevant information without making the user open the corresponding app. For example, they are available in applications such as Uber, showing the time left until the driver picks you up along with details of interest, such as the license plate and model of the vehicle.
The widget displayed by Apple News shows users the electoral vote count for both US presidential candidates, as well as the results of the elections for the Senate and the House of Representatives. Without a doubt, it is a very convenient way to stay updated at all times.
The widget is not only available on the iPhone: it also appears on the iPad and the Apple Watch. However, as we mentioned at the beginning of the article, it is necessary to enable the tracking of the results. This is something that can be done through Apple News, where Apple has included a special banner for the elections.
At this point, it is worth noting that this is not the first time Apple News has used Live Activities in this way: at the beginning of the year, it did the same to report the results of the caucuses, where each party’s candidate for the general election is chosen.
The iPhone 16 Pro Max (Image credit: Future / Lance Ulanoff) Apple, it seems, is all about ‘intelligence’ these days, as following on from the launch of Apple Intelligence, the company might be about to roll out a feature dubbed Battery Intelligence. But unlike Apple Intelligence’s suite of tools, this new feature will seemingly have just one job.
9to5Mac has found a framework called ‘BatteryIntelligence’ in code from the latest iOS 18.2 beta, and while the feature isn’t active in the beta, it’s reportedly designed to estimate how long it will take to charge your phone.
According to 9to5Mac, Battery Intelligence won’t just estimate the time to reach 100%, but it’ll also let you choose to get an estimate for when your battery will reach 80%, and perhaps other charge levels.
It’s easy to see how this could be a useful tool. If you don’t have long to charge, you’ll get an immediate indication of how full your phone’s battery can get in a given time. Or, if you don’t want to charge your phone above a certain level (in order to preserve battery health) but haven’t set a charging limit, you’ll know when to check back based on the estimate.
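Apple hasn’t documented how Battery Intelligence computes its estimates, but the underlying arithmetic of a charge-time estimate can be sketched naively. Everything here is an assumption for illustration: the function name, the linear charging model, and the battery capacity and charge-rate figures.

```python
def minutes_to_target(capacity_mah, current_pct, target_pct, charge_rate_ma):
    """Naive linear estimate of minutes needed to reach target_pct.

    Real chargers taper as the battery fills, so an actual feature
    like Battery Intelligence would need a more sophisticated model;
    this only illustrates the basic arithmetic.
    """
    if target_pct <= current_pct:
        return 0.0  # already at or past the target level
    needed_mah = capacity_mah * (target_pct - current_pct) / 100
    return 60 * needed_mah / charge_rate_ma


# Hypothetical numbers: a 4,500 mAh battery at 30%, charging at 2,250 mA
print(minutes_to_target(4500, 30, 80, 2250))   # → 60.0 (minutes to 80%)
print(minutes_to_target(4500, 30, 100, 2250))  # → 84.0 (minutes to 100%)
```

In practice, the final stretch to 100% takes disproportionately long because charging current drops near full, which is exactly why a per-level estimate (80% vs. 100%) would be useful.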
However, as this feature isn’t currently enabled, it’s unclear whether it will actually launch as part of iOS 18.2. It might land with iOS 18.3 or beyond instead, and it’s always possible that Apple will choose not to roll this tool out at all.
Still, there’s a good chance that Battery Intelligence will launch, and since the framework for it is present in iOS 18.2 beta code, we’d think it’s likely we’ll see it sooner rather than later.
Either way, there’s plenty more to get excited about in iOS 18.2, with numerous new AI tools coming, including Image Playground, Genmoji, ChatGPT integration, and Visual Intelligence.
iOS 18.2 will also add improvements to the Camera Control button and the Find My feature, and it should be coming soon, with iOS 18.2 reportedly landing in early December.
The next Pixel update has been accidentally teased early by a Verizon support page that detailed all the fixes coming to the Google Pixel 9 line-up – and older Pixel 6, 7, and 8 devices.
First spotted by 9to5Google, the patch was supposed to launch on November 5 according to the Verizon page; however, the update isn’t yet live at the time of writing. That said, we expect it’ll roll out soon, bringing the following improvements to your Pixel 9 smartphone via version AP3A.241105.008:
Security
Provides the most up-to-date Android security patches on your device.
Bluetooth
Addressed an issue with Bluetooth range under certain conditions.
Camera
Addressed an issue with camera tilt when zooming between cameras under certain conditions.
Sensors
Addressed an issue that occasionally prevented Adaptive brightness from activating in certain conditions.
Touch
Addressed an issue when pressing the keyboard dismiss button in certain conditions.
User interface
General improvements for performance and stability in certain UI transitions and animations.
Display & Graphics
Resolved an issue that caused white dots to flash under certain conditions.
(Image credit: Google) The AP3A.241105.007 update for the Google Pixel 6, Google Pixel 7, and Google Pixel 8 devices includes fewer stability fixes and instead just includes these details:
The most up to date Android security patches on your device.
General improvements for performance and stability in certain UI transitions and animations.
So, nothing too major for any of the Pixel phones, unlike the arrival of Android 15 last month. However, if your Google Pixel 9, Pixel 9 Pro, or Pixel 9 Pro Fold has had problems with any of the listed issues, you’ll appreciate that they’ve finally been addressed.
As for non-Pixel updates, Samsung users are still waiting for the full Android 15 launch, though a One UI 7 beta has been tipped to land later this month with new icon styles, lock screen live widgets, and math and physics support in Circle to Search among other tools. The non-beta rollout is expected sometime in early 2025.
The ideal assistant when you see way too many message notifications
(Image credit: Google) Step away from your computer or phone for a few minutes, and you might return to an avalanche of notifications about new messages. If you’re using Google Chat, you won’t have to comb through them all to find out what you missed anymore, thanks to Google’s Gemini AI assistant. You can pull up Gemini from the Google Chat sidebar and ask the AI to summarize the conversation and dig into the most important bits.
The new feature expands Gemini’s presence from other Google Workspace applications like Docs and Drive into Google Chat. If you click on the “Ask Gemini” icon at the top of the Google Chat interface, a chat window for the AI will appear where you can ask about what’s been said in group chats, direct messages, and spaces.
If you ask the AI to “catch me up,” you’ll get a complete conversation summary, which you can ask for in bullet points. If it’s too short, you can request a longer summary too. You can also ask for more specific details, like any requests for help, key takeaways, or other decisions made in the thread. You can even ask about other people’s tasks or what a person said about specific topics.
Gemini can’t sort through your entire conversation history, just the current view. It’s also restricted to Google Chat; that means no emails or files in Google Drive, despite Gemini having a presence in and access to those applications in other circumstances. Google claims this is deliberate, as it keeps the focus on the current chat without pulling in irrelevant information. Plus, tightly constraining data sources reduces the risk of Gemini hallucinating. You also won’t be able to pull up Gemini in Google Chat without a subscription to Gemini Business, Enterprise, Education, or Education Premium.
Even with those (likely temporary) limits to availability, bringing Gemini to Google Chat fits with how Google is working to embed Gemini across all of its platforms and services. That includes Gemini Extensions to take up Google Assistant’s role with Google Messages, Maps, and pretty much everything Android does.
The Eye Tracking accessibility feature follows the movement of your eyes as you gaze at the screen, select items, and activate controls.
Most people navigate their iPhone or iPad with their fingers, but Apple recently added eye tracking to its suite of accessibility features, which also support head- and voice-based gestures.
Known as Eye Tracking, this new accessibility option is available in iOS 18 and iPadOS 18 on iPhone 12 and up, and it follows the movement of your eyes as you look at the screen, select items, and activate controls. An onscreen pointer shows where your eyes are looking, and a dwell control lets you perform a specific action by holding your gaze on an item for a few seconds.
Eye Tracking is designed for people with physical disabilities, but anyone can use it. It doesn’t require any extra hardware or accessories, and it can activate physical buttons, swipes, and other gestures across all your apps.
Tapping into artificial intelligence, Eye Tracking uses the front-facing camera to track your eye movements. You can set up and calibrate the feature relatively quickly. Any data used to set up and control Eye Tracking is processed through machine learning and secured on the device and isn’t shared with Apple. Now, here’s how this works.
First, you’ll need to update your iPhone to iOS 18 and your iPad to iPadOS 18, if you haven’t already. Head to Settings > General > Software Update and install the new OS. To enable Eye Tracking, go to Settings > Accessibility > Eye Tracking and enable Eye Tracking.
You’ll be prompted to calibrate the tracking by following a dot around the screen. For best results, make sure your face is well lit and the phone’s camera has a clear view. The phone should also be on a stable surface about a foot and a half away from your face.
Follow the dot with the gaze of your eyes. When you’re done, a checkmark appears on the screen, indicating that you’ve successfully completed the setup. Whenever you turn off Eye Tracking and then turn it back on, you’ll need to go through the calibration again.
You’re then returned to the Eye Tracking settings screen, where you can get a feel for eye tracking. Move your gaze slowly up and then back down. You’ll see an outline around each icon, setting, or other item as you adjust your gaze.
To select a control, keep your gaze on the area outlined by the cursor. A circular dwell indicator will then appear around it. When that circle is completely filled in, the control you’re looking at will activate or toggle.
Change Eye Tracking Settings
At the Eye Tracking settings screen, you can tweak its behavior. The Smoothing option controls the movement of the pointer. Move the slider to the right to make the movement smoother. Note that moving it too far to the right could decrease its responsiveness.
Snap to Item automatically snaps the pointer to the nearest item as you gaze around the screen. The Zoom on Keyboard Keys option zooms in when the pointer is resting on one of the keys on the keyboard. Turn the switch off or on for each of the options to test them.
Turn on Auto-Hide to automatically hide the cursor when you look away. Tap the plus (+) and minus (–) keys to set the amount of time you need to gaze at the screen to display the cursor again. Move the slider to decrease the visibility of the pointer when you’re in Auto-Hide mode.
You can also customize the onscreen pointer to increase or decrease the size and change the color. Head to Settings > Accessibility > Pointer Control and move the slider on the Pointer Size to alter the size of the pointer. Tap Color to change the color to white, blue, red, green, yellow, or orange.
Enable and Tweak Dwell Control
Enabling Dwell Control in the Eye Tracking settings menu allows you to hold your gaze on a switch or other object to perform the designated action. For example, holding your gaze on a switch will turn it on or off. To further customize Dwell Control, tap AssistiveTouch to bring up the AssistiveTouch menu.
Choose the Dwell Control icon to turn the feature off or on, switch the default action between Tap and Pause Dwell, adjust the distance you can gaze while dwelling on an item, and set up specific actions when you dwell in any of the four corners of the screen.
How to Use Eye Tracking
Once everything is set up, you’ll want to take Eye Tracking for a spin on your iPhone. Go to the Home screen and point your gaze in different areas of the screen to see how the cursor moves. Rest your gaze on a specific icon to open the associated app.
You’ll need to use AssistiveTouch to perform more types of actions with Eye Tracking. To activate this, hold your gaze on the AssistiveTouch circle moving around the screen. After the menu appears, direct your eyes toward the icon for the action you want to perform. For example, to return to the Home screen, hold your gaze on the Home icon.
Recalibrate Eye Tracking
If the pointer isn’t responding accurately to your eye movements or gaze, try recalibrating the feature. Return to Settings > Accessibility > Eye Tracking and turn the switch off and back on.
As you go through the setup again, keep your iPhone as steady as possible and 1.5 feet away from your face. Also, don’t try to anticipate where the calibration circle is going to appear. Keep your gaze steady on the current spot and move your eyes as the circle moves to the next spot.
Samsung’s next premium set of smartphones has been pictured ahead of its release early next year. While the Galaxy S25 and Galaxy S25 Plus are hard to tell apart from the current Galaxy S24 and Galaxy S24 Plus, the Galaxy S25 Ultra should be easy to distinguish from the outgoing Galaxy S24 Ultra.
The Galaxy S25 Ultra should look a fair bit different from Samsung’s current ‘Ultra’ smartphone. (Image source: via Roland Quandt)
The Galaxy S25 series is likely still a way off from release, and Samsung has not confirmed anything yet. However, the recent volume of Galaxy S25, Galaxy S25 Plus and Galaxy S25 Ultra leaks implies that another official introduction in January should be expected.
To that end, it now appears that third parties are getting their accessories in order for the replacement of the Galaxy S24, Galaxy S24 Plus and Galaxy S24 Ultra early next year. Specifically, Roland Quandt has shared images of what look like unofficial clear cases on X (formerly Twitter). As is often the case, it is unclear who has produced these cases.
Nonetheless, they match existing expectations for the Galaxy S25 series, which could be available exclusively with Qualcomm’s recent Snapdragon 8 Elite chipset. The cases suggest the Galaxy S25 and Galaxy S25 Plus will be the spitting image of their predecessors: both devices should launch with three rear-facing cameras arranged vertically, thin display bezels and a flat overall design.
Meanwhile, the Galaxy S25 Ultra will represent another modest redesign for Samsung’s ‘Ultra’ smartphones. Based on all available information, next year’s Galaxy S Ultra will move away from its predecessor’s boxy design, albeit while retaining a familiar camera layout and a flat display. Supposedly, the Galaxy S25 Ultra will also be smaller than the Galaxy S24 Ultra, in aid of improved ergonomics. Incidentally, colour options for all three models recently leaked online, which we have covered separately.
Samsung Galaxy S25. (Image source: via Roland Quandt)
Xiaomi has now confirmed roughly when it will release its recent Redmi Note 14 series internationally. At the same time, the company has also revealed where the Redmi Note 14, Redmi Note 14 Pro and Redmi Note 14 Pro Plus will debut first, while a leaker has provided a specific date for their release.
The Redmi Note 14 Pro Plus will be the most powerful option in the Redmi Note 14 range. (Image source: Xiaomi)
Xiaomi has released three Redmi Note 14 series smartphones to date, which serve as direct replacements for last year’s Redmi Note 13 5G, Redmi Note 13 Pro 5G and Redmi Note 13 Pro Plus 5G. There has been evidence that Xiaomi will eventually introduce corresponding 4G models though, just as it has with the Redmi Note 13 series.
With that being said, the Redmi Note 14 5G, Redmi Note 14 Pro 5G and Redmi Note 14 Pro Plus 5G have been exclusive to China since their simultaneous release in late September. However, Xiaomi has now offered the first hint about their international debut.
Somewhat unsurprisingly, the company is remaining light on specifics for now. Nonetheless, BW Businessworld reports that Xiaomi intends to launch Redmi Note 14 models before the end of the year in India. To that end, outgoing Xiaomi India president Muralikrishnan B outlined as much in an accompanying interview:
After 2022, we slowed down to an annual [Redmi Note] cycle… now, with improved efficiency and manufacturing alignment, we’re ready to return to a dual-launch approach.
According to BW Businessworld, this Indian-specific launch will occur in December. While the magazine has not provided a date yet, leaker Sanju Choudhary suggests that all three devices could officially arrive on December 26. Although pricing remains unofficial for now, please see our Redmi Note 14 5G, Redmi Note 14 Pro 5G and Redmi Note 14 Pro Plus 5G launch articles to see how the trio compare against Xiaomi’s current mid-range models.
Samsung is gearing up to launch the Galaxy S25 Ultra, powered by a customized version of Qualcomm’s latest Snapdragon 8 Elite processor, specifically tailored as a “For Galaxy” edition. This chipset, built on a 3nm process, promises substantial improvements over previous generations, with Qualcomm touting a 45% CPU performance increase and a 44% boost in power efficiency.
Galaxy S25 Ultra on Geekbench with improved scores
These upgrades align with the Galaxy S25 Ultra’s recent benchmark scores on Geekbench, where the device, identified by model number “SM-S938U,” achieves single-core and multi-core scores of 3,148 and 10,236, respectively.
This reflects a roughly 43% boost in single-core and a 44% jump in multi-core performance over the Galaxy S24 Ultra, which scores around 2,200 and 7,100 points in the respective tests.
For those unfamiliar, the S25 Ultra previously surfaced on the platform with single-core and multi-core scores of 3,069 and 9,080, respectively. So there has been some improvement in heat dissipation or performance optimization that enables the phone to reach higher scores.
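These generation-over-generation comparisons are simple percentage-change arithmetic over the reported Geekbench figures. A quick sketch (the function name is ours; the scores are the ones reported above, with the approximate 2,200/7,100 S24 Ultra baselines):

```python
def pct_gain(new_score, old_score):
    """Percentage improvement of new_score over old_score, rounded
    to the nearest whole percent."""
    return round(100 * (new_score - old_score) / old_score)


# Reported Geekbench scores: S25 Ultra vs. approximate S24 Ultra baseline
print(pct_gain(3148, 2200))   # → 43 (single-core gain, %)
print(pct_gain(10236, 7100))  # → 44 (multi-core gain, %)
```

Note that the exact percentages shift with the baseline chosen; published S24 Ultra scores vary by unit and test conditions.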
Samsung had reportedly been testing its own Exynos 2500 chipset for the Galaxy S25 line but appears to have committed to Qualcomm’s Snapdragon 8 Elite for mass production, potentially limiting the Exynos 2500 to select markets. While the Galaxy S25 Ultra’s early scores fall just below those achieved by the OnePlus 13’s standard Snapdragon 8 Elite (3,296 in single-core), Samsung’s “For Galaxy” chip tuning might improve the performance and stability even further by the time the phone hits the market.
Moreover, Samsung has retained its configuration of 12 GB RAM for the Galaxy S25 Ultra, consistent since the Galaxy S22 Ultra. Samsung is expected to maintain its established design language for the Galaxy S25 and S25 Plus, while rumors suggest a significant redesign for the Ultra model. The S25 Ultra could finally come with rounded corners and a completely flat display, similar to the other two models.
Want to use a monitor with your laptop but keep the lid closed? Here’s a quick guide on how to set it up.
(Image credit: Future)
If you’re not a fan of your laptop’s screen, you may prefer to connect up one of the best monitors instead. That way, you can get a more comfortable typing experience and a nicer viewing experience at the same time. Essentially, you’ll have all the advantages of a desktop computer setup, but with the option to disconnect your laptop when you need to work on the go.
There’s just one problem: you don’t want to be distracted by seeing the same thing on two separate screens. Well, actually, that’s not a problem at all. Because it’s pretty easy to set up your laptop so it carries on working even when closed. And that’s the case whether you use a Windows laptop or a MacBook (or any of our best graphic design laptops).
In this short article, we’ll explain how to do both, in turn. Be aware, though, that keeping your lid closed may lead to your laptop heating up more than usual, so you may want to invest in a cooling pad or similar solution.
How to use a monitor with a closed Windows laptop
Firstly, connect up your laptop, keyboard and mouse to your monitor in the usual way. Then you’ll need to tweak the settings of your Windows system.
Open the Control Panel. You can find this by typing ‘control panel’ into Windows search (the little magnifying glass at the bottom of your screen).
Choose Hardware and Sound from the list that appears. Another list will appear; click on Power Options.
(Note: if you’re still using Windows 10, rather than 11, you can get to this point by clicking on the battery icon at the bottom right hand corner of your screen.)
In the new box, there’s a list of options on the left-hand side. Click on Choose what closing the lid does. Another box will appear.
In the centre, there are three options. Next to When I close the lid, change the dropdowns (for both battery and plugged-in modes) from Sleep to Do nothing. Finally, click on Save changes at the bottom of the box.
The picture from your laptop should now appear on your monitor even when you close the lid. If it looks a little off, you can adjust the display settings by going to Settings > System > Display and then tweaking the settings to suit. (A shortcut to this is pressing the Windows key + I.)
How to use a monitor with a closed MacBook
Using one of the latest MacBooks with a monitor is easy: you don’t have to change a single setting in macOS. The only requirement is that you’ll need to plug your laptop into a power socket; otherwise, it may enter sleep mode after you close the lid.
Once you’ve done that, simply connect up your MacBook, keyboard and mouse to your monitor in the usual way. (If you’re having trouble with the first of these, read our article on How to connect a monitor to MacBook Pro.) Then, once your MacBook’s display appears on the monitor, close the lid. All done.
However, if you have an older MacBook and this doesn’t work, then you’ll have to tweak the settings like this:
1. Choose Apple menu > System Settings, then click Battery in the sidebar.
2. Click Options, then turn on Prevent automatic sleeping on power adapter when the display is off.
(Image credit: Future) Finally, if the picture from your MacBook looks funny on your monitor, click the Apple menu in the top-left corner of the screen, and go to System Settings or System Preferences > Displays. You’ll then be able to fiddle with the settings to get the picture right.