
Search results for 'Rural' - Page: 4
NewstalkZB - 28 Aug: Two men have been charged with cultivating cannabis at a rural address in Hawke’s Bay after a pre-planned police search uncovered what police allege is a “significant” operation.
The search warrant was carried out at a Puketitiri Rd address northwest of Napier, situated near the Kaweka Forest Park, on Friday, August 22.
The police said they seized what they described as a “significant amount of cannabis”.
A significant amount of cannabis was found during a police search at a Puketitiri Rd address. Photo / NZME
Several patrol vehicles were seen heading back to town on Puketitiri Rd after 6pm on Friday.
The men, aged 33 and 44, are due to reappear in Hastings District Court on September 16.
“If you have concerns about illegal drug use in your community, please call 111 if there is an immediate public safety risk, or contact us via 105 online or by phone to make a report,” the police said.
NewstalkZB - 28 Aug: A man swept away and killed in floodwaters during the Auckland Anniversary weekend floods was out to dinner with his wife the night he died, the inquest into the disaster has heard.
Dave Young, a father and grandfather, was one of four men who died on January 27, 2023. He was swept away in the rural town of Onewhero when a stream burst its banks on Allen and Eyre Rd.
Counsel assisting the bereaved families told the inquest yesterday that Young’s widow, Jane Young, believed her husband would not have died if an emergency mobile alert had been issued and they had been made aware of the danger.
Jane Young is expected to present her witness evidence later this week.
Today, while counsel assisting families cross-examined Waikato Civil Defence Emergency Management group controller Aaron Tregoweth on why a mobile alert was not sent, more details of the Youngs’ night have emerged.
They had been to a restaurant, which was filled with other diners, and nobody in the building was aware of anything “other than it was raining”, families’ counsel told the inquest.
“No weather alerts, such as text message warnings, were issued that evening,” counsel said.
“‘If we had received weather warnings, we could have come home earlier or not gone out at all. We were completely unaware of the danger. We just thought it was heavy rain and not a major threat to life and property that needed to be taken very seriously.’”
Dave Young, a father and grandfather, was one of four men who died on January 27, 2023. Photo / Supplied
Tregoweth told the inquest he had done a “basic assessment” of the unfolding weather event, “and I did not think [the need] was met”.
Tregoweth said there was “a very high bar” for issuing a mobile alert: “we’re talking about a near-source tsunami, evacuate-now type information”.
There is a checklist for determining when an emergency mobile alert (EMA) should be issued. Tregoweth said the checklist reflected the seriousness of its use.
Tregoweth said this could be an opportunity to review whether the checklist and threshold for issuing an EMA was appropriate.
Julian Snowball, Waikato Civil Defence Emergency Management (Waikato CDEM) group controller, yesterday told the inquest that an EMA that night would have gone out to too wide an area, covering places where warnings were not necessary.
“If we had received weather warnings, we could have come home earlier or not gone out at all,” Dave Young’s widow’s evidence to the inquest is expected to say. Photo / Supplied
Jane Glover, counsel assisting the bereaved families, questioned this, given mobile alerts can be sent to specific areas.
Snowball said sending a mobile alert to a specific geographic area was more difficult in rural areas where there were fewer people and fewer cell towers.
He said the Waikato CDEM knew there was significant flooding happening, “so the potential for an emergency was there in terms of the severity ... but probably the most appropriate thing to do was to close the road [from where Young was swept away], not issue an EMA”.
“An EMA wasn’t sent out because it was not a widespread event [the flooding in Onewhero], but localised,” Snowball said.
“You couldn’t target a specific area in Onewhero without EMA overspill. I am pretty confident it could have gone to Tuakau and Port Waikato.”
He also said he believed the threshold for issuing a mobile alert was not met, and “even with the benefit of hindsight, I don’t think it was met”.
He conceded to Glover that the thresholds for issuing mobile alerts could be reviewed.
However, he questioned whether there would be appetite for it from the public, pointing to frustration from some just under a month ago when emergency alerts were issued early in the morning due to a tsunami threat from a magnitude 8.0 earthquake in Russia.
Snowball also admitted yesterday that there were gaps in the information Waikato CDEM was receiving on the ground from police, St John, and Fire and Emergency New Zealand.
“Decisions are made or not made [by Waikato CDEM] based on...”
RadioNZ - 27 Aug: Australian police have named the two officers who were shot dead in rural Victoria yesterday.
BBCWorld - 27 Aug: Two officers were killed and another injured in a shooting at a rural property in Victoria on Tuesday.
BBCWorld - 27 Aug: Australian police say a “heavily armed” man escaped into the bush after the shooting in Victoria state.
Stuff.co.nz - 24 Aug: Rural NZ is a land of milk, beef and money thanks to strong global demand, but the PM says the Government hasn’t forgotten urban centres which are doing it tough.
Stuff.co.nz - 24 Aug: Property values rose 2.1% in this semi-rural spot, while they dipped across the country.
PC World - 24 Aug: I recently moved to a much more rural area, so getting Starlink set up was one of my top priorities. My area is an internet dead zone where you might get a bit of 4G on a nearby hill, but that’s about it. No cellular for phone calls, and the best I can hope for from a landline connection is 3 Mbps. As a modern man with a modern family full of modern devices, I need fast internet, so I set up Starlink before I’d even put my kids’ beds together.
It worked pretty well, too. At first I heard a bunch of buzzing noises that I was not expecting, but that sort of coil whine is apparently pretty typical. A few minutes later, I was online!
But it wasn’t all smooth sailing. Having Starlink isn’t like having fiber internet, and I ran into several surprises along the way. Here are all the things I wish I’d known before getting Starlink at home.
Starlink is better when it’s mounted
As soon as I had Starlink working, I messaged my friends saying “Space internet installed!” with the following image:
[Photo: Jon Martindale / Foundry]
That’s right. The Starlink dish is propped up in the cardboard box it came in, sitting on some steps leading to the lawn, a spot that was never intended to be its permanent home. It worked well enough for the first night—but that’s as long as I would ever want it to be there.
Turns out, Starlink performs best when the dish is mounted in a location that’s free from obstructions and oriented to maintain a connection with as many Starlink satellites as possible. The Starlink app makes the whole process pretty straightforward, with dynamically adjusting on-screen graphics that help you rotate the dish into its optimal orientation. My performance at ground level was (obviously) bad, so taking the time to get the dish into a better position was worthwhile.
But I’m no handyman. I can build a PC, sure, but hoofing up a ladder and drilling into red brick isn’t something I’m super comfortable doing—so I brought in a local professional TV antenna installer.
Within a couple of hours, he had the dish mounted by my roof. Performance jumped from 50 Mbps to nearly 200 Mbps downstream. A huge improvement with better coverage, less chance of someone just wandering into my yard and stealing the dish, and no chance of my kids riding their bikes over the cable. That’s a win-win-win.
…but Starlink can be ugly when mounted
Personally, I think the Starlink dish looks pretty cool. It’s a unique sight compared to all those rounded satellite dishes you’ve likely seen in urban centers over the last several decades. It’s more modern.
But the makeshift pipe-mount system I used? Eh, that leaves a lot to be desired. There are more attractive first-party mounts you can buy at additional cost, but a giant pipe on an unpainted brace is cheaper. Unfortunately, my wife is even less of a fan.
It’s not like I’m going to be looking at it much up there. But if the exterior aesthetics of your home are important to you, it’s probably worth spending some more time (and money) than I did to get it mounted in a way that gets you great performance while looking good.
Starlink’s upload speed is still lacking
One aspect of fiber internet that’s easy to overlook is that it isn’t just blazingly fast for downloads: upload speeds are often just as fast as your download speeds. That makes quick work of uploading work documents, personal photos, YouTube videos, and more.
As I said at the start, though, Starlink isn’t fiber. I’m getting around 150 Mbps average download speeds with peaks up to 300 Mbps, but my upload speeds are decidedly slower. I’ve seen some people post screenshots of 50 Mbps uploads, but I’ve yet to see mine break 30 Mbps. More often than not, it’s closer to 15 Mbps.
To be fair, 15 Mbps is plenty for sending photos over messaging apps and streaming my webcam during Discord D&D sessions, but the limit is a lot more noticeable when I’m trying to send long videos to friends and family. And I don’t think I’d get far trying to livestream my gaming on Twitch at anything over 1080p with this kind of internet.
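If you’re curious how your own numbers hold up over a day, a few lines of Python will log them for you. This is just a rough sketch of how I’d do it, assuming the third-party speedtest-cli package (pip install speedtest-cli); it isn’t official Starlink tooling, and your results will vary with weather, obstructions, and congestion:

# Minimal sketch: log download/upload/ping over time to a CSV file.
# Assumes the third-party speedtest-cli package (pip install speedtest-cli).
import csv
import time
from datetime import datetime

import speedtest


def measure() -> dict:
    st = speedtest.Speedtest()
    st.get_best_server()          # pick the lowest-latency test server
    st.download()                 # populates st.results.download (bits/s)
    st.upload()                   # populates st.results.upload (bits/s)
    r = st.results
    return {
        "time": datetime.now().isoformat(timespec="seconds"),
        "down_mbps": round(r.download / 1e6, 1),
        "up_mbps": round(r.upload / 1e6, 1),
        "ping_ms": round(r.ping, 1),
    }


if __name__ == "__main__":
    with open("starlink_speeds.csv", "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["time", "down_mbps", "up_mbps", "ping_ms"])
        if f.tell() == 0:
            writer.writeheader()
        for _ in range(4):          # a few samples; schedule via cron/Task Scheduler for longer runs
            writer.writerow(measure())
            f.flush()
            time.sleep(15 * 60)     # every 15 minutes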
Starlink’s router is underwhelming
This might sound like a humblebrag, but the bundled Starlink Gen 3 router—a tri-band Wi-Fi 6 router with a claimed coverage of just over 3,000 square feet—wasn’t enough for my new house. Truth is, my place is about half of that, yet I still had trouble getting signal everywhere due to walls, obstructions, and other sources of interference.
Could I have placed the Starlink router in a better spot for better coverage? Yeah, maybe. And there’s even a mesh system I could’ve employed if I were married to Starlink’s hardware.
But, fortunately, I have a much better TP-Link Archer GE800 Wi-Fi 7 router, so I didn’t need to bother. It’s complete overkill for a civilian gamer like myself, but it does offer fantastic coverage in my wonky-walled home, and I already know my way around it from the past year of faithful operation. (Learn more about why you should get your own router.)
Props to Starlink for making the bridging process super simple, though. I just plugged everything in, switched the Starlink router to bypass mode in the app, gave it a quick reboot, and I was good to go in less than 10 minutes.
There’s no planning for a global outage
Two days after I got my Starlink dish mounted, my service went down. My wife had just left the house and closed the door the very second my PC connection dropped, so I thought it was her fault. Maybe she knocked the mount loose by slamming the door too hard?
But as it turns out, it wasn’t anything so innocuous. In fact, the entire global Starlink network had gone down.
I managed to text a few friends from my board game group to see if they could send me tips on how to get it working again. They sent me screengrabs from Reddit, Twitter, and DownDetector, which confirmed that it wasn’t just my router or my dish. Indeed, all of Starlink was down.
Apparently something like this has happened a few times before, but I also have friends who’ve had Starlink for years and claim there have never been any outages as far as they know. So I’m not expecting this to happen again anytime soon, but tech is tech and it can fail. Even the magic of space internet can stop working from time to time.
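If you want to save yourself the frantic texting next time, a small script can at least tell you whether the problem is on your side of the dish or upstream. This is a generic connectivity check, nothing Starlink-specific; the gateway address and test hosts are assumptions you’d swap for your own network:

# Rough sketch: distinguish "my Wi-Fi/router is down" from "the whole service is down".
# The gateway IP and test hosts below are assumptions -- substitute your own.
import socket

ROUTER_GATEWAY = ("192.168.1.1", 80)                 # typical LAN gateway; check your router's address
PUBLIC_HOSTS = [("1.1.1.1", 53), ("8.8.8.8", 53)]    # well-known public DNS resolvers


def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    lan_ok = reachable(*ROUTER_GATEWAY)
    wan_ok = any(reachable(h, p) for h, p in PUBLIC_HOSTS)
    if not lan_ok:
        print("Can't even reach the router: check cabling, power, or Wi-Fi.")
    elif not wan_ok:
        print("Router is up but the internet isn't: likely an upstream outage.")
    else:
        print("Connectivity looks fine end to end.")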
Your friends will judge you for Starlink
Since November 2024, people around the world have been protesting against Elon Musk and those who support him. Many Tesla owners have added stickers to their vehicles, promising that their Teslas were bought before the CEO went crazy, all to fend off potential attacks.
It hasn’t gotten that bad for Starlink, but I do have to put up with friends who ask if I couldn’t have found another way to get online. Indeed, if I could have, I would have! But while Amazon is working on Project Kuiper, its own low-Earth-orbit network of broadband satellites, that’s still years away from being fully operational and may take even longer to catch up to Starlink. Plus, as far as billionaire CEOs go, it’s more a lateral move than anything to go from Musk to Bezos.
There are other providers with geostationary satellites that might offer the bandwidth, but the latency is poor. Eutelsat might be a legitimate option for me at some point, but not yet. Ultimately, the performance and viability of Starlink trump my own misgivings about supporting a Musk-related company. Until that changes, I’ll have to swallow my pride and the condescension of a few friends.
Starlink: Incredible tech, flawed execution
There’s no denying it: Starlink feels like the kind of Jetsons-era future tech that has always captivated me. It just works, it’s nearly flawless, and it doesn’t have many real competitors. It’s really cool that I get super-fast, low-latency internet in a place that’s otherwise barely online.
But I wish I’d paid more for a better-looking, less-obvious mounting system. I wish I’d had a better backup solution in place just in case it went down. I wish it wasn’t tied to one of the most odious CEOs in the world.
For now, it’s the best solution available and a joy to use. It’s hard not to see how it could be even better, though.
RadioNZ - 23 Aug: While some findings confirm long-held concerns, others don’t go far enough, say rural folk.
PC World - 21 Aug: Modern notebooks with integrated AI hardware are changing the way artificial intelligence is used in everyday life. Instead of relying on external server farms, large language models, image generators, and transcription systems can run directly on the user’s own device.
This is made possible by the combination of powerful CPUs, dedicated graphics processors and, at the center of this development, a Neural Processing Unit (NPU). An NPU is not just an add-on, but a specialized accelerator designed specifically for computing neural networks.
It enables offline AI tools such as GPT4All or Stable Diffusion not only to start, but to respond with high performance, low energy consumption, and consistent response times. Even with complex queries or multimodal tasks, the working speed remains stable. The days when AI was only conceivable as a cloud service are over.
Work where others are offline
As soon as the internet connection is interrupted, classic laptops are left idle. An AI PC, on the other hand, remains operational, whether in airplane mode above the clouds, deep in the dead zones of rural regions, or on an overloaded train network without a stable connection.
In such situations, the structural advantage of locally running AI systems becomes apparent. Jan.ai or GPT4All can be used to create, check, and revise texts, intelligently summarize notes, pre-draft emails, and categorize appointments.
With AnythingLLM, contracts or meeting minutes can be searched for keywords without the documents leaving the device. Creative tasks such as creating illustrations via Stable Diffusion or post-processing images with Photo AI also work, even on devices without a permanent network connection.
Even demanding projects such as programming small tools or the automated generation of shell scripts are possible if the corresponding models are installed. For frequent travelers, project managers, or creative professionals, this creates a genuine option for productive work, completely independent of infrastructure, network availability, or cloud access. An offline AI notebook does not replace a studio, but it does prevent downtime.
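To give a sense of how little is involved, here is a minimal sketch using the gpt4all Python bindings. It assumes the package is installed (pip install gpt4all) and that the named model file, which is only an example, is already present in the local model directory; after that, everything runs offline:

# Minimal offline text helper using the gpt4all Python bindings.
# Assumes `pip install gpt4all`; the model file name is an example and must be
# available locally (GPT4All fetches it on first use, then works fully offline).
from gpt4all import GPT4All

MODEL = "Meta-Llama-3-8B-Instruct.Q4_0.gguf"   # example name; use any GGUF model you have

model = GPT4All(MODEL)          # loads the model from the local model directory

notes = """
- call plumber re: leaking valve
- draft intro for the Q3 report
- book dentist for the kids
"""

with model.chat_session():      # keeps context across turns
    summary = model.generate(
        f"Summarize these notes as three short to-do items:\n{notes}",
        max_tokens=120,
    )
    print(summary)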
Sensitive content remains local
Data sovereignty is increasingly becoming a decisive factor in both personal and professional life. Anyone who processes business reports, develops project ideas, or analyzes medical issues cannot afford any uncertainty about how that data is handled.
Public chatbots such as Gemini, ChatGPT, or Microsoft Copilot are helpful, but are not designed to protect sensitive data from misuse or unwanted analysis.
Local AI solutions, on the other hand, work without transmitting data to the internet. The models used, such as LLaMA, Mistral or DeepSeek, can be executed directly on the device without the content leaving the hardware.
This opens up completely new fields of application, particularly in areas with regulatory requirements such as healthcare, law, or research. AnythingLLM goes one step further: it combines classic chat interaction with a local knowledge base of Office documents, PDFs, and structured data. This turns a language model into an interactive analysis tool for complex amounts of information, locally, offline, and in compliance with data protection regulations.
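AnythingLLM handles all of this behind its own interface. Purely to illustrate the underlying idea, and not AnythingLLM’s actual implementation, the following sketch approximates a local knowledge base with naive keyword retrieval over plain-text files plus the gpt4all bindings; the folder path and model name are assumptions:

# Illustration only: the basic "local knowledge base" idea behind tools like
# AnythingLLM, approximated with keyword scoring over plain-text files and the
# gpt4all bindings. Not AnythingLLM's actual implementation.
from pathlib import Path

from gpt4all import GPT4All

DOC_DIR = Path("docs")                              # folder of .txt exports (assumed path)
MODEL = "Meta-Llama-3-8B-Instruct.Q4_0.gguf"        # example local model file


def top_chunks(question: str, k: int = 3, size: int = 800) -> list[str]:
    """Split every text file into chunks and keep the k with the most keyword overlap."""
    words = set(question.lower().split())
    chunks = []
    for path in DOC_DIR.glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        for i in range(0, len(text), size):
            chunk = text[i:i + size]
            score = sum(w in chunk.lower() for w in words)
            chunks.append((score, chunk))
    return [c for _, c in sorted(chunks, key=lambda t: t[0], reverse=True)[:k]]


def ask(question: str) -> str:
    context = "\n---\n".join(top_chunks(question))
    model = GPT4All(MODEL)
    with model.chat_session():
        return model.generate(
            f"Answer using only this context:\n{context}\n\nQuestion: {question}",
            max_tokens=256,
        )


if __name__ == "__main__":
    print(ask("What notice period does the service contract specify?"))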
NPU notebooks: new architecture, new possibilities
While traditional notebooks quickly reach their thermal or energy limits in AI applications, the new generation of Copilot PCs relies on specialized AI hardware. Models such as the Surface Laptop 6 or the Surface Pro 10 integrate a dedicated NPU directly into the Intel Core Ultra SoC, supplemented by high-performance CPU cores and integrated graphics.
The advantages are evident in typical everyday scenarios. Voice input via Copilot, Gemini, or ChatGPT can be analyzed without delay, image processing with AI tools takes place without cloud rendering, and even multimodal tasks, such as analyzing text, sound, and video simultaneously, run in real time. Microsoft couples the hardware closely with the operating system.
Windows 11 offers native NPU support, for example for Windows Studio Effects, live subtitles, automatic text recognition in images or voice focus in video conferences. The systems are designed so that AI does not function as an add-on, but is an integral part of the overall system as soon as it is switched on, even without an internet connection.
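How an application actually reaches the NPU depends on the vendor stack. One common route is ONNX Runtime, which lets a program request a hardware-specific execution provider and fall back to the CPU if it is missing. The following sketch assumes onnxruntime is installed together with a vendor package that exposes such a provider (for example OpenVINO on Intel Core Ultra machines); the model file name is a placeholder:

# Sketch: running an ONNX model through a hardware execution provider with CPU fallback.
# Assumes onnxruntime plus a vendor package exposing an NPU/GPU provider
# (e.g. onnxruntime-openvino on Intel Core Ultra); "model.onnx" is a placeholder.
import numpy as np
import onnxruntime as ort

PREFERRED = ["OpenVINOExecutionProvider", "QNNExecutionProvider",
             "DmlExecutionProvider", "CPUExecutionProvider"]

available = ort.get_available_providers()
providers = [p for p in PREFERRED if p in available]
print("Using providers:", providers)

session = ort.InferenceSession("model.onnx", providers=providers)

# Feed a dummy input matching the model's first input shape (dynamic dims set to 1).
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.zeros(shape, dtype=np.float32)
outputs = session.run(None, {inp.name: dummy})
print("Output shapes:", [o.shape for o in outputs])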
Productive despite dead zones
The tools for offline AI are now fully developed and stable in everyday use. GPT4All from Nomic AI is particularly suitable for beginners, with a user-friendly interface, uncomplicated model management, and support for numerous LLMs. Ollama is aimed at technically experienced users and offers terminal-based model management with a local API connection, ideal for wiring your own applications or workflows directly to AI support. LM Studio, on the other hand, is characterized by its GUI focus: models from Hugging Face can simply be searched for in the app, downloaded, and activated with a click.
The LM Studio chatbot not only provides access to a large selection of AI models from Huggingface.com, but also allows the AI models to be fine-tuned. There is a separate developer view for this.
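Ollama’s local API, mentioned above, is what makes it easy to wire your own scripts to a model. Here is a minimal sketch, assuming the Ollama server is running on its default port, a model such as Mistral has already been pulled (ollama pull mistral), and the requests package is installed:

# Minimal sketch: calling a locally running Ollama server from a script.
# Assumes `ollama serve` on the default port 11434 and a pulled model ("mistral").
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"


def ask_local_model(prompt: str, model: str = "mistral") -> str:
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask_local_model("Summarize the advantages of running language models offline in two sentences."))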
Jan.ai is particularly versatile. The minimalist interface hides a highly functional architecture with support for multiple models, context-sensitive responses, and elegant interaction.
Local tools are also available in the creative area. With suitable hardware, Stable Diffusion delivers AI-generated images within a few seconds, while applications such as Photo AI automatically improve the quality of screenshots or video frames. A powerful NPU PC turns the mobile device into an autonomous creative studio, even without Wi-Fi, cloud access, or GPU calculation on third-party servers.
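As an illustration of how little glue code local image generation needs, the following sketch uses Hugging Face’s diffusers library rather than any particular desktop app. The model ID is an example that is downloaded once and then served from the local cache, and a reasonably capable GPU (or a patient CPU) is assumed:

# Minimal local image generation with Stable Diffusion via the diffusers library.
# Assumes `pip install diffusers transformers accelerate torch`; the model ID is an
# example -- it is downloaded once, then generation runs from the local cache.
import torch
from diffusers import StableDiffusionPipeline

MODEL_ID = "runwayml/stable-diffusion-v1-5"   # example checkpoint
device = "cuda" if torch.cuda.is_available() else "cpu"

pipe = StableDiffusionPipeline.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
)
pipe = pipe.to(device)

image = pipe(
    "a cozy cabin in a snowy forest at dusk, digital painting",
    num_inference_steps=30,
).images[0]
image.save("cabin.png")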
What counts on the move
The decisive factor for mobile use is not just whether a notebook can run AI, but how confidently it can do so offline. In addition to the CPU and GPU, the NPU plays a central role: it processes AI tasks in real time while conserving battery power and reducing the load on the overall system.
Devices such as the Galaxy Book with an RTX 4050/4070 or the Surface Pro 10 with an Intel Core Ultra 7 CPU demonstrate that even complex language models such as Phi-2, Mistral, or Qwen run locally, with smooth operation and without the typical latencies of cloud services.
Copilot as a system assistant complements this setup, provided the software can access it. When travelling, you can compose emails, structure projects, prepare images, or generate text modules, regardless of the network. Offline AI on NPU notebooks turns the plane, the departure gate, or a remote holiday home into a productive workspace.
Requirements and limitations
The hardware requirements are not trivial, however. Models such as LLaMA 2 or Mistral require several gigabytes of RAM; 16 GB is the practical minimum. Those working with larger prompts or context windows should plan for 32 or 64 GB. Storage needs also grow, as many models occupy between 4 and 20 GB on the SSD.
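Those figures follow from simple arithmetic: parameters times bytes per weight, plus headroom for the context window and the runtime. A back-of-the-envelope sketch (the 20 percent overhead factor is a rough assumption):

# Back-of-the-envelope RAM estimate for running a quantized LLM locally:
# parameters * bytes-per-weight for the weights, plus a rough overhead factor
# (assumed here as 20%) for KV cache, activations, and the runtime itself.
def estimated_ram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return round(weight_bytes * overhead / 1e9, 1)


for name, params, bits in [("Mistral 7B @ 4-bit", 7, 4),
                           ("LLaMA 2 13B @ 4-bit", 13, 4),
                           ("LLaMA 2 13B @ 8-bit", 13, 8)]:
    print(f"{name}: ~{estimated_ram_gb(params, bits)} GB")
# Prints roughly 4.2, 7.8, and 15.6 GB -- consistent with the 4-20 GB per model
# and the 16 GB RAM minimum mentioned above.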
NPUs take care of inference, but depending on the tool, additional GPU support may be necessary, for example for image generation with Stable Diffusion.
Integration into the operating system is also important. Copilot PCs ensure deep integration between hardware, AI libraries, and system functions. Anyone working with older hardware will have to accept limitations.
The model quality also varies. Local LLMs do not yet consistently reach the level of GPT-4, but they are more controllable, more readily available and more data protection-friendly. They are the more robust solution for many applications, especially when travelling.
Offline AI under Linux: openness meets control
Offline AI also unfolds its potential on Linux systems, often with even greater flexibility. Tools such as Ollama, GPT4All, or LM Studio offer native support for Ubuntu, Fedora, and Arch-based distributions and can be installed directly from the terminal or as a Flatpak. The integration of open models such as Mistral, DeepSeek, or LLaMA works smoothly, as many projects rely on open-source frameworks such as GGML or llama.cpp.
Browser interface for Ollama: Open-Web-UI is quickly set up as a Python program or in a Docker container and provides a user interface. (Image: IDG)
Anyone working with Docker or Conda environments can build customized model setups, activate GPU support, or fine-tune inference parameters. This opens up a range of scenarios, especially for developers: scripting, data analysis, code completion, or testing your own prompt structures.
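As a concrete example of that kind of developer scripting, a few lines against Ollama’s local API (shown earlier) turn a local model into a rough code-completion helper; the model name below is an example that would first need to be pulled, for instance with ollama pull codellama:

# Sketch: a tiny offline code-completion helper built on Ollama's local API.
# Assumes `ollama serve` is running and a code-capable model is available locally
# (the name "codellama" is an example).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"


def complete_code(snippet: str, model: str = "codellama") -> str:
    prompt = (
        "Complete the following Python function. "
        "Return only code, no explanations.\n\n" + snippet
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    stub = 'def median(values: list[float]) -> float:\n    """Return the median of a non-empty list."""\n'
    print(complete_code(stub))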
In conjunction with tiling desktops, reduced background processes, and optimized energy management, the Linux notebook becomes a self-sufficient AI platform, with no vendor lock-in and maximum control over every file and every computing operation.
Offline instead of at the cloud’s mercy
Offline AI on NPU notebooks is not a stopgap measure, but a paradigm shift. It offers independence, data protection, and responsiveness, even in environments without a network. Thanks to specialized chips, optimized software, and well thought-out integration in Windows 11 and the latest Linux kernel, new freedom is created for data-secure analyses, mobile creative processes, or productive work beyond the cloud.
The prerequisite for this is an AI PC that not only provides the necessary performance, but also anchors AI at the system level. Anyone who relies on dependable intelligence on the move should no longer pin their hopes on the cloud, but choose a notebook that makes it superfluous.
Top Stories

RUGBY
Injury problems for Ireland ahead of their test against the All Blacks in Chicago on November 2: Mack Hansen has been ruled out with a foot problem, and fellow backs Bundee Aki and Robbie Henshaw are in doubt.

BUSINESS
One of our largest electricity companies says lines charges are the biggest factor driving up power bills.