Merge pull request #1238 from gboeing/release
Release v2.0.0
gboeing authored Nov 24, 2024
2 parents 3927440 + 67d2ab0 commit dfa64e2
Showing 19 changed files with 57 additions and 57 deletions.
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -23,19 +23,19 @@ repos:
- id: trailing-whitespace

- repo: https://github.com/pre-commit/mirrors-prettier
rev: "v3.1.0"
rev: "v4.0.0-alpha.8"
hooks:
- id: prettier
types_or: [markdown, yaml]

- repo: https://github.com/igorshubovych/markdownlint-cli
rev: "v0.42.0"
rev: "v0.43.0"
hooks:
- id: markdownlint
args: [--disable=MD013]

- repo: https://github.com/astral-sh/ruff-pre-commit
rev: "v0.7.3"
rev: "v0.8.0"
hooks:
- id: ruff
args: [--fix]
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -1,6 +1,6 @@
# Changelog

- ## 2.0.0 (in development)
+ ## 2.0.0 (2024-11-24)

Read the v2 [migration guide](https://github.com/gboeing/osmnx/issues/1123)

2 changes: 1 addition & 1 deletion LICENSE.txt
@@ -1,6 +1,6 @@
The MIT License (MIT)

- Copyright (c) 2016-2024 Geoff Boeing https://geoffboeing.com/
+ Copyright (c) 2016-2025 Geoff Boeing https://geoffboeing.com/

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
2 changes: 1 addition & 1 deletion README.md
@@ -9,7 +9,7 @@

**OSMnx** is a Python package to easily download, model, analyze, and visualize street networks and other geospatial features from OpenStreetMap. You can download and model walking, driving, or biking networks with a single line of code then analyze and visualize them. You can just as easily work with urban amenities/points of interest, building footprints, transit stops, elevation data, street orientations, speed/travel time, and routing.

- OSMnx 2.0 is in beta: read the [migration guide](https://github.com/gboeing/osmnx/issues/1123).
+ OSMnx 2.0 is released: read the [migration guide](https://github.com/gboeing/osmnx/issues/1123).

## Citation

4 changes: 2 additions & 2 deletions docs/source/conf.py
@@ -12,7 +12,7 @@

# project info
author = "Geoff Boeing"
copyright = "2016-2024, Geoff Boeing" # noqa: A001
copyright = "2016-2025, Geoff Boeing" # noqa: A001
project = "OSMnx"

# go up two levels from current working dir (/docs/source) to package root
@@ -36,10 +36,10 @@
"matplotlib",
"networkx",
"numpy",
"osgeo",
"pandas",
"rasterio",
"requests",
"rio-vrt",
"scipy",
"shapely",
"sklearn",
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -3,7 +3,7 @@ OSMnx |version|

**OSMnx** is a Python package to easily download, model, analyze, and visualize street networks and other geospatial features from OpenStreetMap. You can download and model walking, driving, or biking networks with a single line of code then analyze and visualize them. You can just as easily work with urban amenities/points of interest, building footprints, transit stops, elevation data, street orientations, speed/travel time, and routing.

- OSMnx 2.0 is in beta: read the `migration guide`_.
+ OSMnx 2.0 is released: read the `migration guide`_.

.. _migration guide: https://github.com/gboeing/osmnx/issues/1123

2 changes: 1 addition & 1 deletion docs/source/user-reference.rst
@@ -3,7 +3,7 @@ User Reference

This is the User Reference for the OSMnx package. If you are looking for an introduction to OSMnx, read the :doc:`getting-started` guide. This guide describes the usage of OSMnx's public API.

- OSMnx 2.0 is in beta: read the `migration guide`_.
+ OSMnx 2.0 is released: read the `migration guide`_.

.. _migration guide: https://github.com/gboeing/osmnx/issues/1123

14 changes: 7 additions & 7 deletions osmnx/_http.py
@@ -233,7 +233,7 @@ def _config_dns(url: str) -> None:
"""
Force socket.getaddrinfo to use IP address instead of hostname.
- Resolves the URL's domain to an IP address so that we use the same server
+ Resolves URL's hostname to an IP address so that we use the same server
for both 1) checking the necessary pause duration and 2) sending the query
itself even if there is round-robin redirecting among multiple server
machines on the server-side. Mutates the getaddrinfo function so it uses
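The docstring above describes the DNS-pinning trick. A minimal sketch of that idea (not OSMnx's exact implementation; the helper name `_pin_hostname_to_ip` is hypothetical) resolves the hostname once and monkeypatches `socket.getaddrinfo` so later lookups of that hostname reuse the same IP:

```python
import socket

def _pin_hostname_to_ip(hostname: str) -> None:
    # resolve the hostname once, then patch getaddrinfo so every later
    # lookup of that hostname is answered with the same IP address
    ip = socket.gethostbyname(hostname)
    original_getaddrinfo = socket.getaddrinfo

    def _getaddrinfo(host, *args, **kwargs):
        return original_getaddrinfo(ip if host == hostname else host, *args, **kwargs)

    socket.getaddrinfo = _getaddrinfo
```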
@@ -309,30 +309,30 @@ def _parse_response(response: requests.Response) -> dict[str, Any] | list[dict[str, Any]]:
Value will be a dict if the response is from the Google or Overpass
APIs, and a list if the response is from the Nominatim API.
"""
- # log the response size and domain
- domain = _hostname_from_url(response.url)
+ # log the response size and hostname
+ hostname = _hostname_from_url(response.url)
size_kb = len(response.content) / 1000
msg = f"Downloaded {size_kb:,.1f}kB from {domain!r} with status {response.status_code}"
msg = f"Downloaded {size_kb:,.1f}kB from {hostname!r} with status {response.status_code}"
utils.log(msg, level=lg.INFO)

# parse the response to JSON and log/raise exceptions
try:
response_json: dict[str, Any] | list[dict[str, Any]] = response.json()
except JSONDecodeError as e: # pragma: no cover
msg = f"{domain!r} responded: {response.status_code} {response.reason} {response.text}"
msg = f"{hostname!r} responded: {response.status_code} {response.reason} {response.text}"
utils.log(msg, level=lg.ERROR)
if response.ok:
raise InsufficientResponseError(msg) from e
raise ResponseStatusCodeError(msg) from e

# log any remarks if they exist
if isinstance(response_json, dict) and "remark" in response_json: # pragma: no cover
msg = f"{domain!r} remarked: {response_json['remark']!r}"
msg = f"{hostname!r} remarked: {response_json['remark']!r}"
utils.log(msg, level=lg.WARNING)

# log if the response status_code is not OK
if not response.ok:
msg = f"{domain!r} returned HTTP status code {response.status_code}"
msg = f"{hostname!r} returned HTTP status code {response.status_code}"
utils.log(msg, level=lg.WARNING)

return response_json
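This hunk only renames `domain` to `hostname` for accuracy. For context, a hedged sketch of what a helper like `_http._hostname_from_url` might do (illustrative, not copied from OSMnx):

```python
from urllib.parse import urlparse

def hostname_from_url(url: str) -> str:
    # netloc can include a port (e.g. "overpass-api.de:443"); keep only the host part
    return urlparse(url).netloc.split(":")[0]

print(hostname_from_url("https://overpass-api.de/api/interpreter"))  # overpass-api.de
```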
6 changes: 3 additions & 3 deletions osmnx/_nominatim.py
@@ -119,8 +119,8 @@ def _nominatim_request(
return cached_response_json

# pause then request this URL
- domain = _http._hostname_from_url(url)
- msg = f"Pausing {pause} second(s) before making HTTP GET request to {domain!r}"
+ hostname = _http._hostname_from_url(url)
+ msg = f"Pausing {pause} second(s) before making HTTP GET request to {hostname!r}"
utils.log(msg, level=lg.INFO)
time.sleep(pause)

@@ -138,7 +138,7 @@
# handle 429 and 504 errors by pausing then recursively re-trying request
if response.status_code in {429, 504}: # pragma: no cover
msg = (
f"{domain!r} responded {response.status_code} {response.reason}: "
f"{hostname!r} responded {response.status_code} {response.reason}: "
f"we'll retry in {error_pause} secs"
)
utils.log(msg, level=lg.WARNING)
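The comment above notes that 429 and 504 responses trigger a pause and a retry. A minimal sketch of that pause-and-retry pattern (written iteratively here, whereas the real function retries recursively; names are hypothetical):

```python
import time

def request_with_retry(send_request, url, error_pause=60.0, max_tries=3):
    # retry on rate-limit (429) or gateway-timeout (504) responses,
    # sleeping between attempts; give up after max_tries attempts
    response = send_request(url)
    for _ in range(max_tries - 1):
        if response.status_code not in {429, 504}:
            break
        time.sleep(error_pause)
        response = send_request(url)
    return response
```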
2 changes: 1 addition & 1 deletion osmnx/_osm_xml.py
@@ -445,7 +445,7 @@ def _sort_nodes(G: nx.MultiDiGraph, osmid: int) -> list[int]:
G_.remove_edges_from(nx.find_cycle(G_))
G_ = truncate.largest_component(G_)
ordered_nodes = _sort_nodes(G_, osmid)
msg = f"Had to remove a cycle from way {str(osmid)!r} for topological sort"
msg = f"Had to remove a cycle from way {osmid!r} for topological sort"
utils.log(msg, level=lg.WARNING)

return ordered_nodes
10 changes: 5 additions & 5 deletions osmnx/_overpass.py
@@ -11,7 +11,7 @@

import numpy as np
import requests
- from requests.exceptions import ConnectionError
+ from requests.exceptions import ConnectionError as RequestsConnectionError

from . import _http
from . import projection
@@ -179,7 +179,7 @@ def _get_overpass_pause(
**settings.requests_kwargs,
)
response_text = response.text
- except ConnectionError as e: # pragma: no cover
+ except RequestsConnectionError as e: # pragma: no cover
# cannot reach status endpoint: log error and return default duration
msg = f"Unable to query {url}, {e}"
utils.log(msg, level=lg.ERROR)
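Importing requests' exception under an alias avoids shadowing Python's built-in ConnectionError inside the module. A small sketch of the distinction (assumes the requests package is installed; the endpoint string is the same invalid one used by the test later in this commit):

```python
import requests
from requests.exceptions import ConnectionError as RequestsConnectionError

try:
    requests.get("http://NOT_A_VALID_ENDPOINT/api/", timeout=1)
except RequestsConnectionError:
    # catches requests' network failure without rebinding the name of
    # the built-in ConnectionError (an OSError subclass)
    print("could not reach the server")
```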
@@ -462,8 +462,8 @@ def _overpass_request(
# pause then request this URL
if pause is None:
this_pause = _get_overpass_pause(settings.overpass_url)
- domain = _http._hostname_from_url(url)
- msg = f"Pausing {this_pause} second(s) before making HTTP POST request to {domain!r}"
+ hostname = _http._hostname_from_url(url)
+ msg = f"Pausing {this_pause} second(s) before making HTTP POST request to {hostname!r}"
utils.log(msg, level=lg.INFO)
time.sleep(this_pause)

@@ -482,7 +482,7 @@
if response.status_code in {429, 504}: # pragma: no cover
this_pause = error_pause + _get_overpass_pause(settings.overpass_url)
msg = (
f"{domain!r} responded {response.status_code} {response.reason}: "
f"{hostname!r} responded {response.status_code} {response.reason}: "
f"we'll retry in {this_pause} secs"
)
utils.log(msg, level=lg.WARNING)
2 changes: 1 addition & 1 deletion osmnx/_version.py
@@ -1,3 +1,3 @@
"""OSMnx package version information."""

__version__ = "2.0.0rc2"
__version__ = "2.0.0"
6 changes: 3 additions & 3 deletions osmnx/convert.py
@@ -171,10 +171,10 @@ def graph_to_gdfs(
u, v, k, data = zip(*G.edges(keys=True, data=True))

if fill_edge_geometry:
- coords = {n: (G.nodes[n]["x"], G.nodes[n]["y"]) for n in G}
+ node_coords = {n: (G.nodes[n]["x"], G.nodes[n]["y"]) for n in G}
edge_geoms = (
edata.get("geometry", LineString((coords[u], coords[v])))
for u, v, _, edata in G.edges(keys=True, data=True)
d.get("geometry", LineString((node_coords[u], node_coords[v])))
for u, v, _, d in G.edges(keys=True, data=True)
)
gdf_edges = gpd.GeoDataFrame(data, crs=crs, geometry=list(edge_geoms))

16 changes: 8 additions & 8 deletions osmnx/elevation.py
@@ -133,7 +133,7 @@ def _build_vrt_file(raster_paths: Iterable[str | Path]) -> Path:

# build the VRT file if it doesn't already exist in the cache
if not vrt_path.is_file():
msg = f"Building VRT for {len(raster_paths):,} rasters at {vrt_path!r}..."
msg = f"Building VRT for {len(raster_paths):,} rasters at {str(vrt_path)!r}..."
utils.log(msg, level=lg.INFO)
vrt_path.parent.mkdir(parents=True, exist_ok=True)
build_vrt(vrt_path, raster_paths)
@@ -193,7 +193,7 @@ def add_node_elevations_raster(
# divide nodes into equal-sized chunks for multiprocessing
size = int(np.ceil(len(nodes) / cpus))
args = ((nodes.iloc[i : i + size], filepath, band) for i in range(0, len(nodes), size))
- with mp.get_context(mp.get_start_method()).Pool(cpus) as pool:
+ with mp.get_context().Pool(cpus) as pool:
results = pool.starmap_async(_query_raster, args).get()
elevs = {k: v for kv in results for k, v in kv}

@@ -247,9 +247,9 @@ def add_node_elevations_google(
# round coordinates to 6 decimal places (approx 5 to 10 cm resolution)
node_points = pd.Series({n: f"{d['y']:.6f},{d['x']:.6f}" for n, d in G.nodes(data=True)})
n_calls = int(np.ceil(len(node_points) / batch_size))
- domain = _http._hostname_from_url(settings.elevation_url_template)
+ hostname = _http._hostname_from_url(settings.elevation_url_template)

msg = f"Requesting node elevations from {domain!r} in {n_calls} request(s)"
msg = f"Requesting node elevations from {hostname!r} in {n_calls} request(s)"
utils.log(msg, level=lg.INFO)

# break the series of coordinates into chunks of batch_size
@@ -268,7 +268,7 @@
raise InsufficientResponseError(str(response_json))

# sanity check that all our vectors have the same number of elements
msg = f"Graph has {len(G):,} nodes and we received {len(results):,} results from {domain!r}"
msg = f"Graph has {len(G):,} nodes and we received {len(results):,} results from {hostname!r}"
utils.log(msg, level=lg.INFO)
if not (len(results) == len(G) == len(node_points)): # pragma: no cover
err_msg = f"{msg}\n{response_json}"
@@ -278,7 +278,7 @@
df_elev = pd.DataFrame(node_points, columns=["node_points"])
df_elev["elevation"] = [result["elevation"] for result in results]
nx.set_node_attributes(G, name="elevation", values=df_elev["elevation"].to_dict())
msg = f"Added elevation data from {domain!r} to all nodes."
msg = f"Added elevation data from {hostname!r} to all nodes."
utils.log(msg, level=lg.INFO)

return G
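A hedged usage sketch mirroring the test later in this commit (the Open Topo Data endpoint template is taken from that test; it needs network access, and no API key is passed here):

```python
import osmnx as ox

ox.settings.elevation_url_template = (
    "https://api.opentopodata.org/v1/aster30m?locations={locations}&key={key}"
)
G = ox.graph_from_place("Piedmont, California, USA", network_type="drive")
G = ox.elevation.add_node_elevations_google(G, batch_size=100, pause=1)
G = ox.elevation.add_edge_grades(G)
```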
@@ -305,8 +305,8 @@ def _elevation_request(url: str, pause: float) -> dict[str, Any]:
return cached_response_json

# pause then request this URL
- domain = _http._hostname_from_url(url)
- msg = f"Pausing {pause} second(s) before making HTTP GET request to {domain!r}"
+ hostname = _http._hostname_from_url(url)
+ msg = f"Pausing {pause} second(s) before making HTTP GET request to {hostname!r}"
utils.log(msg, level=lg.INFO)
time.sleep(pause)

6 changes: 3 additions & 3 deletions osmnx/io.py
@@ -68,7 +68,7 @@ def save_graph_geopackage(
gdf_nodes.to_file(filepath, layer="nodes", driver="GPKG", index=True, encoding=encoding)
gdf_edges.to_file(filepath, layer="edges", driver="GPKG", index=True, encoding=encoding)

msg = f"Saved graph as GeoPackage at {filepath!r}"
msg = f"Saved graph as GeoPackage at {str(filepath)!r}"
utils.log(msg, level=lg.INFO)


@@ -129,7 +129,7 @@ def save_graphml(
data[attr] = str(value)

nx.write_graphml(G, path=filepath, encoding=encoding)
msg = f"Saved graph as GraphML file at {filepath!r}"
msg = f"Saved graph as GraphML file at {str(filepath)!r}"
utils.log(msg, level=lg.INFO)


@@ -239,7 +239,7 @@ def load_graphml(
G = _convert_node_attr_types(G, default_node_dtypes)
G = _convert_edge_attr_types(G, default_edge_dtypes)

msg = f"Loaded graph with {len(G)} nodes and {len(G.edges)} edges from {source!r}"
msg = f"Loaded graph with {len(G)} nodes and {len(G.edges)} edges from {str(source)!r}"
utils.log(msg, level=lg.INFO)
return G
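The recurring `str(filepath)!r` change here, in elevation.py, and in plot.py only affects how paths render in log messages: `repr()` of a `pathlib.Path` includes the class name, while `repr()` of its string form is just the quoted path. A quick illustration with an arbitrary example path:

```python
from pathlib import Path

filepath = Path("data/graph.graphml")
print(f"Saved graph at {filepath!r}")       # e.g. Saved graph at PosixPath('data/graph.graphml')
print(f"Saved graph at {str(filepath)!r}")  # Saved graph at 'data/graph.graphml'
```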

8 changes: 4 additions & 4 deletions osmnx/plot.py
@@ -31,9 +31,9 @@
from matplotlib import cm
from matplotlib import colormaps
from matplotlib import colors
- from matplotlib.axes._axes import Axes # noqa: TCH002
- from matplotlib.figure import Figure # noqa: TCH002
- from matplotlib.projections.polar import PolarAxes # noqa: TCH002
+ from matplotlib.axes._axes import Axes # noqa: TC002
+ from matplotlib.figure import Figure # noqa: TC002
+ from matplotlib.projections.polar import PolarAxes # noqa: TC002

mpl_available = True

@@ -934,7 +934,7 @@ def _save_and_show(
fig.savefig(fp, dpi=dpi, bbox_inches=extent, format=ext, facecolor=fc, transparent=True)
fig.set_frameon(False) # and turn it back off again

msg = f"Saved figure to disk at {fp!r}"
msg = f"Saved figure to disk at {str(fp)!r}"
utils.log(msg, level=lg.INFO)

if show:
2 changes: 1 addition & 1 deletion osmnx/routing.py
@@ -366,7 +366,7 @@ def shortest_path(
# if multi-threading, calculate shortest paths in parallel
else:
args = ((G, o, d, weight) for o, d in zip(orig, dest))
- with mp.get_context(mp.get_start_method()).Pool(cpus) as pool:
+ with mp.get_context().Pool(cpus) as pool:
paths = pool.starmap_async(_single_shortest_path, args).get()

return paths
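A hedged usage sketch of the public API that reaches this multi-threading branch (small demo values; `cpus=2` enables the multiprocessing path and the place name is only an example):

```python
import osmnx as ox

G = ox.graph_from_place("Piedmont, California, USA", network_type="drive")
origs = list(G.nodes)[:4]
dests = list(G.nodes)[-4:]
# solve several origin-destination pairs in parallel
routes = ox.routing.shortest_path(G, origs, dests, weight="length", cpus=2)
```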
14 changes: 7 additions & 7 deletions osmnx/simplification.py
@@ -392,17 +392,17 @@ def simplify_graph( # noqa: C901, PLR0912
path_attributes[attr] = [edge_data[attr]]

# consolidate the path's edge segments' attribute values
- for attr in path_attributes:
- if attr in edge_attr_aggs:
+ for attr_name, attr_values in path_attributes.items():
+ if attr_name in edge_attr_aggs:
# if this attribute's values must be aggregated, do so now
- agg_func = edge_attr_aggs[attr]
- path_attributes[attr] = agg_func(path_attributes[attr])
- elif len(set(path_attributes[attr])) == 1:
+ agg_func = edge_attr_aggs[attr_name]
+ path_attributes[attr_name] = agg_func(attr_values)
+ elif len(set(attr_values)) == 1:
# if there's only 1 unique value, keep that single value
- path_attributes[attr] = path_attributes[attr][0]
+ path_attributes[attr_name] = attr_values[0]
else:
# otherwise, if there are multiple uniques, keep one of each
- path_attributes[attr] = list(set(path_attributes[attr]))
+ path_attributes[attr_name] = list(set(attr_values))

# construct the new consolidated edge's geometry for this path
path_attributes["geometry"] = LineString(
8 changes: 4 additions & 4 deletions tests/test_osmnx.py
@@ -23,7 +23,7 @@
import pandas as pd
import pytest
from lxml import etree
- from requests.exceptions import ConnectionError
+ from requests.exceptions import ConnectionError as RequestsConnectionError
from shapely import Point
from shapely import Polygon
from shapely import wkt
@@ -282,10 +282,10 @@ def test_elevation() -> None:
ox.settings.elevation_url_template = (
"https://api.opentopodata.org/v1/aster30m?locations={locations}&key={key}"
)
- _ = ox.elevation.add_node_elevations_google(G, batch_size=100, pause=0.01)
+ _ = ox.elevation.add_node_elevations_google(G, batch_size=100, pause=0.5)

# same thing again, to hit the cache
- _ = ox.elevation.add_node_elevations_google(G, batch_size=100, pause=0.01)
+ _ = ox.elevation.add_node_elevations_google(G, batch_size=100, pause=0)

# add node elevations from a single raster file (some nodes will be null)
rasters = list(Path("tests/input_data").glob("elevation*.tif"))
@@ -462,7 +462,7 @@ def test_endpoints() -> None:
# This should fail because we didn't provide a valid endpoint
ox.settings.overpass_rate_limit = False
ox.settings.overpass_url = "http://NOT_A_VALID_ENDPOINT/api/"
- with pytest.raises(ConnectionError, match="Max retries exceeded with url"):
+ with pytest.raises(RequestsConnectionError, match="Max retries exceeded with url"):
G = ox.graph_from_place(place1, network_type="all")

ox.settings.overpass_rate_limit = default_overpass_rate_limit