Merged
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -6,10 +6,13 @@
- Only versions 2.0.0 and newer can be upgraded to this version. For older versions, please upgrade to 2.0.0 first.
### Migrations and checks
#### Schema migrations
- [#1185](https://github.com/LayerManager/layman/issues/1185) Add new text column `file_path` in `publications` table in prime DB schema. Add constraint that `file_path` can be non-null only when `geodata_type` is `raster`.
#### Data migrations
### Changes
- [#1168](https://github.com/LayerManager/layman/issues/1168) Extend [PATCH Workspace Layer](doc/rest.md#patch-workspace-layer) with the ability to append data to an existing time-series layer.
- When publishing a layer or map to Micka via CSW, Layman sends the creating user (Layman username) in the SOAP request header (`CreateUser`), so the metadata record in Micka is associated with the user who created the publication.
- [#1185](https://github.com/LayerManager/layman/issues/1185) [POST Workspace Layers](doc/rest.md#post-workspace-layers) supports import of raster layers from an existing server-side directory via the `file_path` parameter, including ImageMosaic time-series layers.
- [#1185](https://github.com/LayerManager/layman/issues/1185) [GET Workspace Layer](doc/rest.md#get-workspace-layer) returns the `file_path` key for raster layers published using this parameter.

## v2.3.0
2025-12-02
24 changes: 21 additions & 3 deletions doc/rest.md
@@ -107,6 +107,7 @@ Processing chain consists of few steps:
- for vector layers import vector file (if sent) to PostgreSQL database as new table into workspace schema
- files with invalid byte sequence are first converted to GeoJSON, then cleaned with iconv, and finally imported to database.
- for raster layers normalize and compress raster file to GeoTIFF with overviews (pyramids); NoData values are normalized as transparent
- if the `file_path` parameter is used, raster files are used directly from the GeoServer data directory without normalization
- save bounding box into PostgreSQL
- for vector layers publish the vector table as new layer (feature type) within appropriate WFS workspace of GeoServer
- for vector layers
@@ -115,7 +116,7 @@ Processing chain consists of few steps:
- for layers with QML style:
- create QGS file on QGIS server filesystem with appropriate style
- publish the layer on GeoServer through WMS cascade from QGIS server
- for raster layers publish normalized GeoTIFF as new layer (coverage) on GeoServer WMS workspace
- for raster layers publish GeoTIFF as new layer (coverage) on GeoServer WMS workspace (normalized if the `file` parameter is used, original if the `file_path` parameter is used)
- generate thumbnail image
- publish metadata record to Micka (it's public if and only if read access is set to EVERYONE; the creating user is sent as CreateUser in the CSW SOAP request)
- update thumbnail of each [map](models.md#map) that points to this layer
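The raster branch of the chain above can be sketched as follows. This is an illustrative sketch only: `publish_raster_layer` and `normalize_to_geotiff` are hypothetical placeholders, not Layman's actual functions; the point is the decision between normalized uploads and as-is `file_path` directories.

```python
def normalize_to_geotiff(filename):
    # Placeholder for the normalization step (GeoTIFF compression, overviews,
    # transparent NoData); real Layman runs GDAL here.
    return f'normalized/{filename}'


def publish_raster_layer(params):
    # Uploads via `file` are normalized first; a `file_path` directory
    # is used as-is from the GeoServer data directory.
    if params.get('file_path'):
        return {'source': params['file_path'], 'normalized': False}
    return {'source': normalize_to_geotiff(params['file']), 'normalized': True}
```

Either way, the resulting GeoTIFF(s) are then published as a coverage on the GeoServer WMS workspace.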
@@ -140,7 +141,7 @@ Body parameters:
- used if specified, otherwise generated
- it's meant mostly for testing purposes
- *file*, file(s) or file name(s)
- exactly one of `file` or `external_table_uri` must be set
- exactly one of `file`, `file_path`, or `external_table_uri` must be set
- one of following options is expected:
- GeoJSON file
- ShapeFile files (at least three files: .shp, .shx, .dbf)
@@ -162,8 +163,24 @@ Body parameters:
- if published file has empty bounding box (i.e. no features), its bounding box on WMS/WFS endpoint is set to the whole World
- attribute names are [laundered](https://gdal.org/en/stable/drivers/vector/pg.html#layer-creation-options) to be safely stored in DB
- if QML style is used in this request, it must list all attributes contained in given data file
- *file_path*, string
- exactly one of `file`, `file_path`, or `external_table_uri` must be set
- relative path to a directory that already exists on the server
- the path must be relative to the root of the GeoServer data directory
- the referenced directory must be physically located inside the GeoServer data directory
- the directory must contain at least one GeoTIFF file (with extension `.tif` or `.tiff`)

- supported only for raster layers with GeoTIFF files (`.tif` or `.tiff` extension)
- raster files are not normalized when the `file_path` parameter is used
- this may result in different styling behavior compared to layers published via the `file` parameter
- may point to a directory containing a single raster file (published as a single coverage)
- may point to a directory containing multiple raster files:
- if the `time_regex` parameter is provided, the files are treated as a time series and published as an ImageMosaic
- if the `time_regex` parameter is not provided and the directory contains multiple raster files, an error is raised

- *external_table_uri*, string
- exactly one of `file` or `external_table_uri` must be set
- exactly one of `file`, `file_path`, or `external_table_uri` must be set
- [connection URI](https://www.postgresql.org/docs/15/libpq-connect.html#id-1.7.3.8.3.6) is required, usual format is `postgresql://<username>:<password>@<host>:<port>/<dbname>?schema=<schema_name>&table=<table_name>&geo_column=<geo_column_name>`
- `host` part and query parameters `schema` and `table` are mandatory
- URI scheme is required to be `postgresql`
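As an illustration, a `file_path` request body for POST Workspace Layers can be assembled like this. The base URL, workspace, and layer name are hypothetical; the form fields are the parameters documented above (`time_regex` is included because the example directory is assumed to hold multiple rasters).

```python
from urllib.parse import urlencode

# Hypothetical values; adjust the base URL and workspace to your deployment.
LAYMAN_URL = 'https://layman.example.com/rest'
WORKSPACE = 'my_workspace'

url = f'{LAYMAN_URL}/workspaces/{WORKSPACE}/layers'
form = {
    'name': 'ortofoto_timeseries',
    # directory with .tif/.tiff files, relative to the GeoServer data directory root
    'file_path': 'rasters/ortofoto',
    # required when the directory holds multiple rasters (published as ImageMosaic)
    'time_regex': '[0-9]{8}',
}
body = urlencode(form)
print(url)
print(body)
```

The encoded body can then be sent as an `application/x-www-form-urlencoded` POST with any HTTP client.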
@@ -329,6 +346,7 @@ JSON object with following structure:
- *status*: Status information about publishing style. See [GET Workspace Layer](#get-workspace-layer) **wms** property for meaning.
- *error*: If status is FAILURE, this may contain error object.
- **original_data_source**: String. Either `file` if layer was published from file, or `database_table` if layer was published from external database table
- **file_path**: String. Available only for raster layers published using the `file_path` parameter. Path to the directory containing the raster files, relative to the root of the GeoServer data directory.
- *metadata*
- *identifier*: String. Identifier of metadata record in CSW instance.
- *record_url*: String. URL of metadata record accessible by web browser, probably with some editing capabilities.
16 changes: 11 additions & 5 deletions src/layman/common/prime_db_schema/publications.py
@@ -173,6 +173,7 @@ def get_publication_infos_with_metainfo(workspace_name=None, pub_type=None, *,
ST_YMAX(p.bbox) as ymax,
p.srid as srid,
PGP_SYM_DECRYPT(p.external_table_uri, p.uuid::text)::json external_table_uri,
p.file_path,
(select rtrim(concat(case when u.id is not null then w.name || ',' end,
string_agg(COALESCE(w2.name, r.role_name), ',' ORDER BY COALESCE(w2.name, r.role_name)) || ',',
case when p.everyone_can_read then %s || ',' end
@@ -308,10 +309,11 @@ def get_publication_infos_with_metainfo(workspace_name=None, pub_type=None, *,
'used_in_maps': layer_maps or [],
'_wfs_wms_status': settings.EnumWfsWmsStatus(wfs_wms_status) if wfs_wms_status else None,
'_is_public_workspace': is_public_workspace,
'file_path': file_path,
}
for id_publication, workspace_name, publication_type, publication_name, title, description, uuid,
geodata_type, style_type, image_mosaic, updated_at, created_at, xmin, ymin, xmax, ymax,
srid, external_table_uri, read_users_roles, write_users_roles, map_layers, layer_maps, wfs_wms_status, is_public_workspace, _
srid, external_table_uri, file_path, read_users_roles, write_users_roles, map_layers, layer_maps, wfs_wms_status, is_public_workspace, _
in values}

infos = {key: {**value,
@@ -501,8 +503,8 @@ def insert_publication(workspace_name, info):
check_publication_info(workspace_name, info)

insert_publications_sql = f'''insert into {DB_SCHEMA}.publications as p
(id_workspace, name, title, description, type, uuid, style_type, geodata_type, everyone_can_read, everyone_can_write, updated_at, image_mosaic, external_table_uri, wfs_wms_status) values
(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, current_timestamp, %s, PGP_SYM_ENCRYPT(%s::text, %s::text), %s )
(id_workspace, name, title, description, type, uuid, style_type, geodata_type, everyone_can_read, everyone_can_write, updated_at, image_mosaic, external_table_uri, file_path, wfs_wms_status) values
(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, current_timestamp, %s, PGP_SYM_ENCRYPT(%s::text, %s::text), %s, %s )
returning id
;'''

@@ -527,6 +529,7 @@ def insert_publication(workspace_name, info):
info.get("image_mosaic"),
external_table_uri,
info.get("uuid"),
info.get("file_path"),
info.get("wfs_wms_status")
)
pub_id = db_util.run_query(insert_publications_sql, data)[0][0]
@@ -597,6 +600,7 @@ def update_publication(workspace_name, info, is_part_of_user_delete=False):
updated_at = current_timestamp,
image_mosaic = coalesce(%s, image_mosaic),
external_table_uri = PGP_SYM_ENCRYPT(%s::text, uuid::text),
file_path = coalesce(%s, file_path),
geodata_type = coalesce(%s, geodata_type)
where id_workspace = %s
and name = %s
Expand All @@ -611,6 +615,7 @@ def update_publication(workspace_name, info, is_part_of_user_delete=False):
access_rights_changes['write']['EVERYONE'],
info.get("image_mosaic"),
external_table_uri,
info.get("file_path"),
info.get("geodata_type"),
id_workspace,
info.get("name"),
Expand Down Expand Up @@ -671,11 +676,12 @@ def set_bbox(workspace, publication_type, publication, bbox, crs, ):

def set_geodata_type(workspace, publication_type, publication, geodata_type, ):
query = f'''update {DB_SCHEMA}.publications set
geodata_type = %s
geodata_type = %s,
file_path = CASE WHEN %s = %s THEN NULL ELSE file_path END
where type = %s
and name = %s
and id_workspace = (select w.id from {DB_SCHEMA}.workspaces w where w.name = %s);'''
params = (geodata_type, publication_type, publication, workspace,)
params = (geodata_type, geodata_type, settings.GEODATA_TYPE_UNKNOWN, publication_type, publication, workspace,)
db_util.run_statement(query, params)


4 changes: 2 additions & 2 deletions src/layman/layer/__init__.py
@@ -68,7 +68,7 @@ def get_layer_patch_keys():
('layman.layer.prime_db_schema.table', InternalSourceTypeDef(info_items=[
'access_rights', 'name', 'workspace', 'title', 'uuid', 'bounding_box', 'style_type', 'native_crs',
'native_bounding_box', 'geodata_type', 'updated_at', 'id', 'type', 'image_mosaic', 'table_uri',
'original_data_source', 'wfs_wms_status', 'used_in_maps', 'description', 'created_at', 'is_public_workspace', ]),),
'original_data_source', 'wfs_wms_status', 'used_in_maps', 'description', 'created_at', 'is_public_workspace', 'file_path', ]),),
('layman.layer.filesystem.input_chunk', InternalSourceTypeDef(info_items=['file', ]),),
('layman.layer.filesystem.input_file', InternalSourceTypeDef(info_items=['file', ]),),
('layman.layer.filesystem.input_style', InternalSourceTypeDef(info_items=[]),),
@@ -128,7 +128,7 @@ def get_layer_patch_keys():
settings.GEODATA_TYPE_RASTER: {
'name', 'uuid', 'layman_metadata', 'url', 'title', 'description', 'updated_at', 'wms', 'thumbnail', 'file', 'metadata',
'style', 'access_rights', 'bounding_box', 'native_crs', 'native_bounding_box', 'image_mosaic',
'original_data_source', 'geodata_type', 'used_in_maps',
'original_data_source', 'geodata_type', 'used_in_maps', 'file_path',
},
settings.GEODATA_TYPE_UNKNOWN: {
'name', 'uuid', 'layman_metadata', 'url', 'title', 'description', 'updated_at', 'wms', 'thumbnail', 'file', 'metadata',
96 changes: 74 additions & 22 deletions src/layman/layer/filesystem/gdal.py
@@ -21,8 +21,63 @@ def get_layer_info(workspace, layername, *, extra_keys=None):
return get_layer_info_by_uuid(publ_uuid, extra_keys=extra_keys) if publ_uuid else {}


def to_gs_path(path):
if path.startswith(settings.GEOSERVER_DATADIR):
return os.path.relpath(path, settings.GEOSERVER_DATADIR)
return path


def add_file_extra_keys(file_dict, gdal_path, gdal_paths_for_stats, extra_keys, normalized_gdal_path=None):
if '_file.color_interpretations' in extra_keys:
file_dict['color_interpretations'] = get_color_interpretations(gdal_path)
if '_file.mask_flags' in extra_keys:
file_dict['mask_flags'] = get_mask_flags(gdal_path)
norm_file_dict = {}
if '_file.normalized_file.stats' in extra_keys:
norm_file_dict['stats'] = get_file_list_statistics(gdal_paths_for_stats)
normalized_path = normalized_gdal_path
if normalized_path is None:
normalized_path = gdal_paths_for_stats[0] if isinstance(gdal_paths_for_stats, list) and len(gdal_paths_for_stats) > 0 else gdal_path
if '_file.normalized_file.mask_flags' in extra_keys:
norm_file_dict['mask_flags'] = get_mask_flags(normalized_path)
if '_file.normalized_file.color_interpretations' in extra_keys:
norm_file_dict['color_interpretations'] = get_color_interpretations(normalized_path)
if '_file.normalized_file.nodata_value' in extra_keys:
norm_file_dict['nodata_value'] = get_nodata_value(normalized_path)
if norm_file_dict:
file_dict['normalized_file'] = norm_file_dict


def get_file_path_layer_info(file_path_info_list, extra_keys):
file_path_info = file_path_info_list[0]
gdal_path = file_path_info['gdal']
paths_dict = {
os.path.basename(info['gdal']): {
'normalized_absolute': info['absolute'],
'normalized_geoserver': to_gs_path(info['gdal']),
}
for info in file_path_info_list
}

result = {
'_file': {
'paths': paths_dict,
},
}
file_dict = result['_file']
gdal_paths_for_stats = [info['gdal'] for info in file_path_info_list]
normalized_gdal_path = gdal_paths_for_stats[0] if gdal_paths_for_stats else gdal_path
add_file_extra_keys(file_dict, gdal_path, gdal_paths_for_stats, extra_keys, normalized_gdal_path=normalized_gdal_path)
return result


def get_layer_info_by_uuid(publ_uuid, *, extra_keys=None):
extra_keys = extra_keys or []
file_path_info_list = input_file.get_file_path_info(publ_uuid)

if file_path_info_list:
return get_file_path_layer_info(file_path_info_list, extra_keys)

gdal_paths = get_normalized_raster_layer_main_filepaths(publ_uuid)
gs_directory = get_normalized_raster_layer_dir(publ_uuid, geoserver=True)
result = {}
@@ -43,25 +98,8 @@ def get_layer_info_by_uuid(publ_uuid, *, extra_keys=None):
input_file_info = input_file.get_layer_info_by_uuid(publ_uuid)
result['_file']['file_type'] = input_file_info['_file']['file_type']
input_file_gdal_path = next(iter(input_file_info['_file']['paths'].values()))['gdal']
if '_file.color_interpretations' in extra_keys:
file_dict['color_interpretations'] = get_color_interpretations(input_file_gdal_path)
if '_file.mask_flags' in extra_keys:
file_dict['mask_flags'] = get_mask_flags(input_file_gdal_path)
norm_file_dict = {}
if '_file.normalized_file.stats' in extra_keys:
stats = get_file_list_statistics(gdal_paths)
norm_file_dict['stats'] = stats
if '_file.normalized_file.mask_flags' in extra_keys:
mask_flags = get_mask_flags(gdal_paths[0])
norm_file_dict['mask_flags'] = mask_flags
if '_file.normalized_file.color_interpretations' in extra_keys:
color_interpretations = get_color_interpretations(gdal_paths[0])
norm_file_dict['color_interpretations'] = color_interpretations
if '_file.normalized_file.nodata_value' in extra_keys:
nodata_value = get_nodata_value(gdal_paths[0])
norm_file_dict['nodata_value'] = nodata_value
if norm_file_dict:
file_dict['normalized_file'] = norm_file_dict
normalized_gdal_path = gdal_paths[0] if len(gdal_paths) > 0 else input_file_gdal_path
add_file_extra_keys(file_dict, input_file_gdal_path, gdal_paths, extra_keys, normalized_gdal_path=normalized_gdal_path)
return result


@@ -461,14 +499,28 @@ def get_bbox_from_files(filepaths):
return result


def get_layer_filepaths(publ_uuid):
file_path_info_list = input_file.get_file_path_info(publ_uuid)
if file_path_info_list:
return [info['gdal'] for info in file_path_info_list]
return get_normalized_raster_layer_main_filepaths(publ_uuid)


def get_layer_filepath(publ_uuid):
file_path_info_list = input_file.get_file_path_info(publ_uuid)
if file_path_info_list:
return file_path_info_list[0]['gdal']
return get_normalized_raster_layer_main_filepaths(publ_uuid)[0]


def get_bbox(publ_uuid):
filepaths = get_normalized_raster_layer_main_filepaths(publ_uuid)
filepaths = get_layer_filepaths(publ_uuid)
result = get_bbox_from_files(filepaths)
return result


def get_crs(publ_uuid):
filepath = get_normalized_raster_layer_main_filepaths(publ_uuid)[0]
filepath = get_layer_filepath(publ_uuid)
data = open_raster_file(filepath, gdalconst.GA_ReadOnly)
spatial_reference = osr.SpatialReference(wkt=data.GetProjection())
auth_name = spatial_reference.GetAttrValue('AUTHORITY')
@@ -478,7 +530,7 @@ def get_crs(publ_uuid):


def get_normalized_ground_sample_distance_in_m(publ_uuid, *, bbox_size):
filepath = get_normalized_raster_layer_main_filepaths(publ_uuid)[0]
filepath = get_layer_filepath(publ_uuid)
raster_size = get_raster_size(filepath)
pixel_size = [bbox_size / raster_size[idx] for (idx, bbox_size) in enumerate(bbox_size)]
distance_value = sum(pixel_size) / len(pixel_size)