diff --git a/bindings/java/src/README.md b/bindings/java/src/README.md
index c2e43908..66338cf7 100644
--- a/bindings/java/src/README.md
+++ b/bindings/java/src/README.md
@@ -243,6 +243,7 @@ Class | Method | HTTP request | Description
*PackagesApi* | [**packagesUploadDart**](docs/PackagesApi.md#packagesUploadDart) | **POST** /packages/{owner}/{repo}/upload/dart/ | Create a new Dart package
*PackagesApi* | [**packagesUploadDeb**](docs/PackagesApi.md#packagesUploadDeb) | **POST** /packages/{owner}/{repo}/upload/deb/ | Create a new Debian package
*PackagesApi* | [**packagesUploadDocker**](docs/PackagesApi.md#packagesUploadDocker) | **POST** /packages/{owner}/{repo}/upload/docker/ | Create a new Docker package
+*PackagesApi* | [**packagesUploadGeneric**](docs/PackagesApi.md#packagesUploadGeneric) | **POST** /packages/{owner}/{repo}/upload/generic/ | Create a new Generic package
*PackagesApi* | [**packagesUploadGo**](docs/PackagesApi.md#packagesUploadGo) | **POST** /packages/{owner}/{repo}/upload/go/ | Create a new Go package
*PackagesApi* | [**packagesUploadHelm**](docs/PackagesApi.md#packagesUploadHelm) | **POST** /packages/{owner}/{repo}/upload/helm/ | Create a new Helm package
*PackagesApi* | [**packagesUploadHex**](docs/PackagesApi.md#packagesUploadHex) | **POST** /packages/{owner}/{repo}/upload/hex/ | Create a new Hex package
@@ -269,6 +270,7 @@ Class | Method | HTTP request | Description
*PackagesApi* | [**packagesValidateUploadDart**](docs/PackagesApi.md#packagesValidateUploadDart) | **POST** /packages/{owner}/{repo}/validate-upload/dart/ | Validate parameters for create Dart package
*PackagesApi* | [**packagesValidateUploadDeb**](docs/PackagesApi.md#packagesValidateUploadDeb) | **POST** /packages/{owner}/{repo}/validate-upload/deb/ | Validate parameters for create Debian package
*PackagesApi* | [**packagesValidateUploadDocker**](docs/PackagesApi.md#packagesValidateUploadDocker) | **POST** /packages/{owner}/{repo}/validate-upload/docker/ | Validate parameters for create Docker package
+*PackagesApi* | [**packagesValidateUploadGeneric**](docs/PackagesApi.md#packagesValidateUploadGeneric) | **POST** /packages/{owner}/{repo}/validate-upload/generic/ | Validate parameters for create Generic package
*PackagesApi* | [**packagesValidateUploadGo**](docs/PackagesApi.md#packagesValidateUploadGo) | **POST** /packages/{owner}/{repo}/validate-upload/go/ | Validate parameters for create Go package
*PackagesApi* | [**packagesValidateUploadHelm**](docs/PackagesApi.md#packagesValidateUploadHelm) | **POST** /packages/{owner}/{repo}/validate-upload/helm/ | Validate parameters for create Helm package
*PackagesApi* | [**packagesValidateUploadHex**](docs/PackagesApi.md#packagesValidateUploadHex) | **POST** /packages/{owner}/{repo}/validate-upload/hex/ | Validate parameters for create Hex package
@@ -359,6 +361,12 @@ Class | Method | HTTP request | Description
*ReposApi* | [**reposUpstreamDockerPartialUpdate**](docs/ReposApi.md#reposUpstreamDockerPartialUpdate) | **PATCH** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Partially update a Docker upstream config for this repository.
*ReposApi* | [**reposUpstreamDockerRead**](docs/ReposApi.md#reposUpstreamDockerRead) | **GET** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Retrieve a Docker upstream config for this repository.
*ReposApi* | [**reposUpstreamDockerUpdate**](docs/ReposApi.md#reposUpstreamDockerUpdate) | **PUT** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Update a Docker upstream config for this repository.
+*ReposApi* | [**reposUpstreamGenericCreate**](docs/ReposApi.md#reposUpstreamGenericCreate) | **POST** /repos/{owner}/{identifier}/upstream/generic/ | Create a Generic upstream config for this repository.
+*ReposApi* | [**reposUpstreamGenericDelete**](docs/ReposApi.md#reposUpstreamGenericDelete) | **DELETE** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Delete a Generic upstream config for this repository.
+*ReposApi* | [**reposUpstreamGenericList**](docs/ReposApi.md#reposUpstreamGenericList) | **GET** /repos/{owner}/{identifier}/upstream/generic/ | List Generic upstream configs for this repository.
+*ReposApi* | [**reposUpstreamGenericPartialUpdate**](docs/ReposApi.md#reposUpstreamGenericPartialUpdate) | **PATCH** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Partially update a Generic upstream config for this repository.
+*ReposApi* | [**reposUpstreamGenericRead**](docs/ReposApi.md#reposUpstreamGenericRead) | **GET** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Retrieve a Generic upstream config for this repository.
+*ReposApi* | [**reposUpstreamGenericUpdate**](docs/ReposApi.md#reposUpstreamGenericUpdate) | **PUT** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Update a Generic upstream config for this repository.
*ReposApi* | [**reposUpstreamGoCreate**](docs/ReposApi.md#reposUpstreamGoCreate) | **POST** /repos/{owner}/{identifier}/upstream/go/ | Create a Go upstream config for this repository.
*ReposApi* | [**reposUpstreamGoDelete**](docs/ReposApi.md#reposUpstreamGoDelete) | **DELETE** /repos/{owner}/{identifier}/upstream/go/{slug_perm}/ | Delete a Go upstream config for this repository.
*ReposApi* | [**reposUpstreamGoList**](docs/ReposApi.md#reposUpstreamGoList) | **GET** /repos/{owner}/{identifier}/upstream/go/ | List Go upstream configs for this repository.
@@ -511,6 +519,11 @@ Class | Method | HTTP request | Description
- [Format](docs/Format.md)
- [FormatSupport](docs/FormatSupport.md)
- [FormatSupportUpstream](docs/FormatSupportUpstream.md)
+ - [GenericPackageUpload](docs/GenericPackageUpload.md)
+ - [GenericPackageUploadRequest](docs/GenericPackageUploadRequest.md)
+ - [GenericUpstream](docs/GenericUpstream.md)
+ - [GenericUpstreamRequest](docs/GenericUpstreamRequest.md)
+ - [GenericUpstreamRequestPatch](docs/GenericUpstreamRequestPatch.md)
- [GeoIpLocation](docs/GeoIpLocation.md)
- [GoPackageUpload](docs/GoPackageUpload.md)
- [GoPackageUploadRequest](docs/GoPackageUploadRequest.md)
diff --git a/bindings/java/src/docs/FormatSupport.md b/bindings/java/src/docs/FormatSupport.md
index fe82f333..4097c5a9 100644
--- a/bindings/java/src/docs/FormatSupport.md
+++ b/bindings/java/src/docs/FormatSupport.md
@@ -7,6 +7,7 @@ Name | Type | Description | Notes
**dependencies** | **Boolean** | If true the package format supports dependencies |
**distributions** | **Boolean** | If true the package format supports distributions |
**fileLists** | **Boolean** | If true the package format supports file lists |
+**filepaths** | **Boolean** | If true the package format supports filepaths |
**metadata** | **Boolean** | If true the package format supports metadata |
**upstreams** | [**FormatSupportUpstream**](FormatSupportUpstream.md) | |
**versioning** | **Boolean** | If true the package format supports versioning |
diff --git a/bindings/java/src/docs/GenericPackageUpload.md b/bindings/java/src/docs/GenericPackageUpload.md
new file mode 100644
index 00000000..2b0bab41
--- /dev/null
+++ b/bindings/java/src/docs/GenericPackageUpload.md
@@ -0,0 +1,106 @@
+
+# GenericPackageUpload
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**architectures** | [**List<Architecture>**](Architecture.md) | | [optional]
+**cdnUrl** | **String** | | [optional]
+**checksumMd5** | **String** | | [optional]
+**checksumSha1** | **String** | | [optional]
+**checksumSha256** | **String** | | [optional]
+**checksumSha512** | **String** | | [optional]
+**dependenciesChecksumMd5** | **String** | A checksum of all of the package's dependencies. | [optional]
+**dependenciesUrl** | **String** | | [optional]
+**description** | **String** | A textual description of this package. | [optional]
+**displayName** | **String** | | [optional]
+**distro** | [**Distribution**](Distribution.md) | | [optional]
+**distroVersion** | [**DistributionVersion**](DistributionVersion.md) | | [optional]
+**downloads** | **java.math.BigInteger** | | [optional]
+**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
+**extension** | **String** | | [optional]
+**filename** | **String** | | [optional]
+**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
+**format** | **String** | | [optional]
+**formatUrl** | **String** | | [optional]
+**freeableStorage** | **java.math.BigInteger** | Amount of storage that will be freed if this package is deleted | [optional]
+**fullyQualifiedName** | **String** | | [optional]
+**identifierPerm** | **String** | Unique and permanent identifier for the package. | [optional]
+**identifiers** | **Map<String, String>** | Return a map of identifier field names and their values. | [optional]
+**indexed** | **Boolean** | | [optional]
+**isCancellable** | **Boolean** | | [optional]
+**isCopyable** | **Boolean** | | [optional]
+**isDeleteable** | **Boolean** | | [optional]
+**isDownloadable** | **Boolean** | | [optional]
+**isMoveable** | **Boolean** | | [optional]
+**isQuarantinable** | **Boolean** | | [optional]
+**isQuarantined** | **Boolean** | | [optional]
+**isResyncable** | **Boolean** | | [optional]
+**isSecurityScannable** | **Boolean** | | [optional]
+**isSyncAwaiting** | **Boolean** | | [optional]
+**isSyncCompleted** | **Boolean** | | [optional]
+**isSyncFailed** | **Boolean** | | [optional]
+**isSyncInFlight** | **Boolean** | | [optional]
+**isSyncInProgress** | **Boolean** | | [optional]
+**license** | **String** | The license of this package. | [optional]
+**name** | **String** | The name of this package. | [optional]
+**namespace** | **String** | | [optional]
+**namespaceUrl** | **String** | | [optional]
+**numFiles** | **java.math.BigInteger** | | [optional]
+**originRepository** | **String** | | [optional]
+**originRepositoryUrl** | **String** | | [optional]
+**packageType** | **java.math.BigInteger** | The type of package contents. | [optional]
+**policyViolated** | **Boolean** | Whether or not the package has violated any policy. | [optional]
+**rawLicense** | **String** | The raw license string. | [optional]
+**release** | **String** | The release of the package version (if any). | [optional]
+**repository** | **String** | | [optional]
+**repositoryUrl** | **String** | | [optional]
+**securityScanCompletedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the security scanning was completed. | [optional]
+**securityScanStartedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the security scanning was started. | [optional]
+**securityScanStatus** | [**SecurityScanStatusEnum**](#SecurityScanStatusEnum) | | [optional]
+**securityScanStatusUpdatedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the security scanning status was updated. | [optional]
+**selfHtmlUrl** | **String** | | [optional]
+**selfUrl** | **String** | | [optional]
+**signatureUrl** | **String** | | [optional]
+**size** | **java.math.BigInteger** | The calculated size of the package. | [optional]
+**slug** | **String** | The public unique identifier for the package. | [optional]
+**slugPerm** | **String** | | [optional]
+**spdxLicense** | **String** | The SPDX license identifier for this package. | [optional]
+**stage** | **java.math.BigInteger** | The synchronisation (in progress) stage of the package. | [optional]
+**stageStr** | **String** | | [optional]
+**stageUpdatedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the package stage was updated at. | [optional]
+**status** | **java.math.BigInteger** | The synchronisation status of the package. | [optional]
+**statusReason** | **String** | A textual description of the synchronisation status reason (if any). | [optional]
+**statusStr** | **String** | | [optional]
+**statusUpdatedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the package status was updated at. | [optional]
+**statusUrl** | **String** | | [optional]
+**subtype** | **String** | | [optional]
+**summary** | **String** | A one-liner synopsis of this package. | [optional]
+**syncFinishedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the package sync was finished at. | [optional]
+**syncProgress** | **java.math.BigInteger** | Synchronisation progress (from 0-100) | [optional]
+**tagsAutomatic** | [**Tags**](Tags.md) | | [optional]
+**tagsImmutable** | [**Tags**](Tags.md) | | [optional]
+**typeDisplay** | **String** | | [optional]
+**uploadedAt** | [**OffsetDateTime**](OffsetDateTime.md) | The date this package was uploaded. | [optional]
+**uploader** | **String** | | [optional]
+**uploaderUrl** | **String** | | [optional]
+**version** | **String** | The raw version for this package. | [optional]
+**versionOrig** | **String** | | [optional]
+**vulnerabilityScanResultsUrl** | **String** | | [optional]
+
+
+
+## Enum: SecurityScanStatusEnum
+Name | Value
+---- | -----
+AWAITING_SECURITY_SCAN | "Awaiting Security Scan"
+SECURITY_SCANNING_IN_PROGRESS | "Security Scanning in Progress"
+SCAN_DETECTED_VULNERABILITIES | "Scan Detected Vulnerabilities"
+SCAN_DETECTED_NO_VULNERABILITIES | "Scan Detected No Vulnerabilities"
+SECURITY_SCANNING_DISABLED | "Security Scanning Disabled"
+SECURITY_SCANNING_FAILED | "Security Scanning Failed"
+SECURITY_SCANNING_SKIPPED | "Security Scanning Skipped"
+SECURITY_SCANNING_NOT_SUPPORTED | "Security Scanning Not Supported"
+
+
+
diff --git a/bindings/java/src/docs/GenericPackageUploadRequest.md b/bindings/java/src/docs/GenericPackageUploadRequest.md
new file mode 100644
index 00000000..d5906cee
--- /dev/null
+++ b/bindings/java/src/docs/GenericPackageUploadRequest.md
@@ -0,0 +1,15 @@
+
+# GenericPackageUploadRequest
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**filepath** | **String** | The full filepath of the package including filename. |
+**name** | **String** | The name of this package. | [optional]
+**packageFile** | **String** | The primary file for the package. |
+**republish** | **Boolean** | If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate. | [optional]
+**tags** | **String** | A comma-separated list of tags to add to the package. | [optional]
+**version** | **String** | The raw version for this package. | [optional]
+
+
+
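+## Example
+
+A minimal sketch of populating the two required fields (`filepath` and `packageFile`) before calling [packagesUploadGeneric](PackagesApi.md#packagesUploadGeneric). The bean-style setters and the illustrative values are assumptions rather than part of the generated model documentation above; the exact value expected for `packageFile` depends on your upload workflow.
+
+```java
+// Authentication is configured beforehand, as in the PackagesApi.md examples.
+GenericPackageUploadRequest data = new GenericPackageUploadRequest();
+data.setFilepath("bin/utils/tool.tar.gz"); // required: full filepath including the filename
+data.setPackageFile("tool.tar.gz");        // required: the primary file for the package (a String in this model)
+data.setVersion("1.0.0");                  // optional: raw version string
+data.setTags("stable,tooling");            // optional: comma-separated tags
+
+PackagesApi apiInstance = new PackagesApi();
+try {
+    GenericPackageUpload result = apiInstance.packagesUploadGeneric("owner_example", "repo_example", data);
+    System.out.println(result);
+} catch (ApiException e) {
+    System.err.println("Exception when calling PackagesApi#packagesUploadGeneric");
+    e.printStackTrace();
+}
+```
+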
diff --git a/bindings/java/src/docs/GenericUpstream.md b/bindings/java/src/docs/GenericUpstream.md
new file mode 100644
index 00000000..04d7a034
--- /dev/null
+++ b/bindings/java/src/docs/GenericUpstream.md
@@ -0,0 +1,62 @@
+
+# GenericUpstream
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**authMode** | [**AuthModeEnum**](#AuthModeEnum) | The authentication mode to use when accessing this upstream. | [optional]
+**authSecret** | **String** | Secret to provide with requests to upstream. | [optional]
+**authUsername** | **String** | Username to provide with requests to upstream. | [optional]
+**available** | **String** | | [optional]
+**canReindex** | **String** | | [optional]
+**createdAt** | [**OffsetDateTime**](OffsetDateTime.md) | The datetime the upstream source was created. | [optional]
+**disableReason** | [**DisableReasonEnum**](#DisableReasonEnum) | | [optional]
+**disableReasonText** | **String** | Human-readable explanation of why this upstream is disabled | [optional]
+**extraHeader1** | **String** | The key for extra header #1 to send to upstream. | [optional]
+**extraHeader2** | **String** | The key for extra header #2 to send to upstream. | [optional]
+**extraValue1** | **String** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extraValue2** | **String** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**hasFailedSignatureVerification** | **String** | | [optional]
+**indexPackageCount** | **String** | The number of packages available in this upstream source | [optional]
+**indexStatus** | **String** | The current indexing status of this upstream source | [optional]
+**isActive** | **Boolean** | Whether or not this upstream is active and ready for requests. | [optional]
+**lastIndexed** | **String** | The last time this upstream source was indexed | [optional]
+**mode** | [**ModeEnum**](#ModeEnum) | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional]
+**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
+**pendingValidation** | **Boolean** | When true, this upstream source is pending validation. | [optional]
+**priority** | **java.math.BigInteger** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**slugPerm** | **String** | | [optional]
+**updatedAt** | [**OffsetDateTime**](OffsetDateTime.md) | | [optional]
+**upstreamPrefix** | **String** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstreamUrl** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
+**verifySsl** | **Boolean** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+
+
+## Enum: AuthModeEnum
+Name | Value
+---- | -----
+NONE | "None"
+USERNAME_AND_PASSWORD | "Username and Password"
+TOKEN | "Token"
+
+
+
+## Enum: DisableReasonEnum
+Name | Value
+---- | -----
+N_A | "N/A"
+UPSTREAM_POINTS_TO_ITS_OWN_REPOSITORY | "Upstream points to its own repository"
+MISSING_UPSTREAM_SOURCE | "Missing upstream source"
+UPSTREAM_WAS_DISABLED_BY_REQUEST_OF_USER | "Upstream was disabled by request of user"
+
+
+
+## Enum: ModeEnum
+Name | Value
+---- | -----
+PROXY_ONLY | "Proxy Only"
+CACHE_AND_PROXY | "Cache and Proxy"
+
+
+
diff --git a/bindings/java/src/docs/GenericUpstreamRequest.md b/bindings/java/src/docs/GenericUpstreamRequest.md
new file mode 100644
index 00000000..f3915d5e
--- /dev/null
+++ b/bindings/java/src/docs/GenericUpstreamRequest.md
@@ -0,0 +1,40 @@
+
+# GenericUpstreamRequest
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**authMode** | [**AuthModeEnum**](#AuthModeEnum) | The authentication mode to use when accessing this upstream. | [optional]
+**authSecret** | **String** | Secret to provide with requests to upstream. | [optional]
+**authUsername** | **String** | Username to provide with requests to upstream. | [optional]
+**extraHeader1** | **String** | The key for extra header #1 to send to upstream. | [optional]
+**extraHeader2** | **String** | The key for extra header #2 to send to upstream. | [optional]
+**extraValue1** | **String** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extraValue2** | **String** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**isActive** | **Boolean** | Whether or not this upstream is active and ready for requests. | [optional]
+**mode** | [**ModeEnum**](#ModeEnum) | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional]
+**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
+**priority** | **java.math.BigInteger** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**upstreamPrefix** | **String** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstreamUrl** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
+**verifySsl** | **Boolean** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+
+
+## Enum: AuthModeEnum
+Name | Value
+---- | -----
+NONE | "None"
+USERNAME_AND_PASSWORD | "Username and Password"
+TOKEN | "Token"
+
+
+
+## Enum: ModeEnum
+Name | Value
+---- | -----
+PROXY_ONLY | "Proxy Only"
+CACHE_AND_PROXY | "Cache and Proxy"
+
+
+
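+## Example
+
+A brief sketch of creating a Generic upstream with [reposUpstreamGenericCreate](ReposApi.md#reposUpstreamGenericCreate). The bean-style setters, the nested `ModeEnum` access, and the example values are assumptions based on the generator's usual output rather than anything documented above.
+
+```java
+// Authentication is configured beforehand, as in the ReposApi.md examples.
+GenericUpstreamRequest data = new GenericUpstreamRequest();
+data.setName("Example file server");              // required: descriptive name, also used to tag cached packages
+data.setUpstreamUrl("https://files.example.com"); // required: fully qualified URL to the repository root
+data.setUpstreamPrefix("example");                // requests including this prefix are routed to this source
+data.setMode(GenericUpstreamRequest.ModeEnum.CACHE_AND_PROXY); // optional: proxy/cache behaviour
+
+ReposApi apiInstance = new ReposApi();
+try {
+    GenericUpstream result = apiInstance.reposUpstreamGenericCreate("owner_example", "identifier_example", data);
+    System.out.println(result);
+} catch (ApiException e) {
+    System.err.println("Exception when calling ReposApi#reposUpstreamGenericCreate");
+    e.printStackTrace();
+}
+```
+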
diff --git a/bindings/java/src/docs/GenericUpstreamRequestPatch.md b/bindings/java/src/docs/GenericUpstreamRequestPatch.md
new file mode 100644
index 00000000..d68337ea
--- /dev/null
+++ b/bindings/java/src/docs/GenericUpstreamRequestPatch.md
@@ -0,0 +1,40 @@
+
+# GenericUpstreamRequestPatch
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**authMode** | [**AuthModeEnum**](#AuthModeEnum) | The authentication mode to use when accessing this upstream. | [optional]
+**authSecret** | **String** | Secret to provide with requests to upstream. | [optional]
+**authUsername** | **String** | Username to provide with requests to upstream. | [optional]
+**extraHeader1** | **String** | The key for extra header #1 to send to upstream. | [optional]
+**extraHeader2** | **String** | The key for extra header #2 to send to upstream. | [optional]
+**extraValue1** | **String** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extraValue2** | **String** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**isActive** | **Boolean** | Whether or not this upstream is active and ready for requests. | [optional]
+**mode** | [**ModeEnum**](#ModeEnum) | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional]
+**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. | [optional]
+**priority** | **java.math.BigInteger** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**upstreamPrefix** | **String** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstreamUrl** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. | [optional]
+**verifySsl** | **Boolean** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+
+
+## Enum: AuthModeEnum
+Name | Value
+---- | -----
+NONE | "None"
+USERNAME_AND_PASSWORD | "Username and Password"
+TOKEN | "Token"
+
+
+
+## Enum: ModeEnum
+Name | Value
+---- | -----
+PROXY_ONLY | "Proxy Only"
+CACHE_AND_PROXY | "Cache and Proxy"
+
+
+
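+## Example
+
+A short sketch of a partial update with [reposUpstreamGenericPartialUpdate](ReposApi.md#reposUpstreamGenericPartialUpdate); only the fields set on the patch object are sent, which is the usual PATCH semantics assumed here. Setter names follow the generator's typical output and are not documented above.
+
+```java
+// Authentication is configured beforehand, as in the ReposApi.md examples.
+GenericUpstreamRequestPatch patch = new GenericUpstreamRequestPatch();
+patch.setIsActive(false);                            // temporarily disable this upstream
+patch.setPriority(java.math.BigInteger.valueOf(2));  // resolve after higher-priority sources
+
+ReposApi apiInstance = new ReposApi();
+try {
+    GenericUpstream result = apiInstance.reposUpstreamGenericPartialUpdate("owner_example", "identifier_example", "slugPerm_example", patch);
+    System.out.println(result);
+} catch (ApiException e) {
+    System.err.println("Exception when calling ReposApi#reposUpstreamGenericPartialUpdate");
+    e.printStackTrace();
+}
+```
+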
diff --git a/bindings/java/src/docs/MavenUpstream.md b/bindings/java/src/docs/MavenUpstream.md
index 8d2cb163..f38eb296 100644
--- a/bindings/java/src/docs/MavenUpstream.md
+++ b/bindings/java/src/docs/MavenUpstream.md
@@ -30,6 +30,7 @@ Name | Type | Description | Notes
**pendingValidation** | **Boolean** | When true, this upstream source is pending validation. | [optional]
**priority** | **java.math.BigInteger** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
**slugPerm** | **String** | | [optional]
+**trustLevel** | [**TrustLevelEnum**](#TrustLevelEnum) | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional]
**updatedAt** | [**OffsetDateTime**](OffsetDateTime.md) | | [optional]
**upstreamUrl** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
**verificationStatus** | [**VerificationStatusEnum**](#VerificationStatusEnum) | The signature verification status for this upstream. | [optional]
@@ -72,6 +73,14 @@ CACHE_AND_PROXY | "Cache and Proxy"
CACHE_ONLY | "Cache Only"
+
+## Enum: TrustLevelEnum
+Name | Value
+---- | -----
+TRUSTED | "Trusted"
+UNTRUSTED | "Untrusted"
+
+
## Enum: VerificationStatusEnum
Name | Value
diff --git a/bindings/java/src/docs/MavenUpstreamRequest.md b/bindings/java/src/docs/MavenUpstreamRequest.md
index f38d1adf..2e2139b2 100644
--- a/bindings/java/src/docs/MavenUpstreamRequest.md
+++ b/bindings/java/src/docs/MavenUpstreamRequest.md
@@ -18,6 +18,7 @@ Name | Type | Description | Notes
**mode** | [**ModeEnum**](#ModeEnum) | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional]
**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
**priority** | **java.math.BigInteger** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**trustLevel** | [**TrustLevelEnum**](#TrustLevelEnum) | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional]
**upstreamUrl** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
**verifySsl** | **Boolean** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
@@ -48,4 +49,12 @@ CACHE_AND_PROXY | "Cache and Proxy"
CACHE_ONLY | "Cache Only"
+
+## Enum: TrustLevelEnum
+Name | Value
+---- | -----
+TRUSTED | "Trusted"
+UNTRUSTED | "Untrusted"
+
+
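+## Example
+
+A minimal sketch of setting the trust level when building a `MavenUpstreamRequest` for the Maven upstream endpoints in ReposApi.md. The setter and nested enum naming are assumptions based on the generator's usual output; `UNTRUSTED` is the recommended default per the description above.
+
+```java
+MavenUpstreamRequest data = new MavenUpstreamRequest();
+data.setName("Maven Central");                          // required: descriptive name for the upstream source
+data.setUpstreamUrl("https://repo1.maven.org/maven2");  // required: fully qualified URL to the repository root
+data.setTrustLevel(MavenUpstreamRequest.TrustLevelEnum.UNTRUSTED); // recommended default; helps guard against dependency confusion
+```
+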
diff --git a/bindings/java/src/docs/MavenUpstreamRequestPatch.md b/bindings/java/src/docs/MavenUpstreamRequestPatch.md
index ce67be51..2a0eef33 100644
--- a/bindings/java/src/docs/MavenUpstreamRequestPatch.md
+++ b/bindings/java/src/docs/MavenUpstreamRequestPatch.md
@@ -18,6 +18,7 @@ Name | Type | Description | Notes
**mode** | [**ModeEnum**](#ModeEnum) | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional]
**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. | [optional]
**priority** | **java.math.BigInteger** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**trustLevel** | [**TrustLevelEnum**](#TrustLevelEnum) | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional]
**upstreamUrl** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. | [optional]
**verifySsl** | **Boolean** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
@@ -48,4 +49,12 @@ CACHE_AND_PROXY | "Cache and Proxy"
CACHE_ONLY | "Cache Only"
+
+## Enum: TrustLevelEnum
+Name | Value
+---- | -----
+TRUSTED | "Trusted"
+UNTRUSTED | "Untrusted"
+
+
diff --git a/bindings/java/src/docs/ModelPackage.md b/bindings/java/src/docs/ModelPackage.md
index a9946a29..2a4ba2c0 100644
--- a/bindings/java/src/docs/ModelPackage.md
+++ b/bindings/java/src/docs/ModelPackage.md
@@ -20,6 +20,7 @@ Name | Type | Description | Notes
**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**formatUrl** | **String** | | [optional]
diff --git a/bindings/java/src/docs/OrganizationTeam.md b/bindings/java/src/docs/OrganizationTeam.md
index 09a9c480..b6dd331b 100644
--- a/bindings/java/src/docs/OrganizationTeam.md
+++ b/bindings/java/src/docs/OrganizationTeam.md
@@ -4,7 +4,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **String** | | [optional]
+**description** | **String** | A detailed description of the team. | [optional]
**name** | **String** | A descriptive name for the team. |
**slug** | **String** | | [optional]
**slugPerm** | **String** | | [optional]
diff --git a/bindings/java/src/docs/OrganizationTeamRequest.md b/bindings/java/src/docs/OrganizationTeamRequest.md
index 7656938a..2e96b5a0 100644
--- a/bindings/java/src/docs/OrganizationTeamRequest.md
+++ b/bindings/java/src/docs/OrganizationTeamRequest.md
@@ -4,7 +4,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **String** | | [optional]
+**description** | **String** | A detailed description of the team. | [optional]
**name** | **String** | A descriptive name for the team. |
**slug** | **String** | | [optional]
**visibility** | [**VisibilityEnum**](#VisibilityEnum) | | [optional]
diff --git a/bindings/java/src/docs/OrganizationTeamRequestPatch.md b/bindings/java/src/docs/OrganizationTeamRequestPatch.md
index 2c5b4861..a2237a83 100644
--- a/bindings/java/src/docs/OrganizationTeamRequestPatch.md
+++ b/bindings/java/src/docs/OrganizationTeamRequestPatch.md
@@ -4,7 +4,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **String** | | [optional]
+**description** | **String** | A detailed description of the team. | [optional]
**name** | **String** | A descriptive name for the team. | [optional]
**slug** | **String** | | [optional]
**visibility** | [**VisibilityEnum**](#VisibilityEnum) | | [optional]
diff --git a/bindings/java/src/docs/PackageCopy.md b/bindings/java/src/docs/PackageCopy.md
index 01d3d17e..56931466 100644
--- a/bindings/java/src/docs/PackageCopy.md
+++ b/bindings/java/src/docs/PackageCopy.md
@@ -20,6 +20,7 @@ Name | Type | Description | Notes
**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**formatUrl** | **String** | | [optional]
diff --git a/bindings/java/src/docs/PackageCopyRequest.md b/bindings/java/src/docs/PackageCopyRequest.md
index 5dcbdfd9..8b0c3349 100644
--- a/bindings/java/src/docs/PackageCopyRequest.md
+++ b/bindings/java/src/docs/PackageCopyRequest.md
@@ -4,7 +4,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**destination** | **String** | |
+**destination** | **String** | The name of the destination repository without the namespace. |
**republish** | **Boolean** | If true, the package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate. | [optional]
diff --git a/bindings/java/src/docs/PackageMove.md b/bindings/java/src/docs/PackageMove.md
index 7823c75b..1a683d0b 100644
--- a/bindings/java/src/docs/PackageMove.md
+++ b/bindings/java/src/docs/PackageMove.md
@@ -20,6 +20,7 @@ Name | Type | Description | Notes
**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**formatUrl** | **String** | | [optional]
diff --git a/bindings/java/src/docs/PackageMoveRequest.md b/bindings/java/src/docs/PackageMoveRequest.md
index e949a34d..b19321e3 100644
--- a/bindings/java/src/docs/PackageMoveRequest.md
+++ b/bindings/java/src/docs/PackageMoveRequest.md
@@ -4,7 +4,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**destination** | **String** | |
+**destination** | **String** | The name of the destination repository without the namespace. |
diff --git a/bindings/java/src/docs/PackageQuarantine.md b/bindings/java/src/docs/PackageQuarantine.md
index 34dcd654..b8f8afa4 100644
--- a/bindings/java/src/docs/PackageQuarantine.md
+++ b/bindings/java/src/docs/PackageQuarantine.md
@@ -20,6 +20,7 @@ Name | Type | Description | Notes
**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**formatUrl** | **String** | | [optional]
diff --git a/bindings/java/src/docs/PackageResync.md b/bindings/java/src/docs/PackageResync.md
index 6f0bac1f..ea8f06d6 100644
--- a/bindings/java/src/docs/PackageResync.md
+++ b/bindings/java/src/docs/PackageResync.md
@@ -20,6 +20,7 @@ Name | Type | Description | Notes
**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**formatUrl** | **String** | | [optional]
diff --git a/bindings/java/src/docs/PackageTag.md b/bindings/java/src/docs/PackageTag.md
index 7c318b87..d35b3375 100644
--- a/bindings/java/src/docs/PackageTag.md
+++ b/bindings/java/src/docs/PackageTag.md
@@ -20,6 +20,7 @@ Name | Type | Description | Notes
**epoch** | **java.math.BigInteger** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**List<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**formatUrl** | **String** | | [optional]
diff --git a/bindings/java/src/docs/PackagesApi.md b/bindings/java/src/docs/PackagesApi.md
index d32792f0..02c67b67 100644
--- a/bindings/java/src/docs/PackagesApi.md
+++ b/bindings/java/src/docs/PackagesApi.md
@@ -27,6 +27,7 @@ Method | HTTP request | Description
[**packagesUploadDart**](PackagesApi.md#packagesUploadDart) | **POST** /packages/{owner}/{repo}/upload/dart/ | Create a new Dart package
[**packagesUploadDeb**](PackagesApi.md#packagesUploadDeb) | **POST** /packages/{owner}/{repo}/upload/deb/ | Create a new Debian package
[**packagesUploadDocker**](PackagesApi.md#packagesUploadDocker) | **POST** /packages/{owner}/{repo}/upload/docker/ | Create a new Docker package
+[**packagesUploadGeneric**](PackagesApi.md#packagesUploadGeneric) | **POST** /packages/{owner}/{repo}/upload/generic/ | Create a new Generic package
[**packagesUploadGo**](PackagesApi.md#packagesUploadGo) | **POST** /packages/{owner}/{repo}/upload/go/ | Create a new Go package
[**packagesUploadHelm**](PackagesApi.md#packagesUploadHelm) | **POST** /packages/{owner}/{repo}/upload/helm/ | Create a new Helm package
[**packagesUploadHex**](PackagesApi.md#packagesUploadHex) | **POST** /packages/{owner}/{repo}/upload/hex/ | Create a new Hex package
@@ -53,6 +54,7 @@ Method | HTTP request | Description
[**packagesValidateUploadDart**](PackagesApi.md#packagesValidateUploadDart) | **POST** /packages/{owner}/{repo}/validate-upload/dart/ | Validate parameters for create Dart package
[**packagesValidateUploadDeb**](PackagesApi.md#packagesValidateUploadDeb) | **POST** /packages/{owner}/{repo}/validate-upload/deb/ | Validate parameters for create Debian package
[**packagesValidateUploadDocker**](PackagesApi.md#packagesValidateUploadDocker) | **POST** /packages/{owner}/{repo}/validate-upload/docker/ | Validate parameters for create Docker package
+[**packagesValidateUploadGeneric**](PackagesApi.md#packagesValidateUploadGeneric) | **POST** /packages/{owner}/{repo}/validate-upload/generic/ | Validate parameters for create Generic package
[**packagesValidateUploadGo**](PackagesApi.md#packagesValidateUploadGo) | **POST** /packages/{owner}/{repo}/validate-upload/go/ | Validate parameters for create Go package
[**packagesValidateUploadHelm**](PackagesApi.md#packagesValidateUploadHelm) | **POST** /packages/{owner}/{repo}/validate-upload/helm/ | Validate parameters for create Helm package
[**packagesValidateUploadHex**](PackagesApi.md#packagesValidateUploadHex) | **POST** /packages/{owner}/{repo}/validate-upload/hex/ | Validate parameters for create Hex package
@@ -1563,6 +1565,70 @@ Name | Type | Description | Notes
[apikey](../README.md#apikey), [basic](../README.md#basic)
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **packagesUploadGeneric**
+> GenericPackageUpload packagesUploadGeneric(owner, repo, data)
+
+Create a new Generic package
+
+Create a new Generic package
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.PackagesApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+PackagesApi apiInstance = new PackagesApi();
+String owner = "owner_example"; // String |
+String repo = "repo_example"; // String |
+GenericPackageUploadRequest data = new GenericPackageUploadRequest(); // GenericPackageUploadRequest |
+try {
+ GenericPackageUpload result = apiInstance.packagesUploadGeneric(owner, repo, data);
+ System.out.println(result);
+} catch (ApiException e) {
+ System.err.println("Exception when calling PackagesApi#packagesUploadGeneric");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **repo** | **String**| |
+ **data** | [**GenericPackageUploadRequest**](GenericPackageUploadRequest.md)| | [optional]
+
+### Return type
+
+[**GenericPackageUpload**](GenericPackageUpload.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
### HTTP request headers
- **Content-Type**: application/json
@@ -3217,6 +3283,69 @@ null (empty response body)
[apikey](../README.md#apikey), [basic](../README.md#basic)
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **packagesValidateUploadGeneric**
+> packagesValidateUploadGeneric(owner, repo, data)
+
+Validate parameters for create Generic package
+
+Validate parameters for create Generic package
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.PackagesApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+PackagesApi apiInstance = new PackagesApi();
+String owner = "owner_example"; // String |
+String repo = "repo_example"; // String |
+GenericPackageUploadRequest data = new GenericPackageUploadRequest(); // GenericPackageUploadRequest |
+try {
+ apiInstance.packagesValidateUploadGeneric(owner, repo, data);
+} catch (ApiException e) {
+ System.err.println("Exception when calling PackagesApi#packagesValidateUploadGeneric");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **repo** | **String**| |
+ **data** | [**GenericPackageUploadRequest**](GenericPackageUploadRequest.md)| | [optional]
+
+### Return type
+
+null (empty response body)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
### HTTP request headers
- **Content-Type**: application/json
diff --git a/bindings/java/src/docs/ReposApi.md b/bindings/java/src/docs/ReposApi.md
index 9781e3d5..4b41bd78 100644
--- a/bindings/java/src/docs/ReposApi.md
+++ b/bindings/java/src/docs/ReposApi.md
@@ -73,6 +73,12 @@ Method | HTTP request | Description
[**reposUpstreamDockerPartialUpdate**](ReposApi.md#reposUpstreamDockerPartialUpdate) | **PATCH** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Partially update a Docker upstream config for this repository.
[**reposUpstreamDockerRead**](ReposApi.md#reposUpstreamDockerRead) | **GET** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Retrieve a Docker upstream config for this repository.
[**reposUpstreamDockerUpdate**](ReposApi.md#reposUpstreamDockerUpdate) | **PUT** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Update a Docker upstream config for this repository.
+[**reposUpstreamGenericCreate**](ReposApi.md#reposUpstreamGenericCreate) | **POST** /repos/{owner}/{identifier}/upstream/generic/ | Create a Generic upstream config for this repository.
+[**reposUpstreamGenericDelete**](ReposApi.md#reposUpstreamGenericDelete) | **DELETE** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Delete a Generic upstream config for this repository.
+[**reposUpstreamGenericList**](ReposApi.md#reposUpstreamGenericList) | **GET** /repos/{owner}/{identifier}/upstream/generic/ | List Generic upstream configs for this repository.
+[**reposUpstreamGenericPartialUpdate**](ReposApi.md#reposUpstreamGenericPartialUpdate) | **PATCH** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Partially update a Generic upstream config for this repository.
+[**reposUpstreamGenericRead**](ReposApi.md#reposUpstreamGenericRead) | **GET** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Retrieve a Generic upstream config for this repository.
+[**reposUpstreamGenericUpdate**](ReposApi.md#reposUpstreamGenericUpdate) | **PUT** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Update a Generic upstream config for this repository.
[**reposUpstreamGoCreate**](ReposApi.md#reposUpstreamGoCreate) | **POST** /repos/{owner}/{identifier}/upstream/go/ | Create a Go upstream config for this repository.
[**reposUpstreamGoDelete**](ReposApi.md#reposUpstreamGoDelete) | **DELETE** /repos/{owner}/{identifier}/upstream/go/{slug_perm}/ | Delete a Go upstream config for this repository.
[**reposUpstreamGoList**](ReposApi.md#reposUpstreamGoList) | **GET** /repos/{owner}/{identifier}/upstream/go/ | List Go upstream configs for this repository.
@@ -4562,6 +4568,395 @@ Name | Type | Description | Notes
[apikey](../README.md#apikey), [basic](../README.md#basic)
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **reposUpstreamGenericCreate**
+> GenericUpstream reposUpstreamGenericCreate(owner, identifier, data)
+
+Create a Generic upstream config for this repository.
+
+Create a Generic upstream config for this repository.
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.ReposApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+ReposApi apiInstance = new ReposApi();
+String owner = "owner_example"; // String |
+String identifier = "identifier_example"; // String |
+GenericUpstreamRequest data = new GenericUpstreamRequest(); // GenericUpstreamRequest |
+try {
+ GenericUpstream result = apiInstance.reposUpstreamGenericCreate(owner, identifier, data);
+ System.out.println(result);
+} catch (ApiException e) {
+ System.err.println("Exception when calling ReposApi#reposUpstreamGenericCreate");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **data** | [**GenericUpstreamRequest**](GenericUpstreamRequest.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **reposUpstreamGenericDelete**
+> reposUpstreamGenericDelete(owner, identifier, slugPerm)
+
+Delete a Generic upstream config for this repository.
+
+Delete a Generic upstream config for this repository.
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.ReposApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+ReposApi apiInstance = new ReposApi();
+String owner = "owner_example"; // String |
+String identifier = "identifier_example"; // String |
+String slugPerm = "slugPerm_example"; // String |
+try {
+ apiInstance.reposUpstreamGenericDelete(owner, identifier, slugPerm);
+} catch (ApiException e) {
+ System.err.println("Exception when calling ReposApi#reposUpstreamGenericDelete");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slugPerm** | **String**| |
+
+### Return type
+
+null (empty response body)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **reposUpstreamGenericList**
+> List<GenericUpstream> reposUpstreamGenericList(owner, identifier, page, pageSize)
+
+List Generic upstream configs for this repository.
+
+List Generic upstream configs for this repository.
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.ReposApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+ReposApi apiInstance = new ReposApi();
+String owner = "owner_example"; // String |
+String identifier = "identifier_example"; // String |
+java.math.BigInteger page = new java.math.BigInteger("1"); // java.math.BigInteger | A page number within the paginated result set.
+java.math.BigInteger pageSize = new java.math.BigInteger("30"); // java.math.BigInteger | Number of results to return per page.
+try {
+    List<GenericUpstream> result = apiInstance.reposUpstreamGenericList(owner, identifier, page, pageSize);
+ System.out.println(result);
+} catch (ApiException e) {
+ System.err.println("Exception when calling ReposApi#reposUpstreamGenericList");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **page** | **java.math.BigInteger**| A page number within the paginated result set. | [optional]
+ **pageSize** | **java.math.BigInteger**| Number of results to return per page. | [optional]
+
+### Return type
+
+[**List<GenericUpstream>**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **reposUpstreamGenericPartialUpdate**
+> GenericUpstream reposUpstreamGenericPartialUpdate(owner, identifier, slugPerm, data)
+
+Partially update a Generic upstream config for this repository.
+
+Partially update a Generic upstream config for this repository.
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.ReposApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+ReposApi apiInstance = new ReposApi();
+String owner = "owner_example"; // String |
+String identifier = "identifier_example"; // String |
+String slugPerm = "slugPerm_example"; // String |
+GenericUpstreamRequestPatch data = new GenericUpstreamRequestPatch(); // GenericUpstreamRequestPatch |
+try {
+ GenericUpstream result = apiInstance.reposUpstreamGenericPartialUpdate(owner, identifier, slugPerm, data);
+ System.out.println(result);
+} catch (ApiException e) {
+ System.err.println("Exception when calling ReposApi#reposUpstreamGenericPartialUpdate");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slugPerm** | **String**| |
+ **data** | [**GenericUpstreamRequestPatch**](GenericUpstreamRequestPatch.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **reposUpstreamGenericRead**
+> GenericUpstream reposUpstreamGenericRead(owner, identifier, slugPerm)
+
+Retrieve a Generic upstream config for this repository.
+
+Retrieve a Generic upstream config for this repository.
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.ReposApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+ReposApi apiInstance = new ReposApi();
+String owner = "owner_example"; // String |
+String identifier = "identifier_example"; // String |
+String slugPerm = "slugPerm_example"; // String |
+try {
+ GenericUpstream result = apiInstance.reposUpstreamGenericRead(owner, identifier, slugPerm);
+ System.out.println(result);
+} catch (ApiException e) {
+ System.err.println("Exception when calling ReposApi#reposUpstreamGenericRead");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slugPerm** | **String**| |
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+# **reposUpstreamGenericUpdate**
+> GenericUpstream reposUpstreamGenericUpdate(owner, identifier, slugPerm, data)
+
+Update a Generic upstream config for this repository.
+
+Update a Generic upstream config for this repository.
+
+### Example
+```java
+// Import classes:
+//import io.cloudsmith.api.ApiClient;
+//import io.cloudsmith.api.ApiException;
+//import io.cloudsmith.api.Configuration;
+//import io.cloudsmith.api.auth.*;
+//import io.cloudsmith.api.apis.ReposApi;
+
+ApiClient defaultClient = Configuration.getDefaultApiClient();
+
+// Configure API key authorization: apikey
+ApiKeyAuth apikey = (ApiKeyAuth) defaultClient.getAuthentication("apikey");
+apikey.setApiKey("YOUR API KEY");
+// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
+//apikey.setApiKeyPrefix("Token");
+
+// Configure HTTP basic authorization: basic
+HttpBasicAuth basic = (HttpBasicAuth) defaultClient.getAuthentication("basic");
+basic.setUsername("YOUR USERNAME");
+basic.setPassword("YOUR PASSWORD");
+
+ReposApi apiInstance = new ReposApi();
+String owner = "owner_example"; // String |
+String identifier = "identifier_example"; // String |
+String slugPerm = "slugPerm_example"; // String |
+GenericUpstreamRequest data = new GenericUpstreamRequest(); // GenericUpstreamRequest |
+try {
+ GenericUpstream result = apiInstance.reposUpstreamGenericUpdate(owner, identifier, slugPerm, data);
+ System.out.println(result);
+} catch (ApiException e) {
+ System.err.println("Exception when calling ReposApi#reposUpstreamGenericUpdate");
+ e.printStackTrace();
+}
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slugPerm** | **String**| |
+ **data** | [**GenericUpstreamRequest**](GenericUpstreamRequest.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
### HTTP request headers
- **Content-Type**: application/json
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/JSON.java b/bindings/java/src/src/main/java/io/cloudsmith/api/JSON.java
index 49786fd6..a141906c 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/JSON.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/JSON.java
@@ -302,6 +302,11 @@
import io.cloudsmith.api.models.*;
import io.cloudsmith.api.models.*;
import io.cloudsmith.api.models.*;
+import io.cloudsmith.api.models.*;
+import io.cloudsmith.api.models.*;
+import io.cloudsmith.api.models.*;
+import io.cloudsmith.api.models.*;
+import io.cloudsmith.api.models.*;
import okio.ByteString;
import java.io.IOException;
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/apis/PackagesApi.java b/bindings/java/src/src/main/java/io/cloudsmith/api/apis/PackagesApi.java
index c14ddca3..408851c5 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/apis/PackagesApi.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/apis/PackagesApi.java
@@ -57,6 +57,8 @@
import io.cloudsmith.api.models.DockerPackageUpload;
import io.cloudsmith.api.models.DockerPackageUploadRequest;
import io.cloudsmith.api.models.ErrorDetail;
+import io.cloudsmith.api.models.GenericPackageUpload;
+import io.cloudsmith.api.models.GenericPackageUploadRequest;
import io.cloudsmith.api.models.GoPackageUpload;
import io.cloudsmith.api.models.GoPackageUploadRequest;
import io.cloudsmith.api.models.HelmPackageUpload;
@@ -3639,6 +3641,155 @@ public void onRequestProgress(long bytesWritten, long contentLength, boolean don
apiClient.executeAsync(call, localVarReturnType, callback);
return call;
}
+ /**
+ * Build call for packagesUploadGeneric
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call packagesUploadGenericCall(String owner, String repo, GenericPackageUploadRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = data;
+
+ // create path and map variables
+ String localVarPath = "/packages/{owner}/{repo}/upload/generic/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "repo" + "\\}", apiClient.escapeString(repo.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "POST", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call packagesUploadGenericValidateBeforeCall(String owner, String repo, GenericPackageUploadRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, repo, data };
+ Method method = this.getClass().getMethod("packagesUploadGenericWithHttpInfo", String.class, String.class, GenericPackageUploadRequest.class);
+ Set<ConstraintViolation<PackagesApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = packagesUploadGenericCall(owner, repo, data, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Create a new Generic package
+ * Create a new Generic package
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @return GenericPackageUpload
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public GenericPackageUpload packagesUploadGeneric(String owner, String repo, GenericPackageUploadRequest data) throws ApiException {
+ ApiResponse<GenericPackageUpload> resp = packagesUploadGenericWithHttpInfo(owner, repo, data);
+ return resp.getData();
+ }
+
+ /**
+ * Create a new Generic package
+ * Create a new Generic package
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @return ApiResponse<GenericPackageUpload>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<GenericPackageUpload> packagesUploadGenericWithHttpInfo( @NotNull String owner, @NotNull String repo, GenericPackageUploadRequest data) throws ApiException {
+ com.squareup.okhttp.Call call = packagesUploadGenericValidateBeforeCall(owner, repo, data, null, null);
+ Type localVarReturnType = new TypeToken<GenericPackageUpload>(){}.getType();
+ return apiClient.execute(call, localVarReturnType);
+ }
+
+ /**
+ * Create a new Generic package (asynchronously)
+ * Create a new Generic package
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call packagesUploadGenericAsync(String owner, String repo, GenericPackageUploadRequest data, final ApiCallback<GenericPackageUpload> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = packagesUploadGenericValidateBeforeCall(owner, repo, data, progressListener, progressRequestListener);
+ Type localVarReturnType = new TypeToken<GenericPackageUpload>(){}.getType();
+ apiClient.executeAsync(call, localVarReturnType, callback);
+ return call;
+ }
/**
* Build call for packagesUploadGo
* @param owner (required)
@@ -7473,6 +7624,151 @@ public void onRequestProgress(long bytesWritten, long contentLength, boolean don
apiClient.executeAsync(call, callback);
return call;
}
+ /**
+ * Build call for packagesValidateUploadGeneric
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call packagesValidateUploadGenericCall(String owner, String repo, GenericPackageUploadRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = data;
+
+ // create path and map variables
+ String localVarPath = "/packages/{owner}/{repo}/validate-upload/generic/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "repo" + "\\}", apiClient.escapeString(repo.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "POST", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call packagesValidateUploadGenericValidateBeforeCall(String owner, String repo, GenericPackageUploadRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, repo, data };
+ Method method = this.getClass().getMethod("packagesValidateUploadGenericWithHttpInfo", String.class, String.class, GenericPackageUploadRequest.class);
+ Set<ConstraintViolation<PackagesApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = packagesValidateUploadGenericCall(owner, repo, data, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Validate parameters for create Generic package
+ * Validate parameters for create Generic package
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public void packagesValidateUploadGeneric(String owner, String repo, GenericPackageUploadRequest data) throws ApiException {
+ packagesValidateUploadGenericWithHttpInfo(owner, repo, data);
+ }
+
+ /**
+ * Validate parameters for create Generic package
+ * Validate parameters for create Generic package
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @return ApiResponse<Void>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<Void> packagesValidateUploadGenericWithHttpInfo( @NotNull String owner, @NotNull String repo, GenericPackageUploadRequest data) throws ApiException {
+ com.squareup.okhttp.Call call = packagesValidateUploadGenericValidateBeforeCall(owner, repo, data, null, null);
+ return apiClient.execute(call);
+ }
+
+ /**
+ * Validate parameters for create Generic package (asynchronously)
+ * Validate parameters for create Generic package
+ * @param owner (required)
+ * @param repo (required)
+ * @param data (optional)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call packagesValidateUploadGenericAsync(String owner, String repo, GenericPackageUploadRequest data, final ApiCallback<Void> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = packagesValidateUploadGenericValidateBeforeCall(owner, repo, data, progressListener, progressRequestListener);
+ apiClient.executeAsync(call, callback);
+ return call;
+ }
/**
* Build call for packagesValidateUploadGo
* @param owner (required)
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/apis/ReposApi.java b/bindings/java/src/src/main/java/io/cloudsmith/api/apis/ReposApi.java
index e0ee0daf..c3dddfaa 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/apis/ReposApi.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/apis/ReposApi.java
@@ -58,6 +58,9 @@
import io.cloudsmith.api.models.DockerUpstreamRequest;
import io.cloudsmith.api.models.DockerUpstreamRequestPatch;
import io.cloudsmith.api.models.ErrorDetail;
+import io.cloudsmith.api.models.GenericUpstream;
+import io.cloudsmith.api.models.GenericUpstreamRequest;
+import io.cloudsmith.api.models.GenericUpstreamRequestPatch;
import io.cloudsmith.api.models.GoUpstream;
import io.cloudsmith.api.models.GoUpstreamRequest;
import io.cloudsmith.api.models.GoUpstreamRequestPatch;
@@ -10478,6 +10481,916 @@ public void onRequestProgress(long bytesWritten, long contentLength, boolean don
apiClient.executeAsync(call, localVarReturnType, callback);
return call;
}
+ /**
+ * Build call for reposUpstreamGenericCreate
+ * @param owner (required)
+ * @param identifier (required)
+ * @param data (optional)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericCreateCall(String owner, String identifier, GenericUpstreamRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = data;
+
+ // create path and map variables
+ String localVarPath = "/repos/{owner}/{identifier}/upstream/generic/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "identifier" + "\\}", apiClient.escapeString(identifier.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "POST", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call reposUpstreamGenericCreateValidateBeforeCall(String owner, String identifier, GenericUpstreamRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, identifier, data };
+ Method method = this.getClass().getMethod("reposUpstreamGenericCreateWithHttpInfo", String.class, String.class, GenericUpstreamRequest.class);
+ Set<ConstraintViolation<ReposApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = reposUpstreamGenericCreateCall(owner, identifier, data, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Create a Generic upstream config for this repository.
+ * Create a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param data (optional)
+ * @return GenericUpstream
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public GenericUpstream reposUpstreamGenericCreate(String owner, String identifier, GenericUpstreamRequest data) throws ApiException {
+ ApiResponse<GenericUpstream> resp = reposUpstreamGenericCreateWithHttpInfo(owner, identifier, data);
+ return resp.getData();
+ }
+
+ /**
+ * Create a Generic upstream config for this repository.
+ * Create a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param data (optional)
+ * @return ApiResponse<GenericUpstream>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<GenericUpstream> reposUpstreamGenericCreateWithHttpInfo( @NotNull String owner, @NotNull String identifier, GenericUpstreamRequest data) throws ApiException {
+ com.squareup.okhttp.Call call = reposUpstreamGenericCreateValidateBeforeCall(owner, identifier, data, null, null);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ return apiClient.execute(call, localVarReturnType);
+ }
+
+ /**
+ * Create a Generic upstream config for this repository. (asynchronously)
+ * Create a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param data (optional)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericCreateAsync(String owner, String identifier, GenericUpstreamRequest data, final ApiCallback<GenericUpstream> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = reposUpstreamGenericCreateValidateBeforeCall(owner, identifier, data, progressListener, progressRequestListener);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ apiClient.executeAsync(call, localVarReturnType, callback);
+ return call;
+ }
+ /**
+ * Build call for reposUpstreamGenericDelete
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericDeleteCall(String owner, String identifier, String slugPerm, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = null;
+
+ // create path and map variables
+ String localVarPath = "/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "identifier" + "\\}", apiClient.escapeString(identifier.toString()))
+ .replaceAll("\\{" + "slug_perm" + "\\}", apiClient.escapeString(slugPerm.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "DELETE", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call reposUpstreamGenericDeleteValidateBeforeCall(String owner, String identifier, String slugPerm, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, identifier, slugPerm };
+ Method method = this.getClass().getMethod("reposUpstreamGenericDeleteWithHttpInfo", String.class, String.class, String.class);
+ Set<ConstraintViolation<ReposApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = reposUpstreamGenericDeleteCall(owner, identifier, slugPerm, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Delete a Generic upstream config for this repository.
+ * Delete a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public void reposUpstreamGenericDelete(String owner, String identifier, String slugPerm) throws ApiException {
+ reposUpstreamGenericDeleteWithHttpInfo(owner, identifier, slugPerm);
+ }
+
+ /**
+ * Delete a Generic upstream config for this repository.
+ * Delete a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @return ApiResponse<Void>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<Void> reposUpstreamGenericDeleteWithHttpInfo( @NotNull String owner, @NotNull String identifier, @NotNull String slugPerm) throws ApiException {
+ com.squareup.okhttp.Call call = reposUpstreamGenericDeleteValidateBeforeCall(owner, identifier, slugPerm, null, null);
+ return apiClient.execute(call);
+ }
+
+ /**
+ * Delete a Generic upstream config for this repository. (asynchronously)
+ * Delete a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericDeleteAsync(String owner, String identifier, String slugPerm, final ApiCallback<Void> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = reposUpstreamGenericDeleteValidateBeforeCall(owner, identifier, slugPerm, progressListener, progressRequestListener);
+ apiClient.executeAsync(call, callback);
+ return call;
+ }
+ /**
+ * Build call for reposUpstreamGenericList
+ * @param owner (required)
+ * @param identifier (required)
+ * @param page A page number within the paginated result set. (optional)
+ * @param pageSize Number of results to return per page. (optional)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericListCall(String owner, String identifier, java.math.BigInteger page, java.math.BigInteger pageSize, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = null;
+
+ // create path and map variables
+ String localVarPath = "/repos/{owner}/{identifier}/upstream/generic/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "identifier" + "\\}", apiClient.escapeString(identifier.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+ if (page != null)
+ localVarQueryParams.addAll(apiClient.parameterToPair("page", page));
+ if (pageSize != null)
+ localVarQueryParams.addAll(apiClient.parameterToPair("page_size", pageSize));
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "GET", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call reposUpstreamGenericListValidateBeforeCall(String owner, String identifier, java.math.BigInteger page, java.math.BigInteger pageSize, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, identifier, page, pageSize };
+ Method method = this.getClass().getMethod("reposUpstreamGenericListWithHttpInfo", String.class, String.class, java.math.BigInteger.class, java.math.BigInteger.class);
+ Set<ConstraintViolation<ReposApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = reposUpstreamGenericListCall(owner, identifier, page, pageSize, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * List Generic upstream configs for this repository.
+ * List Generic upstream configs for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param page A page number within the paginated result set. (optional)
+ * @param pageSize Number of results to return per page. (optional)
+ * @return List<GenericUpstream>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public List<GenericUpstream> reposUpstreamGenericList(String owner, String identifier, java.math.BigInteger page, java.math.BigInteger pageSize) throws ApiException {
+ ApiResponse<List<GenericUpstream>> resp = reposUpstreamGenericListWithHttpInfo(owner, identifier, page, pageSize);
+ return resp.getData();
+ }
+
+ /**
+ * List Generic upstream configs for this repository.
+ * List Generic upstream configs for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param page A page number within the paginated result set. (optional)
+ * @param pageSize Number of results to return per page. (optional)
+ * @return ApiResponse<List<GenericUpstream>>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<List<GenericUpstream>> reposUpstreamGenericListWithHttpInfo( @NotNull String owner, @NotNull String identifier, java.math.BigInteger page, java.math.BigInteger pageSize) throws ApiException {
+ com.squareup.okhttp.Call call = reposUpstreamGenericListValidateBeforeCall(owner, identifier, page, pageSize, null, null);
+ Type localVarReturnType = new TypeToken<List<GenericUpstream>>(){}.getType();
+ return apiClient.execute(call, localVarReturnType);
+ }
+
+ /**
+ * List Generic upstream configs for this repository. (asynchronously)
+ * List Generic upstream configs for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param page A page number within the paginated result set. (optional)
+ * @param pageSize Number of results to return per page. (optional)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericListAsync(String owner, String identifier, java.math.BigInteger page, java.math.BigInteger pageSize, final ApiCallback<List<GenericUpstream>> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = reposUpstreamGenericListValidateBeforeCall(owner, identifier, page, pageSize, progressListener, progressRequestListener);
+ Type localVarReturnType = new TypeToken<List<GenericUpstream>>(){}.getType();
+ apiClient.executeAsync(call, localVarReturnType, callback);
+ return call;
+ }
+ /**
+ * Build call for reposUpstreamGenericPartialUpdate
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericPartialUpdateCall(String owner, String identifier, String slugPerm, GenericUpstreamRequestPatch data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = data;
+
+ // create path and map variables
+ String localVarPath = "/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "identifier" + "\\}", apiClient.escapeString(identifier.toString()))
+ .replaceAll("\\{" + "slug_perm" + "\\}", apiClient.escapeString(slugPerm.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "PATCH", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call reposUpstreamGenericPartialUpdateValidateBeforeCall(String owner, String identifier, String slugPerm, GenericUpstreamRequestPatch data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, identifier, slugPerm, data };
+ Method method = this.getClass().getMethod("reposUpstreamGenericPartialUpdateWithHttpInfo", String.class, String.class, String.class, GenericUpstreamRequestPatch.class);
+ Set<ConstraintViolation<ReposApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = reposUpstreamGenericPartialUpdateCall(owner, identifier, slugPerm, data, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Partially update a Generic upstream config for this repository.
+ * Partially update a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @return GenericUpstream
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public GenericUpstream reposUpstreamGenericPartialUpdate(String owner, String identifier, String slugPerm, GenericUpstreamRequestPatch data) throws ApiException {
+ ApiResponse<GenericUpstream> resp = reposUpstreamGenericPartialUpdateWithHttpInfo(owner, identifier, slugPerm, data);
+ return resp.getData();
+ }
+
+ /**
+ * Partially update a Generic upstream config for this repository.
+ * Partially update a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @return ApiResponse<GenericUpstream>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<GenericUpstream> reposUpstreamGenericPartialUpdateWithHttpInfo( @NotNull String owner, @NotNull String identifier, @NotNull String slugPerm, GenericUpstreamRequestPatch data) throws ApiException {
+ com.squareup.okhttp.Call call = reposUpstreamGenericPartialUpdateValidateBeforeCall(owner, identifier, slugPerm, data, null, null);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ return apiClient.execute(call, localVarReturnType);
+ }
+
+ /**
+ * Partially update a Generic upstream config for this repository. (asynchronously)
+ * Partially update a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericPartialUpdateAsync(String owner, String identifier, String slugPerm, GenericUpstreamRequestPatch data, final ApiCallback<GenericUpstream> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = reposUpstreamGenericPartialUpdateValidateBeforeCall(owner, identifier, slugPerm, data, progressListener, progressRequestListener);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ apiClient.executeAsync(call, localVarReturnType, callback);
+ return call;
+ }
+ /**
+ * Build call for reposUpstreamGenericRead
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericReadCall(String owner, String identifier, String slugPerm, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = null;
+
+ // create path and map variables
+ String localVarPath = "/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "identifier" + "\\}", apiClient.escapeString(identifier.toString()))
+ .replaceAll("\\{" + "slug_perm" + "\\}", apiClient.escapeString(slugPerm.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "GET", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call reposUpstreamGenericReadValidateBeforeCall(String owner, String identifier, String slugPerm, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, identifier, slugPerm };
+ Method method = this.getClass().getMethod("reposUpstreamGenericReadWithHttpInfo", String.class, String.class, String.class);
+ Set<ConstraintViolation<ReposApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = reposUpstreamGenericReadCall(owner, identifier, slugPerm, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Retrieve a Generic upstream config for this repository.
+ * Retrieve a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @return GenericUpstream
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public GenericUpstream reposUpstreamGenericRead(String owner, String identifier, String slugPerm) throws ApiException {
+ ApiResponse<GenericUpstream> resp = reposUpstreamGenericReadWithHttpInfo(owner, identifier, slugPerm);
+ return resp.getData();
+ }
+
+ /**
+ * Retrieve a Generic upstream config for this repository.
+ * Retrieve a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @return ApiResponse<GenericUpstream>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<GenericUpstream> reposUpstreamGenericReadWithHttpInfo( @NotNull String owner, @NotNull String identifier, @NotNull String slugPerm) throws ApiException {
+ com.squareup.okhttp.Call call = reposUpstreamGenericReadValidateBeforeCall(owner, identifier, slugPerm, null, null);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ return apiClient.execute(call, localVarReturnType);
+ }
+
+ /**
+ * Retrieve a Generic upstream config for this repository. (asynchronously)
+ * Retrieve a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericReadAsync(String owner, String identifier, String slugPerm, final ApiCallback<GenericUpstream> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = reposUpstreamGenericReadValidateBeforeCall(owner, identifier, slugPerm, progressListener, progressRequestListener);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ apiClient.executeAsync(call, localVarReturnType, callback);
+ return call;
+ }
+ /**
+ * Build call for reposUpstreamGenericUpdate
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @param progressListener Progress listener
+ * @param progressRequestListener Progress request listener
+ * @return Call to execute
+ * @throws ApiException If fail to serialize the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericUpdateCall(String owner, String identifier, String slugPerm, GenericUpstreamRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ Object localVarPostBody = data;
+
+ // create path and map variables
+ String localVarPath = "/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/"
+ .replaceAll("\\{" + "owner" + "\\}", apiClient.escapeString(owner.toString()))
+ .replaceAll("\\{" + "identifier" + "\\}", apiClient.escapeString(identifier.toString()))
+ .replaceAll("\\{" + "slug_perm" + "\\}", apiClient.escapeString(slugPerm.toString()));
+
+ List<Pair> localVarQueryParams = new ArrayList<Pair>();
+ List<Pair> localVarCollectionQueryParams = new ArrayList<Pair>();
+
+ Map<String, String> localVarHeaderParams = new HashMap<String, String>();
+
+ Map<String, Object> localVarFormParams = new HashMap<String, Object>();
+
+ final String[] localVarAccepts = {
+ "application/json"
+ };
+ final String localVarAccept = apiClient.selectHeaderAccept(localVarAccepts);
+ if (localVarAccept != null) localVarHeaderParams.put("Accept", localVarAccept);
+
+ final String[] localVarContentTypes = {
+ "application/json"
+ };
+ final String localVarContentType = apiClient.selectHeaderContentType(localVarContentTypes);
+ localVarHeaderParams.put("Content-Type", localVarContentType);
+
+ if(progressListener != null) {
+ apiClient.getHttpClient().networkInterceptors().add(new com.squareup.okhttp.Interceptor() {
+ @Override
+ public com.squareup.okhttp.Response intercept(com.squareup.okhttp.Interceptor.Chain chain) throws IOException {
+ com.squareup.okhttp.Response originalResponse = chain.proceed(chain.request());
+ return originalResponse.newBuilder()
+ .body(new ProgressResponseBody(originalResponse.body(), progressListener))
+ .build();
+ }
+ });
+ }
+
+ String[] localVarAuthNames = new String[] { "apikey", "basic" };
+ if (headers != null) {
+ localVarHeaderParams.putAll(headers);
+ }
+ return apiClient.buildCall(localVarPath, "PUT", localVarQueryParams, localVarCollectionQueryParams, localVarPostBody, localVarHeaderParams, localVarFormParams, localVarAuthNames, progressRequestListener);
+ }
+
+ @SuppressWarnings("rawtypes")
+ private com.squareup.okhttp.Call reposUpstreamGenericUpdateValidateBeforeCall(String owner, String identifier, String slugPerm, GenericUpstreamRequest data, final ProgressResponseBody.ProgressListener progressListener, final ProgressRequestBody.ProgressRequestListener progressRequestListener) throws ApiException {
+ try {
+ ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
+ ExecutableValidator executableValidator = factory.getValidator().forExecutables();
+
+ Object[] parameterValues = { owner, identifier, slugPerm, data };
+ Method method = this.getClass().getMethod("reposUpstreamGenericUpdateWithHttpInfo", String.class, String.class, String.class, GenericUpstreamRequest.class);
+ Set<ConstraintViolation<ReposApi>> violations = executableValidator.validateParameters(this, method,
+ parameterValues);
+
+ if (violations.size() == 0) {
+ com.squareup.okhttp.Call call = reposUpstreamGenericUpdateCall(owner, identifier, slugPerm, data, progressListener, progressRequestListener);
+ return call;
+
+ } else {
+ throw new BeanValidationException((Set) violations);
+ }
+ } catch (NoSuchMethodException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ } catch (SecurityException e) {
+ e.printStackTrace();
+ throw new ApiException(e.getMessage());
+ }
+
+ }
+
+ /**
+ * Update a Generic upstream config for this repository.
+ * Update a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @return GenericUpstream
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public GenericUpstream reposUpstreamGenericUpdate(String owner, String identifier, String slugPerm, GenericUpstreamRequest data) throws ApiException {
+ ApiResponse<GenericUpstream> resp = reposUpstreamGenericUpdateWithHttpInfo(owner, identifier, slugPerm, data);
+ return resp.getData();
+ }
+
+ /**
+ * Update a Generic upstream config for this repository.
+ * Update a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @return ApiResponse<GenericUpstream>
+ * @throws ApiException If fail to call the API, e.g. server error or cannot deserialize the response body
+ */
+ public ApiResponse<GenericUpstream> reposUpstreamGenericUpdateWithHttpInfo( @NotNull String owner, @NotNull String identifier, @NotNull String slugPerm, GenericUpstreamRequest data) throws ApiException {
+ com.squareup.okhttp.Call call = reposUpstreamGenericUpdateValidateBeforeCall(owner, identifier, slugPerm, data, null, null);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ return apiClient.execute(call, localVarReturnType);
+ }
+
+ /**
+ * Update a Generic upstream config for this repository. (asynchronously)
+ * Update a Generic upstream config for this repository.
+ * @param owner (required)
+ * @param identifier (required)
+ * @param slugPerm (required)
+ * @param data (optional)
+ * @param callback The callback to be executed when the API call finishes
+ * @return The request call
+ * @throws ApiException If fail to process the API call, e.g. serializing the request body object
+ */
+ public com.squareup.okhttp.Call reposUpstreamGenericUpdateAsync(String owner, String identifier, String slugPerm, GenericUpstreamRequest data, final ApiCallback<GenericUpstream> callback) throws ApiException {
+
+ ProgressResponseBody.ProgressListener progressListener = null;
+ ProgressRequestBody.ProgressRequestListener progressRequestListener = null;
+
+ if (callback != null) {
+ progressListener = new ProgressResponseBody.ProgressListener() {
+ @Override
+ public void update(long bytesRead, long contentLength, boolean done) {
+ callback.onDownloadProgress(bytesRead, contentLength, done);
+ }
+ };
+
+ progressRequestListener = new ProgressRequestBody.ProgressRequestListener() {
+ @Override
+ public void onRequestProgress(long bytesWritten, long contentLength, boolean done) {
+ callback.onUploadProgress(bytesWritten, contentLength, done);
+ }
+ };
+ }
+
+ com.squareup.okhttp.Call call = reposUpstreamGenericUpdateValidateBeforeCall(owner, identifier, slugPerm, data, progressListener, progressRequestListener);
+ Type localVarReturnType = new TypeToken<GenericUpstream>(){}.getType();
+ apiClient.executeAsync(call, localVarReturnType, callback);
+ return call;
+ }
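For orientation, here is a minimal sketch of driving the new Generic upstream update endpoint from client code. It assumes an API key configured under the "apikey" auth name used above, that the generated API and auth classes live in the usual packages (`io.cloudsmith.api.apis`, `io.cloudsmith.api.auth`), and that `GenericUpstreamRequest` has fluent setters mirroring the `GenericUpstream` model fields; the owner, repo and slug_perm values are placeholders.

```java
import io.cloudsmith.api.ApiClient;
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.Configuration;
import io.cloudsmith.api.apis.ReposApi;            // assumed package for the generated APIs
import io.cloudsmith.api.auth.ApiKeyAuth;          // assumed package for the generated auth helpers
import io.cloudsmith.api.models.GenericUpstream;
import io.cloudsmith.api.models.GenericUpstreamRequest;

public class UpdateGenericUpstreamExample {
    public static void main(String[] args) {
        // Authenticate with the "apikey" scheme referenced in localVarAuthNames above.
        ApiClient client = Configuration.getDefaultApiClient();
        ApiKeyAuth apikey = (ApiKeyAuth) client.getAuthentication("apikey");
        apikey.setApiKey("YOUR-API-KEY");

        ReposApi api = new ReposApi(client);

        // Hypothetical payload; setter names are assumed to mirror the GenericUpstream fields.
        GenericUpstreamRequest data = new GenericUpstreamRequest()
            .name("my-generic-upstream")
            .upstreamUrl("https://example.com/generic/");

        try {
            // PUT /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/
            GenericUpstream result = api.reposUpstreamGenericUpdate("my-org", "my-repo", "AbCdEf123456", data);
            System.out.println("Updated upstream: " + result.getName());
        } catch (ApiException e) {
            System.err.println("Update failed: " + e.getResponseBody());
        }
    }
}
```

The asynchronous variant takes the same parameters plus an `ApiCallback<GenericUpstream>` and reports progress through the listeners wired up above.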
/**
* Build call for reposUpstreamGoCreate
* @param owner (required)
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/FormatSupport.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/FormatSupport.java
index 364b21ab..2f54139a 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/FormatSupport.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/FormatSupport.java
@@ -45,6 +45,9 @@ public class FormatSupport implements Serializable {
@SerializedName("file_lists")
private Boolean fileLists = null;
+ @SerializedName("filepaths")
+ private Boolean filepaths = null;
+
@SerializedName("metadata")
private Boolean metadata = null;
@@ -111,6 +114,25 @@ public void setFileLists(Boolean fileLists) {
this.fileLists = fileLists;
}
+ public FormatSupport filepaths(Boolean filepaths) {
+ this.filepaths = filepaths;
+ return this;
+ }
+
+ /**
+ * If true the package format supports filepaths
+ * @return filepaths
+ **/
+ @NotNull
+ @ApiModelProperty(required = true, value = "If true the package format supports filepaths")
+ public Boolean isFilepaths() {
+ return filepaths;
+ }
+
+ public void setFilepaths(Boolean filepaths) {
+ this.filepaths = filepaths;
+ }
+
public FormatSupport metadata(Boolean metadata) {
this.metadata = metadata;
return this;
@@ -182,6 +204,7 @@ public boolean equals(java.lang.Object o) {
return Objects.equals(this.dependencies, formatSupport.dependencies) &&
Objects.equals(this.distributions, formatSupport.distributions) &&
Objects.equals(this.fileLists, formatSupport.fileLists) &&
+ Objects.equals(this.filepaths, formatSupport.filepaths) &&
Objects.equals(this.metadata, formatSupport.metadata) &&
Objects.equals(this.upstreams, formatSupport.upstreams) &&
Objects.equals(this.versioning, formatSupport.versioning);
@@ -189,7 +212,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(dependencies, distributions, fileLists, metadata, upstreams, versioning);
+ return Objects.hash(dependencies, distributions, fileLists, filepaths, metadata, upstreams, versioning);
}
@@ -201,6 +224,7 @@ public String toString() {
sb.append(" dependencies: ").append(toIndentedString(dependencies)).append("\n");
sb.append(" distributions: ").append(toIndentedString(distributions)).append("\n");
sb.append(" fileLists: ").append(toIndentedString(fileLists)).append("\n");
+ sb.append(" filepaths: ").append(toIndentedString(filepaths)).append("\n");
sb.append(" metadata: ").append(toIndentedString(metadata)).append("\n");
sb.append(" upstreams: ").append(toIndentedString(upstreams)).append("\n");
sb.append(" versioning: ").append(toIndentedString(versioning)).append("\n");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericPackageUpload.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericPackageUpload.java
new file mode 100644
index 00000000..3949c18f
--- /dev/null
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericPackageUpload.java
@@ -0,0 +1,1374 @@
+/*
+ * Cloudsmith API (v1)
+ * The API to the Cloudsmith Service
+ *
+ * OpenAPI spec version: v1
+ * Contact: support@cloudsmith.io
+ *
+ * NOTE: This class is auto generated by the swagger code generator program.
+ * https://github.com/swagger-api/swagger-codegen.git
+ * Do not edit the class manually.
+ */
+
+
+package io.cloudsmith.api.models;
+
+import java.util.Objects;
+import java.util.Arrays;
+import com.google.gson.TypeAdapter;
+import com.google.gson.annotations.JsonAdapter;
+import com.google.gson.annotations.SerializedName;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonWriter;
+import io.cloudsmith.api.models.Architecture;
+import io.cloudsmith.api.models.Distribution;
+import io.cloudsmith.api.models.DistributionVersion;
+import io.cloudsmith.api.models.PackageFile;
+import io.cloudsmith.api.models.Tags;
+import io.swagger.annotations.ApiModel;
+import io.swagger.annotations.ApiModelProperty;
+import java.io.IOException;
+import java.time.OffsetDateTime;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.io.Serializable;
+import javax.validation.constraints.*;
+import javax.validation.Valid;
+
+/**
+ * GenericPackageUpload
+ */
+
+public class GenericPackageUpload implements Serializable {
+ private static final long serialVersionUID = 1L;
+
+ @SerializedName("architectures")
+ private List<Architecture> architectures = null;
+
+ @SerializedName("cdn_url")
+ private String cdnUrl = null;
+
+ @SerializedName("checksum_md5")
+ private String checksumMd5 = null;
+
+ @SerializedName("checksum_sha1")
+ private String checksumSha1 = null;
+
+ @SerializedName("checksum_sha256")
+ private String checksumSha256 = null;
+
+ @SerializedName("checksum_sha512")
+ private String checksumSha512 = null;
+
+ @SerializedName("dependencies_checksum_md5")
+ private String dependenciesChecksumMd5 = null;
+
+ @SerializedName("dependencies_url")
+ private String dependenciesUrl = null;
+
+ @SerializedName("description")
+ private String description = null;
+
+ @SerializedName("display_name")
+ private String displayName = null;
+
+ @SerializedName("distro")
+ private Distribution distro = null;
+
+ @SerializedName("distro_version")
+ private DistributionVersion distroVersion = null;
+
+ @SerializedName("downloads")
+ private java.math.BigInteger downloads = null;
+
+ @SerializedName("epoch")
+ private java.math.BigInteger epoch = null;
+
+ @SerializedName("extension")
+ private String extension = null;
+
+ @SerializedName("filename")
+ private String filename = null;
+
+ @SerializedName("files")
+ private List<PackageFile> files = null;
+
+ @SerializedName("format")
+ private String format = null;
+
+ @SerializedName("format_url")
+ private String formatUrl = null;
+
+ @SerializedName("freeable_storage")
+ private java.math.BigInteger freeableStorage = null;
+
+ @SerializedName("fully_qualified_name")
+ private String fullyQualifiedName = null;
+
+ @SerializedName("identifier_perm")
+ private String identifierPerm = null;
+
+ @SerializedName("identifiers")
+ private Map<String, String> identifiers = null;
+
+ @SerializedName("indexed")
+ private Boolean indexed = null;
+
+ @SerializedName("is_cancellable")
+ private Boolean isCancellable = null;
+
+ @SerializedName("is_copyable")
+ private Boolean isCopyable = null;
+
+ @SerializedName("is_deleteable")
+ private Boolean isDeleteable = null;
+
+ @SerializedName("is_downloadable")
+ private Boolean isDownloadable = null;
+
+ @SerializedName("is_moveable")
+ private Boolean isMoveable = null;
+
+ @SerializedName("is_quarantinable")
+ private Boolean isQuarantinable = null;
+
+ @SerializedName("is_quarantined")
+ private Boolean isQuarantined = null;
+
+ @SerializedName("is_resyncable")
+ private Boolean isResyncable = null;
+
+ @SerializedName("is_security_scannable")
+ private Boolean isSecurityScannable = null;
+
+ @SerializedName("is_sync_awaiting")
+ private Boolean isSyncAwaiting = null;
+
+ @SerializedName("is_sync_completed")
+ private Boolean isSyncCompleted = null;
+
+ @SerializedName("is_sync_failed")
+ private Boolean isSyncFailed = null;
+
+ @SerializedName("is_sync_in_flight")
+ private Boolean isSyncInFlight = null;
+
+ @SerializedName("is_sync_in_progress")
+ private Boolean isSyncInProgress = null;
+
+ @SerializedName("license")
+ private String license = null;
+
+ @SerializedName("name")
+ private String name = null;
+
+ @SerializedName("namespace")
+ private String namespace = null;
+
+ @SerializedName("namespace_url")
+ private String namespaceUrl = null;
+
+ @SerializedName("num_files")
+ private java.math.BigInteger numFiles = null;
+
+ @SerializedName("origin_repository")
+ private String originRepository = null;
+
+ @SerializedName("origin_repository_url")
+ private String originRepositoryUrl = null;
+
+ @SerializedName("package_type")
+ private java.math.BigInteger packageType = null;
+
+ @SerializedName("policy_violated")
+ private Boolean policyViolated = null;
+
+ @SerializedName("raw_license")
+ private String rawLicense = null;
+
+ @SerializedName("release")
+ private String release = null;
+
+ @SerializedName("repository")
+ private String repository = null;
+
+ @SerializedName("repository_url")
+ private String repositoryUrl = null;
+
+ @SerializedName("security_scan_completed_at")
+ private OffsetDateTime securityScanCompletedAt = null;
+
+ @SerializedName("security_scan_started_at")
+ private OffsetDateTime securityScanStartedAt = null;
+
+ /**
+ * Gets or Sets securityScanStatus
+ */
+ @JsonAdapter(SecurityScanStatusEnum.Adapter.class)
+ public enum SecurityScanStatusEnum {
+ AWAITING_SECURITY_SCAN("Awaiting Security Scan"),
+
+ SECURITY_SCANNING_IN_PROGRESS("Security Scanning in Progress"),
+
+ SCAN_DETECTED_VULNERABILITIES("Scan Detected Vulnerabilities"),
+
+ SCAN_DETECTED_NO_VULNERABILITIES("Scan Detected No Vulnerabilities"),
+
+ SECURITY_SCANNING_DISABLED("Security Scanning Disabled"),
+
+ SECURITY_SCANNING_FAILED("Security Scanning Failed"),
+
+ SECURITY_SCANNING_SKIPPED("Security Scanning Skipped"),
+
+ SECURITY_SCANNING_NOT_SUPPORTED("Security Scanning Not Supported");
+
+ private String value;
+
+ SecurityScanStatusEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static SecurityScanStatusEnum fromValue(String text) {
+ for (SecurityScanStatusEnum b : SecurityScanStatusEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<SecurityScanStatusEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final SecurityScanStatusEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public SecurityScanStatusEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return SecurityScanStatusEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("security_scan_status")
+ private SecurityScanStatusEnum securityScanStatus = SecurityScanStatusEnum.AWAITING_SECURITY_SCAN;
+
+ @SerializedName("security_scan_status_updated_at")
+ private OffsetDateTime securityScanStatusUpdatedAt = null;
+
+ @SerializedName("self_html_url")
+ private String selfHtmlUrl = null;
+
+ @SerializedName("self_url")
+ private String selfUrl = null;
+
+ @SerializedName("signature_url")
+ private String signatureUrl = null;
+
+ @SerializedName("size")
+ private java.math.BigInteger size = null;
+
+ @SerializedName("slug")
+ private String slug = null;
+
+ @SerializedName("slug_perm")
+ private String slugPerm = null;
+
+ @SerializedName("spdx_license")
+ private String spdxLicense = null;
+
+ @SerializedName("stage")
+ private java.math.BigInteger stage = null;
+
+ @SerializedName("stage_str")
+ private String stageStr = null;
+
+ @SerializedName("stage_updated_at")
+ private OffsetDateTime stageUpdatedAt = null;
+
+ @SerializedName("status")
+ private java.math.BigInteger status = null;
+
+ @SerializedName("status_reason")
+ private String statusReason = null;
+
+ @SerializedName("status_str")
+ private String statusStr = null;
+
+ @SerializedName("status_updated_at")
+ private OffsetDateTime statusUpdatedAt = null;
+
+ @SerializedName("status_url")
+ private String statusUrl = null;
+
+ @SerializedName("subtype")
+ private String subtype = null;
+
+ @SerializedName("summary")
+ private String summary = null;
+
+ @SerializedName("sync_finished_at")
+ private OffsetDateTime syncFinishedAt = null;
+
+ @SerializedName("sync_progress")
+ private java.math.BigInteger syncProgress = null;
+
+ @SerializedName("tags_automatic")
+ private Tags tagsAutomatic = null;
+
+ @SerializedName("tags_immutable")
+ private Tags tagsImmutable = null;
+
+ @SerializedName("type_display")
+ private String typeDisplay = null;
+
+ @SerializedName("uploaded_at")
+ private OffsetDateTime uploadedAt = null;
+
+ @SerializedName("uploader")
+ private String uploader = null;
+
+ @SerializedName("uploader_url")
+ private String uploaderUrl = null;
+
+ @SerializedName("version")
+ private String version = null;
+
+ @SerializedName("version_orig")
+ private String versionOrig = null;
+
+ @SerializedName("vulnerability_scan_results_url")
+ private String vulnerabilityScanResultsUrl = null;
+
+ /**
+ * Get architectures
+ * @return architectures
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public List<Architecture> getArchitectures() {
+ return architectures;
+ }
+
+ /**
+ * Get cdnUrl
+ * @return cdnUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getCdnUrl() {
+ return cdnUrl;
+ }
+
+ /**
+ * Get checksumMd5
+ * @return checksumMd5
+ **/
+ @ApiModelProperty(value = "")
+ public String getChecksumMd5() {
+ return checksumMd5;
+ }
+
+ /**
+ * Get checksumSha1
+ * @return checksumSha1
+ **/
+ @ApiModelProperty(value = "")
+ public String getChecksumSha1() {
+ return checksumSha1;
+ }
+
+ /**
+ * Get checksumSha256
+ * @return checksumSha256
+ **/
+ @ApiModelProperty(value = "")
+ public String getChecksumSha256() {
+ return checksumSha256;
+ }
+
+ /**
+ * Get checksumSha512
+ * @return checksumSha512
+ **/
+ @ApiModelProperty(value = "")
+ public String getChecksumSha512() {
+ return checksumSha512;
+ }
+
+ /**
+ * A checksum of all of the package's dependencies.
+ * @return dependenciesChecksumMd5
+ **/
+ @ApiModelProperty(value = "A checksum of all of the package's dependencies.")
+ public String getDependenciesChecksumMd5() {
+ return dependenciesChecksumMd5;
+ }
+
+ /**
+ * Get dependenciesUrl
+ * @return dependenciesUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getDependenciesUrl() {
+ return dependenciesUrl;
+ }
+
+ /**
+ * A textual description of this package.
+ * @return description
+ **/
+ @ApiModelProperty(value = "A textual description of this package.")
+ public String getDescription() {
+ return description;
+ }
+
+ /**
+ * Get displayName
+ * @return displayName
+ **/
+ @ApiModelProperty(value = "")
+ public String getDisplayName() {
+ return displayName;
+ }
+
+ public GenericPackageUpload distro(Distribution distro) {
+ this.distro = distro;
+ return this;
+ }
+
+ /**
+ * Get distro
+ * @return distro
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public Distribution getDistro() {
+ return distro;
+ }
+
+ public void setDistro(Distribution distro) {
+ this.distro = distro;
+ }
+
+ public GenericPackageUpload distroVersion(DistributionVersion distroVersion) {
+ this.distroVersion = distroVersion;
+ return this;
+ }
+
+ /**
+ * Get distroVersion
+ * @return distroVersion
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public DistributionVersion getDistroVersion() {
+ return distroVersion;
+ }
+
+ public void setDistroVersion(DistributionVersion distroVersion) {
+ this.distroVersion = distroVersion;
+ }
+
+ /**
+ * Get downloads
+ * @return downloads
+ **/
+ @ApiModelProperty(value = "")
+ public java.math.BigInteger getDownloads() {
+ return downloads;
+ }
+
+ /**
+ * The epoch of the package version (if any).
+ * @return epoch
+ **/
+ @ApiModelProperty(value = "The epoch of the package version (if any).")
+ public java.math.BigInteger getEpoch() {
+ return epoch;
+ }
+
+ /**
+ * Get extension
+ * @return extension
+ **/
+ @ApiModelProperty(value = "")
+ public String getExtension() {
+ return extension;
+ }
+
+ /**
+ * Get filename
+ * @return filename
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getFilename() {
+ return filename;
+ }
+
+ /**
+ * Get files
+ * @return files
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public List<PackageFile> getFiles() {
+ return files;
+ }
+
+ /**
+ * Get format
+ * @return format
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getFormat() {
+ return format;
+ }
+
+ /**
+ * Get formatUrl
+ * @return formatUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getFormatUrl() {
+ return formatUrl;
+ }
+
+ /**
+ * Amount of storage that will be freed if this package is deleted
+ * @return freeableStorage
+ **/
+ @ApiModelProperty(value = "Amount of storage that will be freed if this package is deleted")
+ public java.math.BigInteger getFreeableStorage() {
+ return freeableStorage;
+ }
+
+ /**
+ * Get fullyQualifiedName
+ * @return fullyQualifiedName
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getFullyQualifiedName() {
+ return fullyQualifiedName;
+ }
+
+ /**
+ * Unique and permanent identifier for the package.
+ * @return identifierPerm
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Unique and permanent identifier for the package.")
+ public String getIdentifierPerm() {
+ return identifierPerm;
+ }
+
+ /**
+ * Return a map of identifier field names and their values.
+ * @return identifiers
+ **/
+ @ApiModelProperty(value = "Return a map of identifier field names and their values.")
+ public Map<String, String> getIdentifiers() {
+ return identifiers;
+ }
+
+ /**
+ * Get indexed
+ * @return indexed
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIndexed() {
+ return indexed;
+ }
+
+ /**
+ * Get isCancellable
+ * @return isCancellable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsCancellable() {
+ return isCancellable;
+ }
+
+ /**
+ * Get isCopyable
+ * @return isCopyable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsCopyable() {
+ return isCopyable;
+ }
+
+ /**
+ * Get isDeleteable
+ * @return isDeleteable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsDeleteable() {
+ return isDeleteable;
+ }
+
+ /**
+ * Get isDownloadable
+ * @return isDownloadable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsDownloadable() {
+ return isDownloadable;
+ }
+
+ /**
+ * Get isMoveable
+ * @return isMoveable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsMoveable() {
+ return isMoveable;
+ }
+
+ /**
+ * Get isQuarantinable
+ * @return isQuarantinable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsQuarantinable() {
+ return isQuarantinable;
+ }
+
+ /**
+ * Get isQuarantined
+ * @return isQuarantined
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsQuarantined() {
+ return isQuarantined;
+ }
+
+ /**
+ * Get isResyncable
+ * @return isResyncable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsResyncable() {
+ return isResyncable;
+ }
+
+ /**
+ * Get isSecurityScannable
+ * @return isSecurityScannable
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsSecurityScannable() {
+ return isSecurityScannable;
+ }
+
+ /**
+ * Get isSyncAwaiting
+ * @return isSyncAwaiting
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsSyncAwaiting() {
+ return isSyncAwaiting;
+ }
+
+ /**
+ * Get isSyncCompleted
+ * @return isSyncCompleted
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsSyncCompleted() {
+ return isSyncCompleted;
+ }
+
+ /**
+ * Get isSyncFailed
+ * @return isSyncFailed
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsSyncFailed() {
+ return isSyncFailed;
+ }
+
+ /**
+ * Get isSyncInFlight
+ * @return isSyncInFlight
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsSyncInFlight() {
+ return isSyncInFlight;
+ }
+
+ /**
+ * Get isSyncInProgress
+ * @return isSyncInProgress
+ **/
+ @ApiModelProperty(value = "")
+ public Boolean isIsSyncInProgress() {
+ return isSyncInProgress;
+ }
+
+ /**
+ * The license of this package.
+ * @return license
+ **/
+ @ApiModelProperty(value = "The license of this package.")
+ public String getLicense() {
+ return license;
+ }
+
+ public GenericPackageUpload name(String name) {
+ this.name = name;
+ return this;
+ }
+
+ /**
+ * The name of this package.
+ * @return name
+ **/
+ @Size(max=200) @ApiModelProperty(value = "The name of this package.")
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ /**
+ * Get namespace
+ * @return namespace
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getNamespace() {
+ return namespace;
+ }
+
+ /**
+ * Get namespaceUrl
+ * @return namespaceUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getNamespaceUrl() {
+ return namespaceUrl;
+ }
+
+ /**
+ * Get numFiles
+ * @return numFiles
+ **/
+ @ApiModelProperty(value = "")
+ public java.math.BigInteger getNumFiles() {
+ return numFiles;
+ }
+
+ /**
+ * Get originRepository
+ * @return originRepository
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getOriginRepository() {
+ return originRepository;
+ }
+
+ /**
+ * Get originRepositoryUrl
+ * @return originRepositoryUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getOriginRepositoryUrl() {
+ return originRepositoryUrl;
+ }
+
+ /**
+ * The type of package contents.
+ * @return packageType
+ **/
+ @ApiModelProperty(value = "The type of package contents.")
+ public java.math.BigInteger getPackageType() {
+ return packageType;
+ }
+
+ /**
+ * Whether or not the package has violated any policy.
+ * @return policyViolated
+ **/
+ @ApiModelProperty(value = "Whether or not the package has violated any policy.")
+ public Boolean isPolicyViolated() {
+ return policyViolated;
+ }
+
+ /**
+ * The raw license string.
+ * @return rawLicense
+ **/
+ @Size(min=1) @ApiModelProperty(value = "The raw license string.")
+ public String getRawLicense() {
+ return rawLicense;
+ }
+
+ /**
+ * The release of the package version (if any).
+ * @return release
+ **/
+ @ApiModelProperty(value = "The release of the package version (if any).")
+ public String getRelease() {
+ return release;
+ }
+
+ /**
+ * Get repository
+ * @return repository
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getRepository() {
+ return repository;
+ }
+
+ /**
+ * Get repositoryUrl
+ * @return repositoryUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getRepositoryUrl() {
+ return repositoryUrl;
+ }
+
+ /**
+ * The datetime the security scanning was completed.
+ * @return securityScanCompletedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the security scanning was completed.")
+ public OffsetDateTime getSecurityScanCompletedAt() {
+ return securityScanCompletedAt;
+ }
+
+ /**
+ * The datetime the security scanning was started.
+ * @return securityScanStartedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the security scanning was started.")
+ public OffsetDateTime getSecurityScanStartedAt() {
+ return securityScanStartedAt;
+ }
+
+ /**
+ * Get securityScanStatus
+ * @return securityScanStatus
+ **/
+ @ApiModelProperty(value = "")
+ public SecurityScanStatusEnum getSecurityScanStatus() {
+ return securityScanStatus;
+ }
+
+ /**
+ * The datetime the security scanning status was updated.
+ * @return securityScanStatusUpdatedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the security scanning status was updated.")
+ public OffsetDateTime getSecurityScanStatusUpdatedAt() {
+ return securityScanStatusUpdatedAt;
+ }
+
+ /**
+ * Get selfHtmlUrl
+ * @return selfHtmlUrl
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getSelfHtmlUrl() {
+ return selfHtmlUrl;
+ }
+
+ /**
+ * Get selfUrl
+ * @return selfUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getSelfUrl() {
+ return selfUrl;
+ }
+
+ /**
+ * Get signatureUrl
+ * @return signatureUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getSignatureUrl() {
+ return signatureUrl;
+ }
+
+ /**
+ * The calculated size of the package.
+ * @return size
+ **/
+ @ApiModelProperty(value = "The calculated size of the package.")
+ public java.math.BigInteger getSize() {
+ return size;
+ }
+
+ /**
+ * The public unique identifier for the package.
+ * @return slug
+ **/
+ @Pattern(regexp="^[-a-zA-Z0-9_]+$") @Size(min=1) @ApiModelProperty(value = "The public unique identifier for the package.")
+ public String getSlug() {
+ return slug;
+ }
+
+ /**
+ * Get slugPerm
+ * @return slugPerm
+ **/
+ @Pattern(regexp="^[-a-zA-Z0-9_]+$") @Size(min=1) @ApiModelProperty(value = "")
+ public String getSlugPerm() {
+ return slugPerm;
+ }
+
+ /**
+ * The SPDX license identifier for this package.
+ * @return spdxLicense
+ **/
+ @Size(min=1) @ApiModelProperty(value = "The SPDX license identifier for this package.")
+ public String getSpdxLicense() {
+ return spdxLicense;
+ }
+
+ /**
+ * The synchronisation (in progress) stage of the package.
+ * @return stage
+ **/
+ @ApiModelProperty(value = "The synchronisation (in progress) stage of the package.")
+ public java.math.BigInteger getStage() {
+ return stage;
+ }
+
+ /**
+ * Get stageStr
+ * @return stageStr
+ **/
+ @ApiModelProperty(value = "")
+ public String getStageStr() {
+ return stageStr;
+ }
+
+ /**
+ * The datetime the package stage was updated at.
+ * @return stageUpdatedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the package stage was updated at.")
+ public OffsetDateTime getStageUpdatedAt() {
+ return stageUpdatedAt;
+ }
+
+ /**
+ * The synchronisation status of the package.
+ * @return status
+ **/
+ @ApiModelProperty(value = "The synchronisation status of the package.")
+ public java.math.BigInteger getStatus() {
+ return status;
+ }
+
+ /**
+ * A textual description for the synchronisation status reason (if any).
+ * @return statusReason
+ **/
+ @ApiModelProperty(value = "A textual description for the synchronisation status reason (if any).")
+ public String getStatusReason() {
+ return statusReason;
+ }
+
+ /**
+ * Get statusStr
+ * @return statusStr
+ **/
+ @ApiModelProperty(value = "")
+ public String getStatusStr() {
+ return statusStr;
+ }
+
+ /**
+ * The datetime the package status was updated at.
+ * @return statusUpdatedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the package status was updated at.")
+ public OffsetDateTime getStatusUpdatedAt() {
+ return statusUpdatedAt;
+ }
+
+ /**
+ * Get statusUrl
+ * @return statusUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getStatusUrl() {
+ return statusUrl;
+ }
+
+ /**
+ * Get subtype
+ * @return subtype
+ **/
+ @ApiModelProperty(value = "")
+ public String getSubtype() {
+ return subtype;
+ }
+
+ /**
+ * A one-liner synopsis of this package.
+ * @return summary
+ **/
+ @ApiModelProperty(value = "A one-liner synopsis of this package.")
+ public String getSummary() {
+ return summary;
+ }
+
+ /**
+ * The datetime the package sync was finished at.
+ * @return syncFinishedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the package sync was finished at.")
+ public OffsetDateTime getSyncFinishedAt() {
+ return syncFinishedAt;
+ }
+
+ /**
+ * Synchronisation progress (from 0-100)
+ * @return syncProgress
+ **/
+ @ApiModelProperty(value = "Synchronisation progress (from 0-100)")
+ public java.math.BigInteger getSyncProgress() {
+ return syncProgress;
+ }
+
+ public GenericPackageUpload tagsAutomatic(Tags tagsAutomatic) {
+ this.tagsAutomatic = tagsAutomatic;
+ return this;
+ }
+
+ /**
+ * Get tagsAutomatic
+ * @return tagsAutomatic
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public Tags getTagsAutomatic() {
+ return tagsAutomatic;
+ }
+
+ public void setTagsAutomatic(Tags tagsAutomatic) {
+ this.tagsAutomatic = tagsAutomatic;
+ }
+
+ public GenericPackageUpload tagsImmutable(Tags tagsImmutable) {
+ this.tagsImmutable = tagsImmutable;
+ return this;
+ }
+
+ /**
+ * Get tagsImmutable
+ * @return tagsImmutable
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public Tags getTagsImmutable() {
+ return tagsImmutable;
+ }
+
+ public void setTagsImmutable(Tags tagsImmutable) {
+ this.tagsImmutable = tagsImmutable;
+ }
+
+ /**
+ * Get typeDisplay
+ * @return typeDisplay
+ **/
+ @ApiModelProperty(value = "")
+ public String getTypeDisplay() {
+ return typeDisplay;
+ }
+
+ /**
+ * The date this package was uploaded.
+ * @return uploadedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The date this package was uploaded.")
+ public OffsetDateTime getUploadedAt() {
+ return uploadedAt;
+ }
+
+ /**
+ * Get uploader
+ * @return uploader
+ **/
+ @Size(min=1) @ApiModelProperty(value = "")
+ public String getUploader() {
+ return uploader;
+ }
+
+ /**
+ * Get uploaderUrl
+ * @return uploaderUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getUploaderUrl() {
+ return uploaderUrl;
+ }
+
+ public GenericPackageUpload version(String version) {
+ this.version = version;
+ return this;
+ }
+
+ /**
+ * The raw version for this package.
+ * @return version
+ **/
+ @Size(max=255) @ApiModelProperty(value = "The raw version for this package.")
+ public String getVersion() {
+ return version;
+ }
+
+ public void setVersion(String version) {
+ this.version = version;
+ }
+
+ /**
+ * Get versionOrig
+ * @return versionOrig
+ **/
+ @ApiModelProperty(value = "")
+ public String getVersionOrig() {
+ return versionOrig;
+ }
+
+ /**
+ * Get vulnerabilityScanResultsUrl
+ * @return vulnerabilityScanResultsUrl
+ **/
+ @ApiModelProperty(value = "")
+ public String getVulnerabilityScanResultsUrl() {
+ return vulnerabilityScanResultsUrl;
+ }
+
+
+ @Override
+ public boolean equals(java.lang.Object o) {
+ if (this == o) {
+ return true;
+ }
+ if (o == null || getClass() != o.getClass()) {
+ return false;
+ }
+ GenericPackageUpload genericPackageUpload = (GenericPackageUpload) o;
+ return Objects.equals(this.architectures, genericPackageUpload.architectures) &&
+ Objects.equals(this.cdnUrl, genericPackageUpload.cdnUrl) &&
+ Objects.equals(this.checksumMd5, genericPackageUpload.checksumMd5) &&
+ Objects.equals(this.checksumSha1, genericPackageUpload.checksumSha1) &&
+ Objects.equals(this.checksumSha256, genericPackageUpload.checksumSha256) &&
+ Objects.equals(this.checksumSha512, genericPackageUpload.checksumSha512) &&
+ Objects.equals(this.dependenciesChecksumMd5, genericPackageUpload.dependenciesChecksumMd5) &&
+ Objects.equals(this.dependenciesUrl, genericPackageUpload.dependenciesUrl) &&
+ Objects.equals(this.description, genericPackageUpload.description) &&
+ Objects.equals(this.displayName, genericPackageUpload.displayName) &&
+ Objects.equals(this.distro, genericPackageUpload.distro) &&
+ Objects.equals(this.distroVersion, genericPackageUpload.distroVersion) &&
+ Objects.equals(this.downloads, genericPackageUpload.downloads) &&
+ Objects.equals(this.epoch, genericPackageUpload.epoch) &&
+ Objects.equals(this.extension, genericPackageUpload.extension) &&
+ Objects.equals(this.filename, genericPackageUpload.filename) &&
+ Objects.equals(this.files, genericPackageUpload.files) &&
+ Objects.equals(this.format, genericPackageUpload.format) &&
+ Objects.equals(this.formatUrl, genericPackageUpload.formatUrl) &&
+ Objects.equals(this.freeableStorage, genericPackageUpload.freeableStorage) &&
+ Objects.equals(this.fullyQualifiedName, genericPackageUpload.fullyQualifiedName) &&
+ Objects.equals(this.identifierPerm, genericPackageUpload.identifierPerm) &&
+ Objects.equals(this.identifiers, genericPackageUpload.identifiers) &&
+ Objects.equals(this.indexed, genericPackageUpload.indexed) &&
+ Objects.equals(this.isCancellable, genericPackageUpload.isCancellable) &&
+ Objects.equals(this.isCopyable, genericPackageUpload.isCopyable) &&
+ Objects.equals(this.isDeleteable, genericPackageUpload.isDeleteable) &&
+ Objects.equals(this.isDownloadable, genericPackageUpload.isDownloadable) &&
+ Objects.equals(this.isMoveable, genericPackageUpload.isMoveable) &&
+ Objects.equals(this.isQuarantinable, genericPackageUpload.isQuarantinable) &&
+ Objects.equals(this.isQuarantined, genericPackageUpload.isQuarantined) &&
+ Objects.equals(this.isResyncable, genericPackageUpload.isResyncable) &&
+ Objects.equals(this.isSecurityScannable, genericPackageUpload.isSecurityScannable) &&
+ Objects.equals(this.isSyncAwaiting, genericPackageUpload.isSyncAwaiting) &&
+ Objects.equals(this.isSyncCompleted, genericPackageUpload.isSyncCompleted) &&
+ Objects.equals(this.isSyncFailed, genericPackageUpload.isSyncFailed) &&
+ Objects.equals(this.isSyncInFlight, genericPackageUpload.isSyncInFlight) &&
+ Objects.equals(this.isSyncInProgress, genericPackageUpload.isSyncInProgress) &&
+ Objects.equals(this.license, genericPackageUpload.license) &&
+ Objects.equals(this.name, genericPackageUpload.name) &&
+ Objects.equals(this.namespace, genericPackageUpload.namespace) &&
+ Objects.equals(this.namespaceUrl, genericPackageUpload.namespaceUrl) &&
+ Objects.equals(this.numFiles, genericPackageUpload.numFiles) &&
+ Objects.equals(this.originRepository, genericPackageUpload.originRepository) &&
+ Objects.equals(this.originRepositoryUrl, genericPackageUpload.originRepositoryUrl) &&
+ Objects.equals(this.packageType, genericPackageUpload.packageType) &&
+ Objects.equals(this.policyViolated, genericPackageUpload.policyViolated) &&
+ Objects.equals(this.rawLicense, genericPackageUpload.rawLicense) &&
+ Objects.equals(this.release, genericPackageUpload.release) &&
+ Objects.equals(this.repository, genericPackageUpload.repository) &&
+ Objects.equals(this.repositoryUrl, genericPackageUpload.repositoryUrl) &&
+ Objects.equals(this.securityScanCompletedAt, genericPackageUpload.securityScanCompletedAt) &&
+ Objects.equals(this.securityScanStartedAt, genericPackageUpload.securityScanStartedAt) &&
+ Objects.equals(this.securityScanStatus, genericPackageUpload.securityScanStatus) &&
+ Objects.equals(this.securityScanStatusUpdatedAt, genericPackageUpload.securityScanStatusUpdatedAt) &&
+ Objects.equals(this.selfHtmlUrl, genericPackageUpload.selfHtmlUrl) &&
+ Objects.equals(this.selfUrl, genericPackageUpload.selfUrl) &&
+ Objects.equals(this.signatureUrl, genericPackageUpload.signatureUrl) &&
+ Objects.equals(this.size, genericPackageUpload.size) &&
+ Objects.equals(this.slug, genericPackageUpload.slug) &&
+ Objects.equals(this.slugPerm, genericPackageUpload.slugPerm) &&
+ Objects.equals(this.spdxLicense, genericPackageUpload.spdxLicense) &&
+ Objects.equals(this.stage, genericPackageUpload.stage) &&
+ Objects.equals(this.stageStr, genericPackageUpload.stageStr) &&
+ Objects.equals(this.stageUpdatedAt, genericPackageUpload.stageUpdatedAt) &&
+ Objects.equals(this.status, genericPackageUpload.status) &&
+ Objects.equals(this.statusReason, genericPackageUpload.statusReason) &&
+ Objects.equals(this.statusStr, genericPackageUpload.statusStr) &&
+ Objects.equals(this.statusUpdatedAt, genericPackageUpload.statusUpdatedAt) &&
+ Objects.equals(this.statusUrl, genericPackageUpload.statusUrl) &&
+ Objects.equals(this.subtype, genericPackageUpload.subtype) &&
+ Objects.equals(this.summary, genericPackageUpload.summary) &&
+ Objects.equals(this.syncFinishedAt, genericPackageUpload.syncFinishedAt) &&
+ Objects.equals(this.syncProgress, genericPackageUpload.syncProgress) &&
+ Objects.equals(this.tagsAutomatic, genericPackageUpload.tagsAutomatic) &&
+ Objects.equals(this.tagsImmutable, genericPackageUpload.tagsImmutable) &&
+ Objects.equals(this.typeDisplay, genericPackageUpload.typeDisplay) &&
+ Objects.equals(this.uploadedAt, genericPackageUpload.uploadedAt) &&
+ Objects.equals(this.uploader, genericPackageUpload.uploader) &&
+ Objects.equals(this.uploaderUrl, genericPackageUpload.uploaderUrl) &&
+ Objects.equals(this.version, genericPackageUpload.version) &&
+ Objects.equals(this.versionOrig, genericPackageUpload.versionOrig) &&
+ Objects.equals(this.vulnerabilityScanResultsUrl, genericPackageUpload.vulnerabilityScanResultsUrl);
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ }
+
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder();
+ sb.append("class GenericPackageUpload {\n");
+
+ sb.append(" architectures: ").append(toIndentedString(architectures)).append("\n");
+ sb.append(" cdnUrl: ").append(toIndentedString(cdnUrl)).append("\n");
+ sb.append(" checksumMd5: ").append(toIndentedString(checksumMd5)).append("\n");
+ sb.append(" checksumSha1: ").append(toIndentedString(checksumSha1)).append("\n");
+ sb.append(" checksumSha256: ").append(toIndentedString(checksumSha256)).append("\n");
+ sb.append(" checksumSha512: ").append(toIndentedString(checksumSha512)).append("\n");
+ sb.append(" dependenciesChecksumMd5: ").append(toIndentedString(dependenciesChecksumMd5)).append("\n");
+ sb.append(" dependenciesUrl: ").append(toIndentedString(dependenciesUrl)).append("\n");
+ sb.append(" description: ").append(toIndentedString(description)).append("\n");
+ sb.append(" displayName: ").append(toIndentedString(displayName)).append("\n");
+ sb.append(" distro: ").append(toIndentedString(distro)).append("\n");
+ sb.append(" distroVersion: ").append(toIndentedString(distroVersion)).append("\n");
+ sb.append(" downloads: ").append(toIndentedString(downloads)).append("\n");
+ sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
+ sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
+ sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" files: ").append(toIndentedString(files)).append("\n");
+ sb.append(" format: ").append(toIndentedString(format)).append("\n");
+ sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
+ sb.append(" freeableStorage: ").append(toIndentedString(freeableStorage)).append("\n");
+ sb.append(" fullyQualifiedName: ").append(toIndentedString(fullyQualifiedName)).append("\n");
+ sb.append(" identifierPerm: ").append(toIndentedString(identifierPerm)).append("\n");
+ sb.append(" identifiers: ").append(toIndentedString(identifiers)).append("\n");
+ sb.append(" indexed: ").append(toIndentedString(indexed)).append("\n");
+ sb.append(" isCancellable: ").append(toIndentedString(isCancellable)).append("\n");
+ sb.append(" isCopyable: ").append(toIndentedString(isCopyable)).append("\n");
+ sb.append(" isDeleteable: ").append(toIndentedString(isDeleteable)).append("\n");
+ sb.append(" isDownloadable: ").append(toIndentedString(isDownloadable)).append("\n");
+ sb.append(" isMoveable: ").append(toIndentedString(isMoveable)).append("\n");
+ sb.append(" isQuarantinable: ").append(toIndentedString(isQuarantinable)).append("\n");
+ sb.append(" isQuarantined: ").append(toIndentedString(isQuarantined)).append("\n");
+ sb.append(" isResyncable: ").append(toIndentedString(isResyncable)).append("\n");
+ sb.append(" isSecurityScannable: ").append(toIndentedString(isSecurityScannable)).append("\n");
+ sb.append(" isSyncAwaiting: ").append(toIndentedString(isSyncAwaiting)).append("\n");
+ sb.append(" isSyncCompleted: ").append(toIndentedString(isSyncCompleted)).append("\n");
+ sb.append(" isSyncFailed: ").append(toIndentedString(isSyncFailed)).append("\n");
+ sb.append(" isSyncInFlight: ").append(toIndentedString(isSyncInFlight)).append("\n");
+ sb.append(" isSyncInProgress: ").append(toIndentedString(isSyncInProgress)).append("\n");
+ sb.append(" license: ").append(toIndentedString(license)).append("\n");
+ sb.append(" name: ").append(toIndentedString(name)).append("\n");
+ sb.append(" namespace: ").append(toIndentedString(namespace)).append("\n");
+ sb.append(" namespaceUrl: ").append(toIndentedString(namespaceUrl)).append("\n");
+ sb.append(" numFiles: ").append(toIndentedString(numFiles)).append("\n");
+ sb.append(" originRepository: ").append(toIndentedString(originRepository)).append("\n");
+ sb.append(" originRepositoryUrl: ").append(toIndentedString(originRepositoryUrl)).append("\n");
+ sb.append(" packageType: ").append(toIndentedString(packageType)).append("\n");
+ sb.append(" policyViolated: ").append(toIndentedString(policyViolated)).append("\n");
+ sb.append(" rawLicense: ").append(toIndentedString(rawLicense)).append("\n");
+ sb.append(" release: ").append(toIndentedString(release)).append("\n");
+ sb.append(" repository: ").append(toIndentedString(repository)).append("\n");
+ sb.append(" repositoryUrl: ").append(toIndentedString(repositoryUrl)).append("\n");
+ sb.append(" securityScanCompletedAt: ").append(toIndentedString(securityScanCompletedAt)).append("\n");
+ sb.append(" securityScanStartedAt: ").append(toIndentedString(securityScanStartedAt)).append("\n");
+ sb.append(" securityScanStatus: ").append(toIndentedString(securityScanStatus)).append("\n");
+ sb.append(" securityScanStatusUpdatedAt: ").append(toIndentedString(securityScanStatusUpdatedAt)).append("\n");
+ sb.append(" selfHtmlUrl: ").append(toIndentedString(selfHtmlUrl)).append("\n");
+ sb.append(" selfUrl: ").append(toIndentedString(selfUrl)).append("\n");
+ sb.append(" signatureUrl: ").append(toIndentedString(signatureUrl)).append("\n");
+ sb.append(" size: ").append(toIndentedString(size)).append("\n");
+ sb.append(" slug: ").append(toIndentedString(slug)).append("\n");
+ sb.append(" slugPerm: ").append(toIndentedString(slugPerm)).append("\n");
+ sb.append(" spdxLicense: ").append(toIndentedString(spdxLicense)).append("\n");
+ sb.append(" stage: ").append(toIndentedString(stage)).append("\n");
+ sb.append(" stageStr: ").append(toIndentedString(stageStr)).append("\n");
+ sb.append(" stageUpdatedAt: ").append(toIndentedString(stageUpdatedAt)).append("\n");
+ sb.append(" status: ").append(toIndentedString(status)).append("\n");
+ sb.append(" statusReason: ").append(toIndentedString(statusReason)).append("\n");
+ sb.append(" statusStr: ").append(toIndentedString(statusStr)).append("\n");
+ sb.append(" statusUpdatedAt: ").append(toIndentedString(statusUpdatedAt)).append("\n");
+ sb.append(" statusUrl: ").append(toIndentedString(statusUrl)).append("\n");
+ sb.append(" subtype: ").append(toIndentedString(subtype)).append("\n");
+ sb.append(" summary: ").append(toIndentedString(summary)).append("\n");
+ sb.append(" syncFinishedAt: ").append(toIndentedString(syncFinishedAt)).append("\n");
+ sb.append(" syncProgress: ").append(toIndentedString(syncProgress)).append("\n");
+ sb.append(" tagsAutomatic: ").append(toIndentedString(tagsAutomatic)).append("\n");
+ sb.append(" tagsImmutable: ").append(toIndentedString(tagsImmutable)).append("\n");
+ sb.append(" typeDisplay: ").append(toIndentedString(typeDisplay)).append("\n");
+ sb.append(" uploadedAt: ").append(toIndentedString(uploadedAt)).append("\n");
+ sb.append(" uploader: ").append(toIndentedString(uploader)).append("\n");
+ sb.append(" uploaderUrl: ").append(toIndentedString(uploaderUrl)).append("\n");
+ sb.append(" version: ").append(toIndentedString(version)).append("\n");
+ sb.append(" versionOrig: ").append(toIndentedString(versionOrig)).append("\n");
+ sb.append(" vulnerabilityScanResultsUrl: ").append(toIndentedString(vulnerabilityScanResultsUrl)).append("\n");
+ sb.append("}");
+ return sb.toString();
+ }
+
+ /**
+ * Convert the given object to string with each line indented by 4 spaces
+ * (except the first line).
+ */
+ private String toIndentedString(java.lang.Object o) {
+ if (o == null) {
+ return "null";
+ }
+ return o.toString().replace("\n", "\n ");
+ }
+
+}
+
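`GenericPackageUpload` is the response model returned when a Generic package is created, and it exposes the same checksum and synchronisation fields as the other package upload models. A small sketch of reading those fields from a returned instance, using only getters defined on the model above (the helper class itself is illustrative):

```java
import io.cloudsmith.api.models.GenericPackageUpload;

public class GenericPackageStatusExample {
    // Summarises the synchronisation state of an uploaded Generic package.
    static String describe(GenericPackageUpload pkg) {
        if (Boolean.TRUE.equals(pkg.isIsSyncCompleted())) {
            return pkg.getFilename() + " is synchronised (sha256 " + pkg.getChecksumSha256() + ")";
        }
        if (Boolean.TRUE.equals(pkg.isIsSyncFailed())) {
            return pkg.getFilename() + " failed to synchronise: " + pkg.getStatusReason();
        }
        return pkg.getFilename() + " is syncing (" + pkg.getSyncProgress() + "%)";
    }
}
```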
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericPackageUploadRequest.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericPackageUploadRequest.java
new file mode 100644
index 00000000..a9851f18
--- /dev/null
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericPackageUploadRequest.java
@@ -0,0 +1,216 @@
+/*
+ * Cloudsmith API (v1)
+ * The API to the Cloudsmith Service
+ *
+ * OpenAPI spec version: v1
+ * Contact: support@cloudsmith.io
+ *
+ * NOTE: This class is auto generated by the swagger code generator program.
+ * https://github.com/swagger-api/swagger-codegen.git
+ * Do not edit the class manually.
+ */
+
+
+package io.cloudsmith.api.models;
+
+import java.util.Objects;
+import java.util.Arrays;
+import com.google.gson.TypeAdapter;
+import com.google.gson.annotations.JsonAdapter;
+import com.google.gson.annotations.SerializedName;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonWriter;
+import io.swagger.annotations.ApiModel;
+import io.swagger.annotations.ApiModelProperty;
+import java.io.IOException;
+import java.io.Serializable;
+import javax.validation.constraints.*;
+import javax.validation.Valid;
+
+/**
+ * GenericPackageUploadRequest
+ */
+
+public class GenericPackageUploadRequest implements Serializable {
+ private static final long serialVersionUID = 1L;
+
+ @SerializedName("filepath")
+ private String filepath = null;
+
+ @SerializedName("name")
+ private String name = null;
+
+ @SerializedName("package_file")
+ private String packageFile = null;
+
+ @SerializedName("republish")
+ private Boolean republish = null;
+
+ @SerializedName("tags")
+ private String tags = null;
+
+ @SerializedName("version")
+ private String version = null;
+
+ public GenericPackageUploadRequest filepath(String filepath) {
+ this.filepath = filepath;
+ return this;
+ }
+
+ /**
+ * The full filepath of the package including filename.
+ * @return filepath
+ **/
+ @NotNull
+ @Size(min=1,max=2083) @ApiModelProperty(required = true, value = "The full filepath of the package including filename.")
+ public String getFilepath() {
+ return filepath;
+ }
+
+ public void setFilepath(String filepath) {
+ this.filepath = filepath;
+ }
+
+ public GenericPackageUploadRequest name(String name) {
+ this.name = name;
+ return this;
+ }
+
+ /**
+ * The name of this package.
+ * @return name
+ **/
+ @Size(max=200) @ApiModelProperty(value = "The name of this package.")
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public GenericPackageUploadRequest packageFile(String packageFile) {
+ this.packageFile = packageFile;
+ return this;
+ }
+
+ /**
+ * The primary file for the package.
+ * @return packageFile
+ **/
+ @NotNull
+ @Size(min=1) @ApiModelProperty(required = true, value = "The primary file for the package.")
+ public String getPackageFile() {
+ return packageFile;
+ }
+
+ public void setPackageFile(String packageFile) {
+ this.packageFile = packageFile;
+ }
+
+ public GenericPackageUploadRequest republish(Boolean republish) {
+ this.republish = republish;
+ return this;
+ }
+
+ /**
+ * If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate.
+ * @return republish
+ **/
+ @ApiModelProperty(value = "If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate.")
+ public Boolean isRepublish() {
+ return republish;
+ }
+
+ public void setRepublish(Boolean republish) {
+ this.republish = republish;
+ }
+
+ public GenericPackageUploadRequest tags(String tags) {
+ this.tags = tags;
+ return this;
+ }
+
+ /**
+ * A comma-separated values list of tags to add to the package.
+ * @return tags
+ **/
+ @Size(min=1,max=1024) @ApiModelProperty(value = "A comma-separated values list of tags to add to the package.")
+ public String getTags() {
+ return tags;
+ }
+
+ public void setTags(String tags) {
+ this.tags = tags;
+ }
+
+ public GenericPackageUploadRequest version(String version) {
+ this.version = version;
+ return this;
+ }
+
+ /**
+ * The raw version for this package.
+ * @return version
+ **/
+ @Size(max=255) @ApiModelProperty(value = "The raw version for this package.")
+ public String getVersion() {
+ return version;
+ }
+
+ public void setVersion(String version) {
+ this.version = version;
+ }
+
+
+ @Override
+ public boolean equals(java.lang.Object o) {
+ if (this == o) {
+ return true;
+ }
+ if (o == null || getClass() != o.getClass()) {
+ return false;
+ }
+ GenericPackageUploadRequest genericPackageUploadRequest = (GenericPackageUploadRequest) o;
+ return Objects.equals(this.filepath, genericPackageUploadRequest.filepath) &&
+ Objects.equals(this.name, genericPackageUploadRequest.name) &&
+ Objects.equals(this.packageFile, genericPackageUploadRequest.packageFile) &&
+ Objects.equals(this.republish, genericPackageUploadRequest.republish) &&
+ Objects.equals(this.tags, genericPackageUploadRequest.tags) &&
+ Objects.equals(this.version, genericPackageUploadRequest.version);
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(filepath, name, packageFile, republish, tags, version);
+ }
+
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder();
+ sb.append("class GenericPackageUploadRequest {\n");
+
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
+ sb.append(" name: ").append(toIndentedString(name)).append("\n");
+ sb.append(" packageFile: ").append(toIndentedString(packageFile)).append("\n");
+ sb.append(" republish: ").append(toIndentedString(republish)).append("\n");
+ sb.append(" tags: ").append(toIndentedString(tags)).append("\n");
+ sb.append(" version: ").append(toIndentedString(version)).append("\n");
+ sb.append("}");
+ return sb.toString();
+ }
+
+ /**
+ * Convert the given object to string with each line indented by 4 spaces
+ * (except the first line).
+ */
+ private String toIndentedString(java.lang.Object o) {
+ if (o == null) {
+ return "null";
+ }
+ return o.toString().replace("\n", "\n ");
+ }
+
+}
+
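`GenericPackageUploadRequest` carries the parameters for the new `packagesUploadGeneric` endpoint listed in the README, with `filepath` and `package_file` as the required fields. A usage sketch, assuming the generated method signature follows the other upload methods (owner, repo, request) and that the package file has already been uploaded separately so its identifier can be passed as `packageFile`; all values below are placeholders.

```java
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.apis.PackagesApi;         // assumed package for the generated APIs
import io.cloudsmith.api.models.GenericPackageUpload;
import io.cloudsmith.api.models.GenericPackageUploadRequest;

public class UploadGenericPackageExample {
    public static void main(String[] args) {
        PackagesApi api = new PackagesApi(); // uses the default, pre-configured ApiClient

        // package_file is the identifier returned by the prior file-upload step (placeholder here).
        GenericPackageUploadRequest request = new GenericPackageUploadRequest()
            .packageFile("FILE-UPLOAD-IDENTIFIER")
            .filepath("artifacts/app/1.2.3/app.bin")
            .name("app")
            .version("1.2.3")
            .republish(false);

        try {
            // POST /packages/{owner}/{repo}/upload/generic/ (signature assumed from the README entry)
            GenericPackageUpload pkg = api.packagesUploadGeneric("my-org", "my-repo", request);
            System.out.println("Created package: " + pkg.getSlugPerm());
        } catch (ApiException e) {
            System.err.println("Upload failed: " + e.getResponseBody());
        }
    }
}
```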
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstream.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstream.java
new file mode 100644
index 00000000..df37c562
--- /dev/null
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstream.java
@@ -0,0 +1,720 @@
+/*
+ * Cloudsmith API (v1)
+ * The API to the Cloudsmith Service
+ *
+ * OpenAPI spec version: v1
+ * Contact: support@cloudsmith.io
+ *
+ * NOTE: This class is auto generated by the swagger code generator program.
+ * https://github.com/swagger-api/swagger-codegen.git
+ * Do not edit the class manually.
+ */
+
+
+package io.cloudsmith.api.models;
+
+import java.util.Objects;
+import java.util.Arrays;
+import com.google.gson.TypeAdapter;
+import com.google.gson.annotations.JsonAdapter;
+import com.google.gson.annotations.SerializedName;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonWriter;
+import io.swagger.annotations.ApiModel;
+import io.swagger.annotations.ApiModelProperty;
+import java.io.IOException;
+import java.time.OffsetDateTime;
+import java.io.Serializable;
+import javax.validation.constraints.*;
+import javax.validation.Valid;
+
+/**
+ * GenericUpstream
+ */
+
+public class GenericUpstream implements Serializable {
+ private static final long serialVersionUID = 1L;
+
+ /**
+ * The authentication mode to use when accessing this upstream.
+ */
+ @JsonAdapter(AuthModeEnum.Adapter.class)
+ public enum AuthModeEnum {
+ NONE("None"),
+
+ USERNAME_AND_PASSWORD("Username and Password"),
+
+ TOKEN("Token");
+
+ private String value;
+
+ AuthModeEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static AuthModeEnum fromValue(String text) {
+ for (AuthModeEnum b : AuthModeEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<AuthModeEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final AuthModeEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public AuthModeEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return AuthModeEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("auth_mode")
+ private AuthModeEnum authMode = AuthModeEnum.NONE;
+
+ @SerializedName("auth_secret")
+ private String authSecret = null;
+
+ @SerializedName("auth_username")
+ private String authUsername = null;
+
+ @SerializedName("available")
+ private String available = null;
+
+ @SerializedName("can_reindex")
+ private String canReindex = null;
+
+ @SerializedName("created_at")
+ private OffsetDateTime createdAt = null;
+
+ /**
+ * Gets or Sets disableReason
+ */
+ @JsonAdapter(DisableReasonEnum.Adapter.class)
+ public enum DisableReasonEnum {
+ N_A("N/A"),
+
+ UPSTREAM_POINTS_TO_ITS_OWN_REPOSITORY("Upstream points to its own repository"),
+
+ MISSING_UPSTREAM_SOURCE("Missing upstream source"),
+
+ UPSTREAM_WAS_DISABLED_BY_REQUEST_OF_USER("Upstream was disabled by request of user");
+
+ private String value;
+
+ DisableReasonEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static DisableReasonEnum fromValue(String text) {
+ for (DisableReasonEnum b : DisableReasonEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<DisableReasonEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final DisableReasonEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public DisableReasonEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return DisableReasonEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("disable_reason")
+ private DisableReasonEnum disableReason = DisableReasonEnum.N_A;
+
+ @SerializedName("disable_reason_text")
+ private String disableReasonText = null;
+
+ @SerializedName("extra_header_1")
+ private String extraHeader1 = null;
+
+ @SerializedName("extra_header_2")
+ private String extraHeader2 = null;
+
+ @SerializedName("extra_value_1")
+ private String extraValue1 = null;
+
+ @SerializedName("extra_value_2")
+ private String extraValue2 = null;
+
+ @SerializedName("has_failed_signature_verification")
+ private String hasFailedSignatureVerification = null;
+
+ @SerializedName("index_package_count")
+ private String indexPackageCount = null;
+
+ @SerializedName("index_status")
+ private String indexStatus = null;
+
+ @SerializedName("is_active")
+ private Boolean isActive = null;
+
+ @SerializedName("last_indexed")
+ private String lastIndexed = null;
+
+ /**
+ * The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ */
+ @JsonAdapter(ModeEnum.Adapter.class)
+ public enum ModeEnum {
+ PROXY_ONLY("Proxy Only"),
+
+ CACHE_AND_PROXY("Cache and Proxy");
+
+ private String value;
+
+ ModeEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static ModeEnum fromValue(String text) {
+ for (ModeEnum b : ModeEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<ModeEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final ModeEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public ModeEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return ModeEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("mode")
+ private ModeEnum mode = ModeEnum.PROXY_ONLY;
+
+ @SerializedName("name")
+ private String name = null;
+
+ @SerializedName("pending_validation")
+ private Boolean pendingValidation = null;
+
+ @SerializedName("priority")
+ private java.math.BigInteger priority = null;
+
+ @SerializedName("slug_perm")
+ private String slugPerm = null;
+
+ @SerializedName("updated_at")
+ private OffsetDateTime updatedAt = null;
+
+ @SerializedName("upstream_prefix")
+ private String upstreamPrefix = null;
+
+ @SerializedName("upstream_url")
+ private String upstreamUrl = null;
+
+ @SerializedName("verify_ssl")
+ private Boolean verifySsl = null;
+
+ public GenericUpstream authMode(AuthModeEnum authMode) {
+ this.authMode = authMode;
+ return this;
+ }
+
+ /**
+ * The authentication mode to use when accessing this upstream.
+ * @return authMode
+ **/
+ @ApiModelProperty(value = "The authentication mode to use when accessing this upstream. ")
+ public AuthModeEnum getAuthMode() {
+ return authMode;
+ }
+
+ public void setAuthMode(AuthModeEnum authMode) {
+ this.authMode = authMode;
+ }
+
+ public GenericUpstream authSecret(String authSecret) {
+ this.authSecret = authSecret;
+ return this;
+ }
+
+ /**
+ * Secret to provide with requests to upstream.
+ * @return authSecret
+ **/
+ @Size(max=4096) @ApiModelProperty(value = "Secret to provide with requests to upstream.")
+ public String getAuthSecret() {
+ return authSecret;
+ }
+
+ public void setAuthSecret(String authSecret) {
+ this.authSecret = authSecret;
+ }
+
+ public GenericUpstream authUsername(String authUsername) {
+ this.authUsername = authUsername;
+ return this;
+ }
+
+ /**
+ * Username to provide with requests to upstream.
+ * @return authUsername
+ **/
+ @Size(max=64) @ApiModelProperty(value = "Username to provide with requests to upstream.")
+ public String getAuthUsername() {
+ return authUsername;
+ }
+
+ public void setAuthUsername(String authUsername) {
+ this.authUsername = authUsername;
+ }
+
+ /**
+ * Get available
+ * @return available
+ **/
+ @ApiModelProperty(value = "")
+ public String getAvailable() {
+ return available;
+ }
+
+ /**
+ * Get canReindex
+ * @return canReindex
+ **/
+ @ApiModelProperty(value = "")
+ public String getCanReindex() {
+ return canReindex;
+ }
+
+ /**
+ * The datetime the upstream source was created.
+ * @return createdAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "The datetime the upstream source was created.")
+ public OffsetDateTime getCreatedAt() {
+ return createdAt;
+ }
+
+ /**
+ * Get disableReason
+ * @return disableReason
+ **/
+ @ApiModelProperty(value = "")
+ public DisableReasonEnum getDisableReason() {
+ return disableReason;
+ }
+
+ /**
+ * Human-readable explanation of why this upstream is disabled
+ * @return disableReasonText
+ **/
+ @ApiModelProperty(value = "Human-readable explanation of why this upstream is disabled")
+ public String getDisableReasonText() {
+ return disableReasonText;
+ }
+
+ public GenericUpstream extraHeader1(String extraHeader1) {
+ this.extraHeader1 = extraHeader1;
+ return this;
+ }
+
+ /**
+ * The key for extra header #1 to send to upstream.
+ * @return extraHeader1
+ **/
+ @Pattern(regexp="^[-\\w]+$") @Size(max=64) @ApiModelProperty(value = "The key for extra header #1 to send to upstream.")
+ public String getExtraHeader1() {
+ return extraHeader1;
+ }
+
+ public void setExtraHeader1(String extraHeader1) {
+ this.extraHeader1 = extraHeader1;
+ }
+
+ public GenericUpstream extraHeader2(String extraHeader2) {
+ this.extraHeader2 = extraHeader2;
+ return this;
+ }
+
+ /**
+ * The key for extra header #2 to send to upstream.
+ * @return extraHeader2
+ **/
+ @Pattern(regexp="^[-\\w]+$") @Size(max=64) @ApiModelProperty(value = "The key for extra header #2 to send to upstream.")
+ public String getExtraHeader2() {
+ return extraHeader2;
+ }
+
+ public void setExtraHeader2(String extraHeader2) {
+ this.extraHeader2 = extraHeader2;
+ }
+
+ public GenericUpstream extraValue1(String extraValue1) {
+ this.extraValue1 = extraValue1;
+ return this;
+ }
+
+ /**
+ * The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ * @return extraValue1
+ **/
+ @Pattern(regexp="^[^\\n\\r]+$") @Size(max=128) @ApiModelProperty(value = "The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.")
+ public String getExtraValue1() {
+ return extraValue1;
+ }
+
+ public void setExtraValue1(String extraValue1) {
+ this.extraValue1 = extraValue1;
+ }
+
+ public GenericUpstream extraValue2(String extraValue2) {
+ this.extraValue2 = extraValue2;
+ return this;
+ }
+
+ /**
+ * The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ * @return extraValue2
+ **/
+ @Pattern(regexp="^[^\\n\\r]+$") @Size(max=128) @ApiModelProperty(value = "The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.")
+ public String getExtraValue2() {
+ return extraValue2;
+ }
+
+ public void setExtraValue2(String extraValue2) {
+ this.extraValue2 = extraValue2;
+ }
+
+ /**
+ * Get hasFailedSignatureVerification
+ * @return hasFailedSignatureVerification
+ **/
+ @ApiModelProperty(value = "")
+ public String getHasFailedSignatureVerification() {
+ return hasFailedSignatureVerification;
+ }
+
+ /**
+ * The number of packages available in this upstream source
+ * @return indexPackageCount
+ **/
+ @ApiModelProperty(value = "The number of packages available in this upstream source")
+ public String getIndexPackageCount() {
+ return indexPackageCount;
+ }
+
+ /**
+ * The current indexing status of this upstream source
+ * @return indexStatus
+ **/
+ @ApiModelProperty(value = "The current indexing status of this upstream source")
+ public String getIndexStatus() {
+ return indexStatus;
+ }
+
+ public GenericUpstream isActive(Boolean isActive) {
+ this.isActive = isActive;
+ return this;
+ }
+
+ /**
+ * Whether or not this upstream is active and ready for requests.
+ * @return isActive
+ **/
+ @ApiModelProperty(value = "Whether or not this upstream is active and ready for requests.")
+ public Boolean isIsActive() {
+ return isActive;
+ }
+
+ public void setIsActive(Boolean isActive) {
+ this.isActive = isActive;
+ }
+
+ /**
+ * The last time this upstream source was indexed
+ * @return lastIndexed
+ **/
+ @ApiModelProperty(value = "The last time this upstream source was indexed")
+ public String getLastIndexed() {
+ return lastIndexed;
+ }
+
+ public GenericUpstream mode(ModeEnum mode) {
+ this.mode = mode;
+ return this;
+ }
+
+ /**
+ * The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ * @return mode
+ **/
+ @ApiModelProperty(value = "The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.")
+ public ModeEnum getMode() {
+ return mode;
+ }
+
+ public void setMode(ModeEnum mode) {
+ this.mode = mode;
+ }
+
+ public GenericUpstream name(String name) {
+ this.name = name;
+ return this;
+ }
+
+ /**
+ * A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+ * @return name
+ **/
+ @NotNull
+ @Pattern(regexp="^\\w[\\w \\-'\\./()]+$") @Size(min=1,max=64) @ApiModelProperty(required = true, value = "A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.")
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ /**
+ * When true, this upstream source is pending validation.
+ * @return pendingValidation
+ **/
+ @ApiModelProperty(value = "When true, this upstream source is pending validation.")
+ public Boolean isPendingValidation() {
+ return pendingValidation;
+ }
+
+ public GenericUpstream priority(java.math.BigInteger priority) {
+ this.priority = priority;
+ return this;
+ }
+
+ /**
+ * Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+ * minimum: 1
+ * maximum: 32767
+ * @return priority
+ **/
+ @Min(1L) @Max(32767L) @ApiModelProperty(value = "Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.")
+ public java.math.BigInteger getPriority() {
+ return priority;
+ }
+
+ public void setPriority(java.math.BigInteger priority) {
+ this.priority = priority;
+ }
+
+ /**
+ * Get slugPerm
+ * @return slugPerm
+ **/
+ @Pattern(regexp="^[-a-zA-Z0-9_]+$") @Size(min=1) @ApiModelProperty(value = "")
+ public String getSlugPerm() {
+ return slugPerm;
+ }
+
+ /**
+ * Get updatedAt
+ * @return updatedAt
+ **/
+ @Valid
+ @ApiModelProperty(value = "")
+ public OffsetDateTime getUpdatedAt() {
+ return updatedAt;
+ }
+
+ public GenericUpstream upstreamPrefix(String upstreamPrefix) {
+ this.upstreamPrefix = upstreamPrefix;
+ return this;
+ }
+
+ /**
+ * A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+ * @return upstreamPrefix
+ **/
+ @Size(min=1,max=64) @ApiModelProperty(value = "A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.")
+ public String getUpstreamPrefix() {
+ return upstreamPrefix;
+ }
+
+ public void setUpstreamPrefix(String upstreamPrefix) {
+ this.upstreamPrefix = upstreamPrefix;
+ }
+
+ public GenericUpstream upstreamUrl(String upstreamUrl) {
+ this.upstreamUrl = upstreamUrl;
+ return this;
+ }
+
+ /**
+ * The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+ * @return upstreamUrl
+ **/
+ @NotNull
+ @Size(min=1,max=200) @ApiModelProperty(required = true, value = "The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. ")
+ public String getUpstreamUrl() {
+ return upstreamUrl;
+ }
+
+ public void setUpstreamUrl(String upstreamUrl) {
+ this.upstreamUrl = upstreamUrl;
+ }
+
+ public GenericUpstream verifySsl(Boolean verifySsl) {
+ this.verifySsl = verifySsl;
+ return this;
+ }
+
+ /**
+ * If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+ * @return verifySsl
+ **/
+ @ApiModelProperty(value = "If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.")
+ public Boolean isVerifySsl() {
+ return verifySsl;
+ }
+
+ public void setVerifySsl(Boolean verifySsl) {
+ this.verifySsl = verifySsl;
+ }
+
+
+ @Override
+ public boolean equals(java.lang.Object o) {
+ if (this == o) {
+ return true;
+ }
+ if (o == null || getClass() != o.getClass()) {
+ return false;
+ }
+ GenericUpstream genericUpstream = (GenericUpstream) o;
+ return Objects.equals(this.authMode, genericUpstream.authMode) &&
+ Objects.equals(this.authSecret, genericUpstream.authSecret) &&
+ Objects.equals(this.authUsername, genericUpstream.authUsername) &&
+ Objects.equals(this.available, genericUpstream.available) &&
+ Objects.equals(this.canReindex, genericUpstream.canReindex) &&
+ Objects.equals(this.createdAt, genericUpstream.createdAt) &&
+ Objects.equals(this.disableReason, genericUpstream.disableReason) &&
+ Objects.equals(this.disableReasonText, genericUpstream.disableReasonText) &&
+ Objects.equals(this.extraHeader1, genericUpstream.extraHeader1) &&
+ Objects.equals(this.extraHeader2, genericUpstream.extraHeader2) &&
+ Objects.equals(this.extraValue1, genericUpstream.extraValue1) &&
+ Objects.equals(this.extraValue2, genericUpstream.extraValue2) &&
+ Objects.equals(this.hasFailedSignatureVerification, genericUpstream.hasFailedSignatureVerification) &&
+ Objects.equals(this.indexPackageCount, genericUpstream.indexPackageCount) &&
+ Objects.equals(this.indexStatus, genericUpstream.indexStatus) &&
+ Objects.equals(this.isActive, genericUpstream.isActive) &&
+ Objects.equals(this.lastIndexed, genericUpstream.lastIndexed) &&
+ Objects.equals(this.mode, genericUpstream.mode) &&
+ Objects.equals(this.name, genericUpstream.name) &&
+ Objects.equals(this.pendingValidation, genericUpstream.pendingValidation) &&
+ Objects.equals(this.priority, genericUpstream.priority) &&
+ Objects.equals(this.slugPerm, genericUpstream.slugPerm) &&
+ Objects.equals(this.updatedAt, genericUpstream.updatedAt) &&
+ Objects.equals(this.upstreamPrefix, genericUpstream.upstreamPrefix) &&
+ Objects.equals(this.upstreamUrl, genericUpstream.upstreamUrl) &&
+ Objects.equals(this.verifySsl, genericUpstream.verifySsl);
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(authMode, authSecret, authUsername, available, canReindex, createdAt, disableReason, disableReasonText, extraHeader1, extraHeader2, extraValue1, extraValue2, hasFailedSignatureVerification, indexPackageCount, indexStatus, isActive, lastIndexed, mode, name, pendingValidation, priority, slugPerm, updatedAt, upstreamPrefix, upstreamUrl, verifySsl);
+ }
+
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder();
+ sb.append("class GenericUpstream {\n");
+
+ sb.append(" authMode: ").append(toIndentedString(authMode)).append("\n");
+ sb.append(" authSecret: ").append(toIndentedString(authSecret)).append("\n");
+ sb.append(" authUsername: ").append(toIndentedString(authUsername)).append("\n");
+ sb.append(" available: ").append(toIndentedString(available)).append("\n");
+ sb.append(" canReindex: ").append(toIndentedString(canReindex)).append("\n");
+ sb.append(" createdAt: ").append(toIndentedString(createdAt)).append("\n");
+ sb.append(" disableReason: ").append(toIndentedString(disableReason)).append("\n");
+ sb.append(" disableReasonText: ").append(toIndentedString(disableReasonText)).append("\n");
+ sb.append(" extraHeader1: ").append(toIndentedString(extraHeader1)).append("\n");
+ sb.append(" extraHeader2: ").append(toIndentedString(extraHeader2)).append("\n");
+ sb.append(" extraValue1: ").append(toIndentedString(extraValue1)).append("\n");
+ sb.append(" extraValue2: ").append(toIndentedString(extraValue2)).append("\n");
+ sb.append(" hasFailedSignatureVerification: ").append(toIndentedString(hasFailedSignatureVerification)).append("\n");
+ sb.append(" indexPackageCount: ").append(toIndentedString(indexPackageCount)).append("\n");
+ sb.append(" indexStatus: ").append(toIndentedString(indexStatus)).append("\n");
+ sb.append(" isActive: ").append(toIndentedString(isActive)).append("\n");
+ sb.append(" lastIndexed: ").append(toIndentedString(lastIndexed)).append("\n");
+ sb.append(" mode: ").append(toIndentedString(mode)).append("\n");
+ sb.append(" name: ").append(toIndentedString(name)).append("\n");
+ sb.append(" pendingValidation: ").append(toIndentedString(pendingValidation)).append("\n");
+ sb.append(" priority: ").append(toIndentedString(priority)).append("\n");
+ sb.append(" slugPerm: ").append(toIndentedString(slugPerm)).append("\n");
+ sb.append(" updatedAt: ").append(toIndentedString(updatedAt)).append("\n");
+ sb.append(" upstreamPrefix: ").append(toIndentedString(upstreamPrefix)).append("\n");
+ sb.append(" upstreamUrl: ").append(toIndentedString(upstreamUrl)).append("\n");
+ sb.append(" verifySsl: ").append(toIndentedString(verifySsl)).append("\n");
+ sb.append("}");
+ return sb.toString();
+ }
+
+ /**
+ * Convert the given object to string with each line indented by 4 spaces
+ * (except the first line).
+ */
+ private String toIndentedString(java.lang.Object o) {
+ if (o == null) {
+ return "null";
+ }
+ return o.toString().replace("\n", "\n    ");
+ }
+
+}
+
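For orientation, here is a minimal, illustrative sketch (not part of the diff) of how the generated annotations on GenericUpstream behave with plain Gson: @SerializedName maps the snake_case wire keys onto the camelCase Java fields, and the nested enum Adapter classes read and write the human-readable API values ("Username and Password", "Cache and Proxy") rather than the Java constant names. The JSON payload, URL, and class name below are hypothetical.

    import com.google.gson.Gson;
    import io.cloudsmith.api.models.GenericUpstream;

    public class GenericUpstreamJsonExample {
        public static void main(String[] args) {
            // Illustrative payload; keys follow the @SerializedName annotations above.
            String json = "{"
                + "\"auth_mode\": \"Username and Password\","
                + "\"mode\": \"Cache and Proxy\","
                + "\"name\": \"Internal file mirror\","
                + "\"upstream_prefix\": \"mirror\","
                + "\"upstream_url\": \"https://files.example.com/root/\","
                + "\"verify_ssl\": true"
                + "}";

            Gson gson = new Gson();
            GenericUpstream upstream = gson.fromJson(json, GenericUpstream.class);

            // The enum adapters parse the wire values back into constants.
            System.out.println(upstream.getAuthMode());        // Username and Password
            System.out.println(upstream.getMode());            // Cache and Proxy
            System.out.println(upstream.getUpstreamPrefix());  // mirror

            // Serializing again uses the same snake_case keys and wire enum values
            // (plus any non-null defaults, such as disable_reason).
            System.out.println(gson.toJson(upstream));
        }
    }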
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstreamRequest.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstreamRequest.java
new file mode 100644
index 00000000..1bd88354
--- /dev/null
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstreamRequest.java
@@ -0,0 +1,498 @@
+/*
+ * Cloudsmith API (v1)
+ * The API to the Cloudsmith Service
+ *
+ * OpenAPI spec version: v1
+ * Contact: support@cloudsmith.io
+ *
+ * NOTE: This class is auto generated by the swagger code generator program.
+ * https://github.com/swagger-api/swagger-codegen.git
+ * Do not edit the class manually.
+ */
+
+
+package io.cloudsmith.api.models;
+
+import java.util.Objects;
+import java.util.Arrays;
+import com.google.gson.TypeAdapter;
+import com.google.gson.annotations.JsonAdapter;
+import com.google.gson.annotations.SerializedName;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonWriter;
+import io.swagger.annotations.ApiModel;
+import io.swagger.annotations.ApiModelProperty;
+import java.io.IOException;
+import java.io.Serializable;
+import javax.validation.constraints.*;
+import javax.validation.Valid;
+
+/**
+ * GenericUpstreamRequest
+ */
+
+public class GenericUpstreamRequest implements Serializable {
+ private static final long serialVersionUID = 1L;
+
+ /**
+ * The authentication mode to use when accessing this upstream.
+ */
+ @JsonAdapter(AuthModeEnum.Adapter.class)
+ public enum AuthModeEnum {
+ NONE("None"),
+
+ USERNAME_AND_PASSWORD("Username and Password"),
+
+ TOKEN("Token");
+
+ private String value;
+
+ AuthModeEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static AuthModeEnum fromValue(String text) {
+ for (AuthModeEnum b : AuthModeEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<AuthModeEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final AuthModeEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public AuthModeEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return AuthModeEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("auth_mode")
+ private AuthModeEnum authMode = AuthModeEnum.NONE;
+
+ @SerializedName("auth_secret")
+ private String authSecret = null;
+
+ @SerializedName("auth_username")
+ private String authUsername = null;
+
+ @SerializedName("extra_header_1")
+ private String extraHeader1 = null;
+
+ @SerializedName("extra_header_2")
+ private String extraHeader2 = null;
+
+ @SerializedName("extra_value_1")
+ private String extraValue1 = null;
+
+ @SerializedName("extra_value_2")
+ private String extraValue2 = null;
+
+ @SerializedName("is_active")
+ private Boolean isActive = null;
+
+ /**
+ * The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ */
+ @JsonAdapter(ModeEnum.Adapter.class)
+ public enum ModeEnum {
+ PROXY_ONLY("Proxy Only"),
+
+ CACHE_AND_PROXY("Cache and Proxy");
+
+ private String value;
+
+ ModeEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static ModeEnum fromValue(String text) {
+ for (ModeEnum b : ModeEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<ModeEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final ModeEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public ModeEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return ModeEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("mode")
+ private ModeEnum mode = ModeEnum.PROXY_ONLY;
+
+ @SerializedName("name")
+ private String name = null;
+
+ @SerializedName("priority")
+ private java.math.BigInteger priority = null;
+
+ @SerializedName("upstream_prefix")
+ private String upstreamPrefix = null;
+
+ @SerializedName("upstream_url")
+ private String upstreamUrl = null;
+
+ @SerializedName("verify_ssl")
+ private Boolean verifySsl = null;
+
+ public GenericUpstreamRequest authMode(AuthModeEnum authMode) {
+ this.authMode = authMode;
+ return this;
+ }
+
+ /**
+ * The authentication mode to use when accessing this upstream.
+ * @return authMode
+ **/
+ @ApiModelProperty(value = "The authentication mode to use when accessing this upstream. ")
+ public AuthModeEnum getAuthMode() {
+ return authMode;
+ }
+
+ public void setAuthMode(AuthModeEnum authMode) {
+ this.authMode = authMode;
+ }
+
+ public GenericUpstreamRequest authSecret(String authSecret) {
+ this.authSecret = authSecret;
+ return this;
+ }
+
+ /**
+ * Secret to provide with requests to upstream.
+ * @return authSecret
+ **/
+ @Size(max=4096) @ApiModelProperty(value = "Secret to provide with requests to upstream.")
+ public String getAuthSecret() {
+ return authSecret;
+ }
+
+ public void setAuthSecret(String authSecret) {
+ this.authSecret = authSecret;
+ }
+
+ public GenericUpstreamRequest authUsername(String authUsername) {
+ this.authUsername = authUsername;
+ return this;
+ }
+
+ /**
+ * Username to provide with requests to upstream.
+ * @return authUsername
+ **/
+ @Size(max=64) @ApiModelProperty(value = "Username to provide with requests to upstream.")
+ public String getAuthUsername() {
+ return authUsername;
+ }
+
+ public void setAuthUsername(String authUsername) {
+ this.authUsername = authUsername;
+ }
+
+ public GenericUpstreamRequest extraHeader1(String extraHeader1) {
+ this.extraHeader1 = extraHeader1;
+ return this;
+ }
+
+ /**
+ * The key for extra header #1 to send to upstream.
+ * @return extraHeader1
+ **/
+ @Pattern(regexp="^[-\\w]+$") @Size(max=64) @ApiModelProperty(value = "The key for extra header #1 to send to upstream.")
+ public String getExtraHeader1() {
+ return extraHeader1;
+ }
+
+ public void setExtraHeader1(String extraHeader1) {
+ this.extraHeader1 = extraHeader1;
+ }
+
+ public GenericUpstreamRequest extraHeader2(String extraHeader2) {
+ this.extraHeader2 = extraHeader2;
+ return this;
+ }
+
+ /**
+ * The key for extra header #2 to send to upstream.
+ * @return extraHeader2
+ **/
+ @Pattern(regexp="^[-\\w]+$") @Size(max=64) @ApiModelProperty(value = "The key for extra header #2 to send to upstream.")
+ public String getExtraHeader2() {
+ return extraHeader2;
+ }
+
+ public void setExtraHeader2(String extraHeader2) {
+ this.extraHeader2 = extraHeader2;
+ }
+
+ public GenericUpstreamRequest extraValue1(String extraValue1) {
+ this.extraValue1 = extraValue1;
+ return this;
+ }
+
+ /**
+ * The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ * @return extraValue1
+ **/
+ @Pattern(regexp="^[^\\n\\r]+$") @Size(max=128) @ApiModelProperty(value = "The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.")
+ public String getExtraValue1() {
+ return extraValue1;
+ }
+
+ public void setExtraValue1(String extraValue1) {
+ this.extraValue1 = extraValue1;
+ }
+
+ public GenericUpstreamRequest extraValue2(String extraValue2) {
+ this.extraValue2 = extraValue2;
+ return this;
+ }
+
+ /**
+ * The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ * @return extraValue2
+ **/
+ @Pattern(regexp="^[^\\n\\r]+$") @Size(max=128) @ApiModelProperty(value = "The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.")
+ public String getExtraValue2() {
+ return extraValue2;
+ }
+
+ public void setExtraValue2(String extraValue2) {
+ this.extraValue2 = extraValue2;
+ }
+
+ public GenericUpstreamRequest isActive(Boolean isActive) {
+ this.isActive = isActive;
+ return this;
+ }
+
+ /**
+ * Whether or not this upstream is active and ready for requests.
+ * @return isActive
+ **/
+ @ApiModelProperty(value = "Whether or not this upstream is active and ready for requests.")
+ public Boolean isIsActive() {
+ return isActive;
+ }
+
+ public void setIsActive(Boolean isActive) {
+ this.isActive = isActive;
+ }
+
+ public GenericUpstreamRequest mode(ModeEnum mode) {
+ this.mode = mode;
+ return this;
+ }
+
+ /**
+ * The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ * @return mode
+ **/
+ @ApiModelProperty(value = "The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.")
+ public ModeEnum getMode() {
+ return mode;
+ }
+
+ public void setMode(ModeEnum mode) {
+ this.mode = mode;
+ }
+
+ public GenericUpstreamRequest name(String name) {
+ this.name = name;
+ return this;
+ }
+
+ /**
+ * A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+ * @return name
+ **/
+ @NotNull
+ @Pattern(regexp="^\\w[\\w \\-'\\./()]+$") @Size(min=1,max=64) @ApiModelProperty(required = true, value = "A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.")
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public GenericUpstreamRequest priority(java.math.BigInteger priority) {
+ this.priority = priority;
+ return this;
+ }
+
+ /**
+ * Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+ * minimum: 1
+ * maximum: 32767
+ * @return priority
+ **/
+ @Min(1L) @Max(32767L) @ApiModelProperty(value = "Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.")
+ public java.math.BigInteger getPriority() {
+ return priority;
+ }
+
+ public void setPriority(java.math.BigInteger priority) {
+ this.priority = priority;
+ }
+
+ public GenericUpstreamRequest upstreamPrefix(String upstreamPrefix) {
+ this.upstreamPrefix = upstreamPrefix;
+ return this;
+ }
+
+ /**
+ * A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+ * @return upstreamPrefix
+ **/
+ @Size(min=1,max=64) @ApiModelProperty(value = "A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.")
+ public String getUpstreamPrefix() {
+ return upstreamPrefix;
+ }
+
+ public void setUpstreamPrefix(String upstreamPrefix) {
+ this.upstreamPrefix = upstreamPrefix;
+ }
+
+ public GenericUpstreamRequest upstreamUrl(String upstreamUrl) {
+ this.upstreamUrl = upstreamUrl;
+ return this;
+ }
+
+ /**
+ * The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+ * @return upstreamUrl
+ **/
+ @NotNull
+ @Size(min=1,max=200) @ApiModelProperty(required = true, value = "The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. ")
+ public String getUpstreamUrl() {
+ return upstreamUrl;
+ }
+
+ public void setUpstreamUrl(String upstreamUrl) {
+ this.upstreamUrl = upstreamUrl;
+ }
+
+ public GenericUpstreamRequest verifySsl(Boolean verifySsl) {
+ this.verifySsl = verifySsl;
+ return this;
+ }
+
+ /**
+ * If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+ * @return verifySsl
+ **/
+ @ApiModelProperty(value = "If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.")
+ public Boolean isVerifySsl() {
+ return verifySsl;
+ }
+
+ public void setVerifySsl(Boolean verifySsl) {
+ this.verifySsl = verifySsl;
+ }
+
+
+ @Override
+ public boolean equals(java.lang.Object o) {
+ if (this == o) {
+ return true;
+ }
+ if (o == null || getClass() != o.getClass()) {
+ return false;
+ }
+ GenericUpstreamRequest genericUpstreamRequest = (GenericUpstreamRequest) o;
+ return Objects.equals(this.authMode, genericUpstreamRequest.authMode) &&
+ Objects.equals(this.authSecret, genericUpstreamRequest.authSecret) &&
+ Objects.equals(this.authUsername, genericUpstreamRequest.authUsername) &&
+ Objects.equals(this.extraHeader1, genericUpstreamRequest.extraHeader1) &&
+ Objects.equals(this.extraHeader2, genericUpstreamRequest.extraHeader2) &&
+ Objects.equals(this.extraValue1, genericUpstreamRequest.extraValue1) &&
+ Objects.equals(this.extraValue2, genericUpstreamRequest.extraValue2) &&
+ Objects.equals(this.isActive, genericUpstreamRequest.isActive) &&
+ Objects.equals(this.mode, genericUpstreamRequest.mode) &&
+ Objects.equals(this.name, genericUpstreamRequest.name) &&
+ Objects.equals(this.priority, genericUpstreamRequest.priority) &&
+ Objects.equals(this.upstreamPrefix, genericUpstreamRequest.upstreamPrefix) &&
+ Objects.equals(this.upstreamUrl, genericUpstreamRequest.upstreamUrl) &&
+ Objects.equals(this.verifySsl, genericUpstreamRequest.verifySsl);
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(authMode, authSecret, authUsername, extraHeader1, extraHeader2, extraValue1, extraValue2, isActive, mode, name, priority, upstreamPrefix, upstreamUrl, verifySsl);
+ }
+
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder();
+ sb.append("class GenericUpstreamRequest {\n");
+
+ sb.append(" authMode: ").append(toIndentedString(authMode)).append("\n");
+ sb.append(" authSecret: ").append(toIndentedString(authSecret)).append("\n");
+ sb.append(" authUsername: ").append(toIndentedString(authUsername)).append("\n");
+ sb.append(" extraHeader1: ").append(toIndentedString(extraHeader1)).append("\n");
+ sb.append(" extraHeader2: ").append(toIndentedString(extraHeader2)).append("\n");
+ sb.append(" extraValue1: ").append(toIndentedString(extraValue1)).append("\n");
+ sb.append(" extraValue2: ").append(toIndentedString(extraValue2)).append("\n");
+ sb.append(" isActive: ").append(toIndentedString(isActive)).append("\n");
+ sb.append(" mode: ").append(toIndentedString(mode)).append("\n");
+ sb.append(" name: ").append(toIndentedString(name)).append("\n");
+ sb.append(" priority: ").append(toIndentedString(priority)).append("\n");
+ sb.append(" upstreamPrefix: ").append(toIndentedString(upstreamPrefix)).append("\n");
+ sb.append(" upstreamUrl: ").append(toIndentedString(upstreamUrl)).append("\n");
+ sb.append(" verifySsl: ").append(toIndentedString(verifySsl)).append("\n");
+ sb.append("}");
+ return sb.toString();
+ }
+
+ /**
+ * Convert the given object to string with each line indented by 4 spaces
+ * (except the first line).
+ */
+ private String toIndentedString(java.lang.Object o) {
+ if (o == null) {
+ return "null";
+ }
+ return o.toString().replace("\n", "\n    ");
+ }
+
+}
+
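As a usage sketch (again not part of the diff), the request model above is typically populated through its fluent setters, and the javax.validation constraints it carries (@NotNull, @Size, @Pattern) can be checked locally before any API call. This assumes a Bean Validation provider such as Hibernate Validator is on the classpath; the class name and field values are illustrative.

    import java.util.Set;

    import javax.validation.ConstraintViolation;
    import javax.validation.Validation;
    import javax.validation.Validator;

    import io.cloudsmith.api.models.GenericUpstreamRequest;

    public class GenericUpstreamRequestExample {
        public static void main(String[] args) {
            // Build the payload with the generated fluent setters.
            GenericUpstreamRequest request = new GenericUpstreamRequest()
                .name("Internal file mirror")
                .upstreamUrl("https://files.example.com/root/")
                .upstreamPrefix("mirror")
                .mode(GenericUpstreamRequest.ModeEnum.CACHE_AND_PROXY)
                .authMode(GenericUpstreamRequest.AuthModeEnum.NONE)
                .verifySsl(true);

            // The generated constraints can be verified before any HTTP call is made.
            Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
            Set<ConstraintViolation<GenericUpstreamRequest>> violations = validator.validate(request);
            if (violations.isEmpty()) {
                System.out.println("Request is structurally valid: " + request);
            } else {
                violations.forEach(v -> System.out.println(v.getPropertyPath() + " " + v.getMessage()));
            }
        }
    }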
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstreamRequestPatch.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstreamRequestPatch.java
new file mode 100644
index 00000000..fd1b5188
--- /dev/null
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/GenericUpstreamRequestPatch.java
@@ -0,0 +1,496 @@
+/*
+ * Cloudsmith API (v1)
+ * The API to the Cloudsmith Service
+ *
+ * OpenAPI spec version: v1
+ * Contact: support@cloudsmith.io
+ *
+ * NOTE: This class is auto generated by the swagger code generator program.
+ * https://github.com/swagger-api/swagger-codegen.git
+ * Do not edit the class manually.
+ */
+
+
+package io.cloudsmith.api.models;
+
+import java.util.Objects;
+import java.util.Arrays;
+import com.google.gson.TypeAdapter;
+import com.google.gson.annotations.JsonAdapter;
+import com.google.gson.annotations.SerializedName;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonWriter;
+import io.swagger.annotations.ApiModel;
+import io.swagger.annotations.ApiModelProperty;
+import java.io.IOException;
+import java.io.Serializable;
+import javax.validation.constraints.*;
+import javax.validation.Valid;
+
+/**
+ * GenericUpstreamRequestPatch
+ */
+
+public class GenericUpstreamRequestPatch implements Serializable {
+ private static final long serialVersionUID = 1L;
+
+ /**
+ * The authentication mode to use when accessing this upstream.
+ */
+ @JsonAdapter(AuthModeEnum.Adapter.class)
+ public enum AuthModeEnum {
+ NONE("None"),
+
+ USERNAME_AND_PASSWORD("Username and Password"),
+
+ TOKEN("Token");
+
+ private String value;
+
+ AuthModeEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static AuthModeEnum fromValue(String text) {
+ for (AuthModeEnum b : AuthModeEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<AuthModeEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final AuthModeEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public AuthModeEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return AuthModeEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("auth_mode")
+ private AuthModeEnum authMode = AuthModeEnum.NONE;
+
+ @SerializedName("auth_secret")
+ private String authSecret = null;
+
+ @SerializedName("auth_username")
+ private String authUsername = null;
+
+ @SerializedName("extra_header_1")
+ private String extraHeader1 = null;
+
+ @SerializedName("extra_header_2")
+ private String extraHeader2 = null;
+
+ @SerializedName("extra_value_1")
+ private String extraValue1 = null;
+
+ @SerializedName("extra_value_2")
+ private String extraValue2 = null;
+
+ @SerializedName("is_active")
+ private Boolean isActive = null;
+
+ /**
+ * The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ */
+ @JsonAdapter(ModeEnum.Adapter.class)
+ public enum ModeEnum {
+ PROXY_ONLY("Proxy Only"),
+
+ CACHE_AND_PROXY("Cache and Proxy");
+
+ private String value;
+
+ ModeEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static ModeEnum fromValue(String text) {
+ for (ModeEnum b : ModeEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<ModeEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final ModeEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public ModeEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return ModeEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("mode")
+ private ModeEnum mode = ModeEnum.PROXY_ONLY;
+
+ @SerializedName("name")
+ private String name = null;
+
+ @SerializedName("priority")
+ private java.math.BigInteger priority = null;
+
+ @SerializedName("upstream_prefix")
+ private String upstreamPrefix = null;
+
+ @SerializedName("upstream_url")
+ private String upstreamUrl = null;
+
+ @SerializedName("verify_ssl")
+ private Boolean verifySsl = null;
+
+ public GenericUpstreamRequestPatch authMode(AuthModeEnum authMode) {
+ this.authMode = authMode;
+ return this;
+ }
+
+ /**
+ * The authentication mode to use when accessing this upstream.
+ * @return authMode
+ **/
+ @ApiModelProperty(value = "The authentication mode to use when accessing this upstream. ")
+ public AuthModeEnum getAuthMode() {
+ return authMode;
+ }
+
+ public void setAuthMode(AuthModeEnum authMode) {
+ this.authMode = authMode;
+ }
+
+ public GenericUpstreamRequestPatch authSecret(String authSecret) {
+ this.authSecret = authSecret;
+ return this;
+ }
+
+ /**
+ * Secret to provide with requests to upstream.
+ * @return authSecret
+ **/
+ @Size(max=4096) @ApiModelProperty(value = "Secret to provide with requests to upstream.")
+ public String getAuthSecret() {
+ return authSecret;
+ }
+
+ public void setAuthSecret(String authSecret) {
+ this.authSecret = authSecret;
+ }
+
+ public GenericUpstreamRequestPatch authUsername(String authUsername) {
+ this.authUsername = authUsername;
+ return this;
+ }
+
+ /**
+ * Username to provide with requests to upstream.
+ * @return authUsername
+ **/
+ @Size(max=64) @ApiModelProperty(value = "Username to provide with requests to upstream.")
+ public String getAuthUsername() {
+ return authUsername;
+ }
+
+ public void setAuthUsername(String authUsername) {
+ this.authUsername = authUsername;
+ }
+
+ public GenericUpstreamRequestPatch extraHeader1(String extraHeader1) {
+ this.extraHeader1 = extraHeader1;
+ return this;
+ }
+
+ /**
+ * The key for extra header #1 to send to upstream.
+ * @return extraHeader1
+ **/
+ @Pattern(regexp="^[-\\w]+$") @Size(max=64) @ApiModelProperty(value = "The key for extra header #1 to send to upstream.")
+ public String getExtraHeader1() {
+ return extraHeader1;
+ }
+
+ public void setExtraHeader1(String extraHeader1) {
+ this.extraHeader1 = extraHeader1;
+ }
+
+ public GenericUpstreamRequestPatch extraHeader2(String extraHeader2) {
+ this.extraHeader2 = extraHeader2;
+ return this;
+ }
+
+ /**
+ * The key for extra header #2 to send to upstream.
+ * @return extraHeader2
+ **/
+ @Pattern(regexp="^[-\\w]+$") @Size(max=64) @ApiModelProperty(value = "The key for extra header #2 to send to upstream.")
+ public String getExtraHeader2() {
+ return extraHeader2;
+ }
+
+ public void setExtraHeader2(String extraHeader2) {
+ this.extraHeader2 = extraHeader2;
+ }
+
+ public GenericUpstreamRequestPatch extraValue1(String extraValue1) {
+ this.extraValue1 = extraValue1;
+ return this;
+ }
+
+ /**
+ * The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ * @return extraValue1
+ **/
+ @Pattern(regexp="^[^\\n\\r]+$") @Size(max=128) @ApiModelProperty(value = "The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.")
+ public String getExtraValue1() {
+ return extraValue1;
+ }
+
+ public void setExtraValue1(String extraValue1) {
+ this.extraValue1 = extraValue1;
+ }
+
+ public GenericUpstreamRequestPatch extraValue2(String extraValue2) {
+ this.extraValue2 = extraValue2;
+ return this;
+ }
+
+ /**
+ * The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ * @return extraValue2
+ **/
+ @Pattern(regexp="^[^\\n\\r]+$") @Size(max=128) @ApiModelProperty(value = "The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.")
+ public String getExtraValue2() {
+ return extraValue2;
+ }
+
+ public void setExtraValue2(String extraValue2) {
+ this.extraValue2 = extraValue2;
+ }
+
+ public GenericUpstreamRequestPatch isActive(Boolean isActive) {
+ this.isActive = isActive;
+ return this;
+ }
+
+ /**
+ * Whether or not this upstream is active and ready for requests.
+ * @return isActive
+ **/
+ @ApiModelProperty(value = "Whether or not this upstream is active and ready for requests.")
+ public Boolean isIsActive() {
+ return isActive;
+ }
+
+ public void setIsActive(Boolean isActive) {
+ this.isActive = isActive;
+ }
+
+ public GenericUpstreamRequestPatch mode(ModeEnum mode) {
+ this.mode = mode;
+ return this;
+ }
+
+ /**
+ * The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ * @return mode
+ **/
+ @ApiModelProperty(value = "The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.")
+ public ModeEnum getMode() {
+ return mode;
+ }
+
+ public void setMode(ModeEnum mode) {
+ this.mode = mode;
+ }
+
+ public GenericUpstreamRequestPatch name(String name) {
+ this.name = name;
+ return this;
+ }
+
+ /**
+ * A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+ * @return name
+ **/
+ @Pattern(regexp="^\\w[\\w \\-'\\./()]+$") @Size(min=1,max=64) @ApiModelProperty(value = "A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.")
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public GenericUpstreamRequestPatch priority(java.math.BigInteger priority) {
+ this.priority = priority;
+ return this;
+ }
+
+ /**
+ * Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+ * minimum: 1
+ * maximum: 32767
+ * @return priority
+ **/
+ @Min(1L) @Max(32767L) @ApiModelProperty(value = "Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.")
+ public java.math.BigInteger getPriority() {
+ return priority;
+ }
+
+ public void setPriority(java.math.BigInteger priority) {
+ this.priority = priority;
+ }
+
+ public GenericUpstreamRequestPatch upstreamPrefix(String upstreamPrefix) {
+ this.upstreamPrefix = upstreamPrefix;
+ return this;
+ }
+
+ /**
+ * A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+ * @return upstreamPrefix
+ **/
+ @Size(min=1,max=64) @ApiModelProperty(value = "A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.")
+ public String getUpstreamPrefix() {
+ return upstreamPrefix;
+ }
+
+ public void setUpstreamPrefix(String upstreamPrefix) {
+ this.upstreamPrefix = upstreamPrefix;
+ }
+
+ public GenericUpstreamRequestPatch upstreamUrl(String upstreamUrl) {
+ this.upstreamUrl = upstreamUrl;
+ return this;
+ }
+
+ /**
+ * The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+ * @return upstreamUrl
+ **/
+ @Size(min=1,max=200) @ApiModelProperty(value = "The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. ")
+ public String getUpstreamUrl() {
+ return upstreamUrl;
+ }
+
+ public void setUpstreamUrl(String upstreamUrl) {
+ this.upstreamUrl = upstreamUrl;
+ }
+
+ public GenericUpstreamRequestPatch verifySsl(Boolean verifySsl) {
+ this.verifySsl = verifySsl;
+ return this;
+ }
+
+ /**
+ * If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+ * @return verifySsl
+ **/
+ @ApiModelProperty(value = "If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.")
+ public Boolean isVerifySsl() {
+ return verifySsl;
+ }
+
+ public void setVerifySsl(Boolean verifySsl) {
+ this.verifySsl = verifySsl;
+ }
+
+
+ @Override
+ public boolean equals(java.lang.Object o) {
+ if (this == o) {
+ return true;
+ }
+ if (o == null || getClass() != o.getClass()) {
+ return false;
+ }
+ GenericUpstreamRequestPatch genericUpstreamRequestPatch = (GenericUpstreamRequestPatch) o;
+ return Objects.equals(this.authMode, genericUpstreamRequestPatch.authMode) &&
+ Objects.equals(this.authSecret, genericUpstreamRequestPatch.authSecret) &&
+ Objects.equals(this.authUsername, genericUpstreamRequestPatch.authUsername) &&
+ Objects.equals(this.extraHeader1, genericUpstreamRequestPatch.extraHeader1) &&
+ Objects.equals(this.extraHeader2, genericUpstreamRequestPatch.extraHeader2) &&
+ Objects.equals(this.extraValue1, genericUpstreamRequestPatch.extraValue1) &&
+ Objects.equals(this.extraValue2, genericUpstreamRequestPatch.extraValue2) &&
+ Objects.equals(this.isActive, genericUpstreamRequestPatch.isActive) &&
+ Objects.equals(this.mode, genericUpstreamRequestPatch.mode) &&
+ Objects.equals(this.name, genericUpstreamRequestPatch.name) &&
+ Objects.equals(this.priority, genericUpstreamRequestPatch.priority) &&
+ Objects.equals(this.upstreamPrefix, genericUpstreamRequestPatch.upstreamPrefix) &&
+ Objects.equals(this.upstreamUrl, genericUpstreamRequestPatch.upstreamUrl) &&
+ Objects.equals(this.verifySsl, genericUpstreamRequestPatch.verifySsl);
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(authMode, authSecret, authUsername, extraHeader1, extraHeader2, extraValue1, extraValue2, isActive, mode, name, priority, upstreamPrefix, upstreamUrl, verifySsl);
+ }
+
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder();
+ sb.append("class GenericUpstreamRequestPatch {\n");
+
+ sb.append(" authMode: ").append(toIndentedString(authMode)).append("\n");
+ sb.append(" authSecret: ").append(toIndentedString(authSecret)).append("\n");
+ sb.append(" authUsername: ").append(toIndentedString(authUsername)).append("\n");
+ sb.append(" extraHeader1: ").append(toIndentedString(extraHeader1)).append("\n");
+ sb.append(" extraHeader2: ").append(toIndentedString(extraHeader2)).append("\n");
+ sb.append(" extraValue1: ").append(toIndentedString(extraValue1)).append("\n");
+ sb.append(" extraValue2: ").append(toIndentedString(extraValue2)).append("\n");
+ sb.append(" isActive: ").append(toIndentedString(isActive)).append("\n");
+ sb.append(" mode: ").append(toIndentedString(mode)).append("\n");
+ sb.append(" name: ").append(toIndentedString(name)).append("\n");
+ sb.append(" priority: ").append(toIndentedString(priority)).append("\n");
+ sb.append(" upstreamPrefix: ").append(toIndentedString(upstreamPrefix)).append("\n");
+ sb.append(" upstreamUrl: ").append(toIndentedString(upstreamUrl)).append("\n");
+ sb.append(" verifySsl: ").append(toIndentedString(verifySsl)).append("\n");
+ sb.append("}");
+ return sb.toString();
+ }
+
+ /**
+ * Convert the given object to string with each line indented by 4 spaces
+ * (except the first line).
+ */
+ private String toIndentedString(java.lang.Object o) {
+ if (o == null) {
+ return "null";
+ }
+ return o.toString().replace("\n", "\n    ");
+ }
+
+}
+
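A small sketch of how the patch model serializes, assuming plain Gson as used elsewhere in the bindings: unset reference fields are omitted from the JSON body, so a partial update can carry only the changed properties, while auth_mode and mode still appear because the generated model initialises them to non-null defaults. The class name and values are illustrative.

    import java.math.BigInteger;

    import com.google.gson.Gson;

    import io.cloudsmith.api.models.GenericUpstreamRequestPatch;

    public class GenericUpstreamPatchExample {
        public static void main(String[] args) {
            // Populate only the fields relevant to this partial update.
            GenericUpstreamRequestPatch patch = new GenericUpstreamRequestPatch()
                .isActive(false)
                .priority(BigInteger.valueOf(1));

            // Gson omits null fields by default, so unset String/Boolean properties stay out
            // of the body; auth_mode and mode remain because their generated defaults are
            // "None" and "Proxy Only" respectively.
            String body = new Gson().toJson(patch);
            System.out.println(body);
            // e.g. {"auth_mode":"None","is_active":false,"mode":"Proxy Only","priority":1}
        }
    }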
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstream.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstream.java
index 10303341..6c45d999 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstream.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstream.java
@@ -309,6 +309,56 @@ public ModeEnum read(final JsonReader jsonReader) throws IOException {
@SerializedName("slug_perm")
private String slugPerm = null;
+ /**
+ * Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ */
+ @JsonAdapter(TrustLevelEnum.Adapter.class)
+ public enum TrustLevelEnum {
+ TRUSTED("Trusted"),
+
+ UNTRUSTED("Untrusted");
+
+ private String value;
+
+ TrustLevelEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static TrustLevelEnum fromValue(String text) {
+ for (TrustLevelEnum b : TrustLevelEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<TrustLevelEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final TrustLevelEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public TrustLevelEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return TrustLevelEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("trust_level")
+ private TrustLevelEnum trustLevel = TrustLevelEnum.TRUSTED;
+
@SerializedName("updated_at")
private OffsetDateTime updatedAt = null;
@@ -736,6 +786,24 @@ public String getSlugPerm() {
return slugPerm;
}
+ public MavenUpstream trustLevel(TrustLevelEnum trustLevel) {
+ this.trustLevel = trustLevel;
+ return this;
+ }
+
+ /**
+ * Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ * @return trustLevel
+ **/
+ @ApiModelProperty(value = "Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.")
+ public TrustLevelEnum getTrustLevel() {
+ return trustLevel;
+ }
+
+ public void setTrustLevel(TrustLevelEnum trustLevel) {
+ this.trustLevel = trustLevel;
+ }
+
/**
* Get updatedAt
* @return updatedAt
@@ -828,6 +896,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.pendingValidation, mavenUpstream.pendingValidation) &&
Objects.equals(this.priority, mavenUpstream.priority) &&
Objects.equals(this.slugPerm, mavenUpstream.slugPerm) &&
+ Objects.equals(this.trustLevel, mavenUpstream.trustLevel) &&
Objects.equals(this.updatedAt, mavenUpstream.updatedAt) &&
Objects.equals(this.upstreamUrl, mavenUpstream.upstreamUrl) &&
Objects.equals(this.verificationStatus, mavenUpstream.verificationStatus) &&
@@ -836,7 +905,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(authMode, authSecret, authUsername, available, canReindex, createdAt, disableReason, disableReasonText, extraHeader1, extraHeader2, extraValue1, extraValue2, gpgKeyFingerprintShort, gpgKeyInline, gpgKeyUrl, gpgVerification, hasFailedSignatureVerification, indexPackageCount, indexStatus, isActive, lastIndexed, mode, name, pendingValidation, priority, slugPerm, updatedAt, upstreamUrl, verificationStatus, verifySsl);
+ return Objects.hash(authMode, authSecret, authUsername, available, canReindex, createdAt, disableReason, disableReasonText, extraHeader1, extraHeader2, extraValue1, extraValue2, gpgKeyFingerprintShort, gpgKeyInline, gpgKeyUrl, gpgVerification, hasFailedSignatureVerification, indexPackageCount, indexStatus, isActive, lastIndexed, mode, name, pendingValidation, priority, slugPerm, trustLevel, updatedAt, upstreamUrl, verificationStatus, verifySsl);
}
@@ -871,6 +940,7 @@ public String toString() {
sb.append(" pendingValidation: ").append(toIndentedString(pendingValidation)).append("\n");
sb.append(" priority: ").append(toIndentedString(priority)).append("\n");
sb.append(" slugPerm: ").append(toIndentedString(slugPerm)).append("\n");
+ sb.append(" trustLevel: ").append(toIndentedString(trustLevel)).append("\n");
sb.append(" updatedAt: ").append(toIndentedString(updatedAt)).append("\n");
sb.append(" upstreamUrl: ").append(toIndentedString(upstreamUrl)).append("\n");
sb.append(" verificationStatus: ").append(toIndentedString(verificationStatus)).append("\n");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequest.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequest.java
index 47166d1f..d5f42939 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequest.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequest.java
@@ -221,6 +221,56 @@ public ModeEnum read(final JsonReader jsonReader) throws IOException {
@SerializedName("priority")
private java.math.BigInteger priority = null;
+ /**
+ * Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ */
+ @JsonAdapter(TrustLevelEnum.Adapter.class)
+ public enum TrustLevelEnum {
+ TRUSTED("Trusted"),
+
+ UNTRUSTED("Untrusted");
+
+ private String value;
+
+ TrustLevelEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static TrustLevelEnum fromValue(String text) {
+ for (TrustLevelEnum b : TrustLevelEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+ public static class Adapter extends TypeAdapter<TrustLevelEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final TrustLevelEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public TrustLevelEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return TrustLevelEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("trust_level")
+ private TrustLevelEnum trustLevel = TrustLevelEnum.TRUSTED;
+
@SerializedName("upstream_url")
private String upstreamUrl = null;
@@ -482,6 +532,24 @@ public void setPriority(java.math.BigInteger priority) {
this.priority = priority;
}
+ public MavenUpstreamRequest trustLevel(TrustLevelEnum trustLevel) {
+ this.trustLevel = trustLevel;
+ return this;
+ }
+
+ /**
+ * Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ * @return trustLevel
+ **/
+ @ApiModelProperty(value = "Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.")
+ public TrustLevelEnum getTrustLevel() {
+ return trustLevel;
+ }
+
+ public void setTrustLevel(TrustLevelEnum trustLevel) {
+ this.trustLevel = trustLevel;
+ }
+
public MavenUpstreamRequest upstreamUrl(String upstreamUrl) {
this.upstreamUrl = upstreamUrl;
return this;
@@ -543,13 +611,14 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.mode, mavenUpstreamRequest.mode) &&
Objects.equals(this.name, mavenUpstreamRequest.name) &&
Objects.equals(this.priority, mavenUpstreamRequest.priority) &&
+ Objects.equals(this.trustLevel, mavenUpstreamRequest.trustLevel) &&
Objects.equals(this.upstreamUrl, mavenUpstreamRequest.upstreamUrl) &&
Objects.equals(this.verifySsl, mavenUpstreamRequest.verifySsl);
}
@Override
public int hashCode() {
- return Objects.hash(authMode, authSecret, authUsername, extraHeader1, extraHeader2, extraValue1, extraValue2, gpgKeyInline, gpgKeyUrl, gpgVerification, isActive, mode, name, priority, upstreamUrl, verifySsl);
+ return Objects.hash(authMode, authSecret, authUsername, extraHeader1, extraHeader2, extraValue1, extraValue2, gpgKeyInline, gpgKeyUrl, gpgVerification, isActive, mode, name, priority, trustLevel, upstreamUrl, verifySsl);
}
@@ -572,6 +641,7 @@ public String toString() {
sb.append(" mode: ").append(toIndentedString(mode)).append("\n");
sb.append(" name: ").append(toIndentedString(name)).append("\n");
sb.append(" priority: ").append(toIndentedString(priority)).append("\n");
+ sb.append(" trustLevel: ").append(toIndentedString(trustLevel)).append("\n");
sb.append(" upstreamUrl: ").append(toIndentedString(upstreamUrl)).append("\n");
sb.append(" verifySsl: ").append(toIndentedString(verifySsl)).append("\n");
sb.append("}");
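MavenUpstreamRequest gains the same trust_level enum as the read model above, defaulting to Trusted. Below is a minimal sketch of opting an upstream into the recommended Untrusted level through the Java bindings; the client/auth setup and the reposUpstreamMavenCreate call follow the usual generated-client conventions, and the owner/repository values are placeholders, not part of this change.

```java
import io.cloudsmith.api.ApiClient;
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.Configuration;
import io.cloudsmith.api.apis.ReposApi;
import io.cloudsmith.api.auth.ApiKeyAuth;
import io.cloudsmith.api.models.MavenUpstream;
import io.cloudsmith.api.models.MavenUpstreamRequest;

public class MavenUpstreamTrustLevelExample {
    public static void main(String[] args) throws ApiException {
        // Standard generated-client setup (assumed; see the main README for the canonical form).
        ApiClient client = Configuration.getDefaultApiClient();
        ApiKeyAuth apikey = (ApiKeyAuth) client.getAuthentication("apikey");
        apikey.setApiKey("YOUR-API-KEY");

        ReposApi api = new ReposApi(client);

        // Untrusted is the recommended level where the format supports it; Trusted remains
        // the generated model default, so it must be set explicitly here.
        MavenUpstreamRequest request = new MavenUpstreamRequest()
                .name("maven-central")
                .upstreamUrl("https://repo1.maven.org/maven2")
                .trustLevel(MavenUpstreamRequest.TrustLevelEnum.UNTRUSTED);

        // "example-org" / "example-repo" are placeholders; the method name is assumed to
        // follow the reposUpstream<Format>Create convention used elsewhere in ReposApi.
        MavenUpstream created = api.reposUpstreamMavenCreate("example-org", "example-repo", request);
        System.out.println("Trust level: " + created.getTrustLevel());
    }
}
```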
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequestPatch.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequestPatch.java
index 843b2e2b..079577f5 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequestPatch.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/MavenUpstreamRequestPatch.java
@@ -221,6 +221,56 @@ public ModeEnum read(final JsonReader jsonReader) throws IOException {
@SerializedName("priority")
private java.math.BigInteger priority = null;
+ /**
+ * Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ */
+ @JsonAdapter(TrustLevelEnum.Adapter.class)
+ public enum TrustLevelEnum {
+ TRUSTED("Trusted"),
+
+ UNTRUSTED("Untrusted");
+
+ private String value;
+
+ TrustLevelEnum(String value) {
+ this.value = value;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ @Override
+ public String toString() {
+ return String.valueOf(value);
+ }
+
+ public static TrustLevelEnum fromValue(String text) {
+ for (TrustLevelEnum b : TrustLevelEnum.values()) {
+ if (String.valueOf(b.value).equals(text)) {
+ return b;
+ }
+ }
+ return null;
+ }
+
+    public static class Adapter extends TypeAdapter<TrustLevelEnum> {
+ @Override
+ public void write(final JsonWriter jsonWriter, final TrustLevelEnum enumeration) throws IOException {
+ jsonWriter.value(enumeration.getValue());
+ }
+
+ @Override
+ public TrustLevelEnum read(final JsonReader jsonReader) throws IOException {
+ String value = jsonReader.nextString();
+ return TrustLevelEnum.fromValue(String.valueOf(value));
+ }
+ }
+ }
+
+ @SerializedName("trust_level")
+ private TrustLevelEnum trustLevel = TrustLevelEnum.TRUSTED;
+
@SerializedName("upstream_url")
private String upstreamUrl = null;
@@ -481,6 +531,24 @@ public void setPriority(java.math.BigInteger priority) {
this.priority = priority;
}
+ public MavenUpstreamRequestPatch trustLevel(TrustLevelEnum trustLevel) {
+ this.trustLevel = trustLevel;
+ return this;
+ }
+
+ /**
+ * Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ * @return trustLevel
+ **/
+ @ApiModelProperty(value = "Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.")
+ public TrustLevelEnum getTrustLevel() {
+ return trustLevel;
+ }
+
+ public void setTrustLevel(TrustLevelEnum trustLevel) {
+ this.trustLevel = trustLevel;
+ }
+
public MavenUpstreamRequestPatch upstreamUrl(String upstreamUrl) {
this.upstreamUrl = upstreamUrl;
return this;
@@ -541,13 +609,14 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.mode, mavenUpstreamRequestPatch.mode) &&
Objects.equals(this.name, mavenUpstreamRequestPatch.name) &&
Objects.equals(this.priority, mavenUpstreamRequestPatch.priority) &&
+ Objects.equals(this.trustLevel, mavenUpstreamRequestPatch.trustLevel) &&
Objects.equals(this.upstreamUrl, mavenUpstreamRequestPatch.upstreamUrl) &&
Objects.equals(this.verifySsl, mavenUpstreamRequestPatch.verifySsl);
}
@Override
public int hashCode() {
- return Objects.hash(authMode, authSecret, authUsername, extraHeader1, extraHeader2, extraValue1, extraValue2, gpgKeyInline, gpgKeyUrl, gpgVerification, isActive, mode, name, priority, upstreamUrl, verifySsl);
+ return Objects.hash(authMode, authSecret, authUsername, extraHeader1, extraHeader2, extraValue1, extraValue2, gpgKeyInline, gpgKeyUrl, gpgVerification, isActive, mode, name, priority, trustLevel, upstreamUrl, verifySsl);
}
@@ -570,6 +639,7 @@ public String toString() {
sb.append(" mode: ").append(toIndentedString(mode)).append("\n");
sb.append(" name: ").append(toIndentedString(name)).append("\n");
sb.append(" priority: ").append(toIndentedString(priority)).append("\n");
+ sb.append(" trustLevel: ").append(toIndentedString(trustLevel)).append("\n");
sb.append(" upstreamUrl: ").append(toIndentedString(upstreamUrl)).append("\n");
sb.append(" verifySsl: ").append(toIndentedString(verifySsl)).append("\n");
sb.append("}");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/ModelPackage.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/ModelPackage.java
index 8b933e85..98cf3de7 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/ModelPackage.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/ModelPackage.java
@@ -92,6 +92,9 @@ public class ModelPackage implements Serializable {
@SerializedName("filename")
private String filename = null;
+ @SerializedName("filepath")
+ private String filepath = null;
+
@SerializedName("files")
private List files = null;
@@ -520,6 +523,15 @@ public String getFilename() {
return filename;
}
+ /**
+ * Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ * @return filepath
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Full path to the file, including filename e.g. bin/utils/tool.tar.gz")
+ public String getFilepath() {
+ return filepath;
+ }
+
/**
* Get files
* @return files
@@ -1196,6 +1208,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.epoch, _package.epoch) &&
Objects.equals(this.extension, _package.extension) &&
Objects.equals(this.filename, _package.filename) &&
+ Objects.equals(this.filepath, _package.filepath) &&
Objects.equals(this.files, _package.files) &&
Objects.equals(this.format, _package.format) &&
Objects.equals(this.formatUrl, _package.formatUrl) &&
@@ -1268,7 +1281,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, filepath, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
}
@@ -1293,6 +1306,7 @@ public String toString() {
sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
sb.append(" files: ").append(toIndentedString(files)).append("\n");
sb.append(" format: ").append(toIndentedString(format)).append("\n");
sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
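ModelPackage (and the other package serializers below) now exposes a read-only filepath alongside filename. A small sketch of reading it back via the Java bindings follows; packagesRead and its identifier argument are assumed to match the existing PackagesApi conventions, and the owner/repo values are placeholders.

```java
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.apis.PackagesApi;
import io.cloudsmith.api.models.ModelPackage;

public class PackageFilepathExample {
    public static void main(String[] args) throws ApiException {
        PackagesApi api = new PackagesApi(); // assumes the default ApiClient already has auth configured

        // Placeholders: owner, repo and the package identifier/slug_perm.
        ModelPackage pkg = api.packagesRead("example-org", "example-repo", "package-identifier");

        // filepath is the full path including the filename (e.g. bin/utils/tool.tar.gz);
        // formats that do not record a path may leave it null.
        System.out.println("filename: " + pkg.getFilename());
        System.out.println("filepath: " + pkg.getFilepath());
    }
}
```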
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeam.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeam.java
index 837b3a28..e20137c4 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeam.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeam.java
@@ -102,10 +102,10 @@ public OrganizationTeam description(String description) {
}
/**
- * Get description
+ * A detailed description of the team.
* @return description
**/
- @Size(min=1,max=140) @ApiModelProperty(value = "")
+ @Size(max=200) @ApiModelProperty(value = "A detailed description of the team.")
public String getDescription() {
return description;
}
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequest.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequest.java
index f2d2dd5b..7280887a 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequest.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequest.java
@@ -99,10 +99,10 @@ public OrganizationTeamRequest description(String description) {
}
/**
- * Get description
+ * A detailed description of the team.
* @return description
**/
- @Size(min=1,max=140) @ApiModelProperty(value = "")
+ @Size(max=200) @ApiModelProperty(value = "A detailed description of the team.")
public String getDescription() {
return description;
}
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequestPatch.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequestPatch.java
index 62112924..281977f6 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequestPatch.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/OrganizationTeamRequestPatch.java
@@ -99,10 +99,10 @@ public OrganizationTeamRequestPatch description(String description) {
}
/**
- * Get description
+ * A detailed description of the team.
* @return description
**/
- @Size(min=1,max=140) @ApiModelProperty(value = "")
+ @Size(max=200) @ApiModelProperty(value = "A detailed description of the team.")
public String getDescription() {
return description;
}
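Across the team model, request and patch, the description constraint changes from 1–140 characters to an optional field capped at 200. A tiny sketch building a patch under the new limit, using the fluent description setter these models already expose; sending it through the teams endpoint is omitted, since the exact OrgsApi method is not part of this hunk.

```java
import io.cloudsmith.api.models.OrganizationTeamRequestPatch;

public class TeamDescriptionExample {
    public static void main(String[] args) {
        // Up to 200 characters is now allowed, and min=1 was dropped, so an empty
        // description no longer fails bean validation.
        OrganizationTeamRequestPatch patch = new OrganizationTeamRequestPatch()
                .description("Owns the build, release and signing tooling for the mobile apps.");
        System.out.println(patch);
    }
}
```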
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopy.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopy.java
index b162810e..3721801a 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopy.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopy.java
@@ -92,6 +92,9 @@ public class PackageCopy implements Serializable {
@SerializedName("filename")
private String filename = null;
+ @SerializedName("filepath")
+ private String filepath = null;
+
@SerializedName("files")
private List files = null;
@@ -520,6 +523,15 @@ public String getFilename() {
return filename;
}
+ /**
+ * Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ * @return filepath
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Full path to the file, including filename e.g. bin/utils/tool.tar.gz")
+ public String getFilepath() {
+ return filepath;
+ }
+
/**
* Get files
* @return files
@@ -1196,6 +1208,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.epoch, packageCopy.epoch) &&
Objects.equals(this.extension, packageCopy.extension) &&
Objects.equals(this.filename, packageCopy.filename) &&
+ Objects.equals(this.filepath, packageCopy.filepath) &&
Objects.equals(this.files, packageCopy.files) &&
Objects.equals(this.format, packageCopy.format) &&
Objects.equals(this.formatUrl, packageCopy.formatUrl) &&
@@ -1268,7 +1281,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, filepath, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
}
@@ -1293,6 +1306,7 @@ public String toString() {
sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
sb.append(" files: ").append(toIndentedString(files)).append("\n");
sb.append(" format: ").append(toIndentedString(format)).append("\n");
sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopyRequest.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopyRequest.java
index c764ccd4..978a9d53 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopyRequest.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageCopyRequest.java
@@ -46,11 +46,11 @@ public PackageCopyRequest destination(String destination) {
}
/**
- * Get destination
+ * The name of the destination repository without the namespace.
* @return destination
**/
@NotNull
- @Size(min=1) @ApiModelProperty(required = true, value = "")
+ @Size(min=1) @ApiModelProperty(required = true, value = "The name of the destination repository without the namespace.")
public String getDestination() {
return destination;
}
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMove.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMove.java
index ea99b75a..dbdcf678 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMove.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMove.java
@@ -92,6 +92,9 @@ public class PackageMove implements Serializable {
@SerializedName("filename")
private String filename = null;
+ @SerializedName("filepath")
+ private String filepath = null;
+
@SerializedName("files")
private List files = null;
@@ -520,6 +523,15 @@ public String getFilename() {
return filename;
}
+ /**
+ * Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ * @return filepath
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Full path to the file, including filename e.g. bin/utils/tool.tar.gz")
+ public String getFilepath() {
+ return filepath;
+ }
+
/**
* Get files
* @return files
@@ -1196,6 +1208,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.epoch, packageMove.epoch) &&
Objects.equals(this.extension, packageMove.extension) &&
Objects.equals(this.filename, packageMove.filename) &&
+ Objects.equals(this.filepath, packageMove.filepath) &&
Objects.equals(this.files, packageMove.files) &&
Objects.equals(this.format, packageMove.format) &&
Objects.equals(this.formatUrl, packageMove.formatUrl) &&
@@ -1268,7 +1281,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, filepath, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
}
@@ -1293,6 +1306,7 @@ public String toString() {
sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
sb.append(" files: ").append(toIndentedString(files)).append("\n");
sb.append(" format: ").append(toIndentedString(format)).append("\n");
sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMoveRequest.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMoveRequest.java
index b11b0aea..525d7c4f 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMoveRequest.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageMoveRequest.java
@@ -43,11 +43,11 @@ public PackageMoveRequest destination(String destination) {
}
/**
- * Get destination
+ * The name of the destination repository without the namespace.
* @return destination
**/
@NotNull
- @Size(min=1) @ApiModelProperty(required = true, value = "")
+ @Size(min=1) @ApiModelProperty(required = true, value = "The name of the destination repository without the namespace.")
public String getDestination() {
return destination;
}
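PackageCopyRequest and PackageMoveRequest now document that destination is the repository name only, without the namespace. A hedged sketch of both calls is shown below; the packagesCopy/packagesMove method names and signatures are assumed from the existing PackagesApi conventions, and all identifiers are placeholders.

```java
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.apis.PackagesApi;
import io.cloudsmith.api.models.PackageCopyRequest;
import io.cloudsmith.api.models.PackageMoveRequest;

public class CopyMoveDestinationExample {
    public static void main(String[] args) throws ApiException {
        PackagesApi api = new PackagesApi(); // assumes the default ApiClient already has auth configured

        // destination is just the repository name, e.g. "staging", not "example-org/staging".
        PackageCopyRequest copy = new PackageCopyRequest().destination("staging");
        api.packagesCopy("example-org", "example-repo", "package-identifier", copy);

        PackageMoveRequest move = new PackageMoveRequest().destination("production");
        api.packagesMove("example-org", "example-repo", "package-identifier", move);
    }
}
```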
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageQuarantine.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageQuarantine.java
index 2f0ac0ab..86112585 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageQuarantine.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageQuarantine.java
@@ -92,6 +92,9 @@ public class PackageQuarantine implements Serializable {
@SerializedName("filename")
private String filename = null;
+ @SerializedName("filepath")
+ private String filepath = null;
+
@SerializedName("files")
private List files = null;
@@ -517,6 +520,15 @@ public String getFilename() {
return filename;
}
+ /**
+ * Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ * @return filepath
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Full path to the file, including filename e.g. bin/utils/tool.tar.gz")
+ public String getFilepath() {
+ return filepath;
+ }
+
/**
* Get files
* @return files
@@ -1184,6 +1196,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.epoch, packageQuarantine.epoch) &&
Objects.equals(this.extension, packageQuarantine.extension) &&
Objects.equals(this.filename, packageQuarantine.filename) &&
+ Objects.equals(this.filepath, packageQuarantine.filepath) &&
Objects.equals(this.files, packageQuarantine.files) &&
Objects.equals(this.format, packageQuarantine.format) &&
Objects.equals(this.formatUrl, packageQuarantine.formatUrl) &&
@@ -1255,7 +1268,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, filepath, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
}
@@ -1280,6 +1293,7 @@ public String toString() {
sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
sb.append(" files: ").append(toIndentedString(files)).append("\n");
sb.append(" format: ").append(toIndentedString(format)).append("\n");
sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageResync.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageResync.java
index 1b2991d0..e9bb8264 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageResync.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageResync.java
@@ -92,6 +92,9 @@ public class PackageResync implements Serializable {
@SerializedName("filename")
private String filename = null;
+ @SerializedName("filepath")
+ private String filepath = null;
+
@SerializedName("files")
private List files = null;
@@ -520,6 +523,15 @@ public String getFilename() {
return filename;
}
+ /**
+ * Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ * @return filepath
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Full path to the file, including filename e.g. bin/utils/tool.tar.gz")
+ public String getFilepath() {
+ return filepath;
+ }
+
/**
* Get files
* @return files
@@ -1196,6 +1208,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.epoch, packageResync.epoch) &&
Objects.equals(this.extension, packageResync.extension) &&
Objects.equals(this.filename, packageResync.filename) &&
+ Objects.equals(this.filepath, packageResync.filepath) &&
Objects.equals(this.files, packageResync.files) &&
Objects.equals(this.format, packageResync.format) &&
Objects.equals(this.formatUrl, packageResync.formatUrl) &&
@@ -1268,7 +1281,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, filepath, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tags, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
}
@@ -1293,6 +1306,7 @@ public String toString() {
sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
sb.append(" files: ").append(toIndentedString(files)).append("\n");
sb.append(" format: ").append(toIndentedString(format)).append("\n");
sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
diff --git a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageTag.java b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageTag.java
index 77c83e48..9bc81d43 100644
--- a/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageTag.java
+++ b/bindings/java/src/src/main/java/io/cloudsmith/api/models/PackageTag.java
@@ -92,6 +92,9 @@ public class PackageTag implements Serializable {
@SerializedName("filename")
private String filename = null;
+ @SerializedName("filepath")
+ private String filepath = null;
+
@SerializedName("files")
private List files = null;
@@ -520,6 +523,15 @@ public String getFilename() {
return filename;
}
+ /**
+ * Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ * @return filepath
+ **/
+ @Size(min=1) @ApiModelProperty(value = "Full path to the file, including filename e.g. bin/utils/tool.tar.gz")
+ public String getFilepath() {
+ return filepath;
+ }
+
/**
* Get files
* @return files
@@ -1195,6 +1207,7 @@ public boolean equals(java.lang.Object o) {
Objects.equals(this.epoch, packageTag.epoch) &&
Objects.equals(this.extension, packageTag.extension) &&
Objects.equals(this.filename, packageTag.filename) &&
+ Objects.equals(this.filepath, packageTag.filepath) &&
Objects.equals(this.files, packageTag.files) &&
Objects.equals(this.format, packageTag.format) &&
Objects.equals(this.formatUrl, packageTag.formatUrl) &&
@@ -1267,7 +1280,7 @@ public boolean equals(java.lang.Object o) {
@Override
public int hashCode() {
- return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isImmutable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
+ return Objects.hash(architectures, cdnUrl, checksumMd5, checksumSha1, checksumSha256, checksumSha512, dependenciesChecksumMd5, dependenciesUrl, description, displayName, distro, distroVersion, downloads, epoch, extension, filename, filepath, files, format, formatUrl, freeableStorage, fullyQualifiedName, identifierPerm, identifiers, indexed, isCancellable, isCopyable, isDeleteable, isDownloadable, isImmutable, isMoveable, isQuarantinable, isQuarantined, isResyncable, isSecurityScannable, isSyncAwaiting, isSyncCompleted, isSyncFailed, isSyncInFlight, isSyncInProgress, license, name, namespace, namespaceUrl, numFiles, originRepository, originRepositoryUrl, packageType, policyViolated, rawLicense, release, repository, repositoryUrl, securityScanCompletedAt, securityScanStartedAt, securityScanStatus, securityScanStatusUpdatedAt, selfHtmlUrl, selfUrl, signatureUrl, size, slug, slugPerm, spdxLicense, stage, stageStr, stageUpdatedAt, status, statusReason, statusStr, statusUpdatedAt, statusUrl, subtype, summary, syncFinishedAt, syncProgress, tagsAutomatic, tagsImmutable, typeDisplay, uploadedAt, uploader, uploaderUrl, version, versionOrig, vulnerabilityScanResultsUrl);
}
@@ -1292,6 +1305,7 @@ public String toString() {
sb.append(" epoch: ").append(toIndentedString(epoch)).append("\n");
sb.append(" extension: ").append(toIndentedString(extension)).append("\n");
sb.append(" filename: ").append(toIndentedString(filename)).append("\n");
+ sb.append(" filepath: ").append(toIndentedString(filepath)).append("\n");
sb.append(" files: ").append(toIndentedString(files)).append("\n");
sb.append(" format: ").append(toIndentedString(format)).append("\n");
sb.append(" formatUrl: ").append(toIndentedString(formatUrl)).append("\n");
diff --git a/bindings/java/src/src/test/java/io/cloudsmith/api/apis/PackagesApiTest.java b/bindings/java/src/src/test/java/io/cloudsmith/api/apis/PackagesApiTest.java
index fe1d02c0..193f4c6c 100644
--- a/bindings/java/src/src/test/java/io/cloudsmith/api/apis/PackagesApiTest.java
+++ b/bindings/java/src/src/test/java/io/cloudsmith/api/apis/PackagesApiTest.java
@@ -34,6 +34,8 @@
import io.cloudsmith.api.models.DockerPackageUpload;
import io.cloudsmith.api.models.DockerPackageUploadRequest;
import io.cloudsmith.api.models.ErrorDetail;
+import io.cloudsmith.api.models.GenericPackageUpload;
+import io.cloudsmith.api.models.GenericPackageUploadRequest;
import io.cloudsmith.api.models.GoPackageUpload;
import io.cloudsmith.api.models.GoPackageUploadRequest;
import io.cloudsmith.api.models.HelmPackageUpload;
@@ -524,6 +526,24 @@ public void packagesUploadDockerTest() throws Exception {
// TODO: test validations
}
+ /**
+ * Create a new Generic package
+ *
+ * Create a new Generic package
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void packagesUploadGenericTest() throws Exception {
+ String owner = null;
+ String repo = null;
+ GenericPackageUploadRequest data = null;
+ GenericPackageUpload response = api.packagesUploadGeneric(owner, repo, data);
+
+ // TODO: test validations
+ }
+
/**
* Create a new Go package
*
@@ -992,6 +1012,24 @@ public void packagesValidateUploadDockerTest() throws Exception {
// TODO: test validations
}
+ /**
+ * Validate parameters for create Generic package
+ *
+ * Validate parameters for create Generic package
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void packagesValidateUploadGenericTest() throws Exception {
+ String owner = null;
+ String repo = null;
+ GenericPackageUploadRequest data = null;
+ api.packagesValidateUploadGeneric(owner, repo, data);
+
+ // TODO: test validations
+ }
+
/**
* Validate parameters for create Go package
*
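The new test stubs above exercise packagesUploadGeneric and packagesValidateUploadGeneric with null arguments only. Here is a sketch of a more realistic call; the GenericPackageUploadRequest field names (packageFile, name, version) are assumed to follow the other package upload request models, and the owner/repo/file identifier values are placeholders.

```java
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.apis.PackagesApi;
import io.cloudsmith.api.models.GenericPackageUpload;
import io.cloudsmith.api.models.GenericPackageUploadRequest;

public class UploadGenericExample {
    public static void main(String[] args) throws ApiException {
        PackagesApi api = new PackagesApi(); // assumes the default ApiClient already has auth configured

        // Assumed field names; packageFile references a file previously pushed to the
        // files upload endpoint, as with the other package formats.
        GenericPackageUploadRequest data = new GenericPackageUploadRequest()
                .packageFile("previously-uploaded-file-identifier")
                .name("my-tool")
                .version("1.0.0");

        // Validate the parameters first, then create the package (the two endpoints added here).
        api.packagesValidateUploadGeneric("example-org", "example-repo", data);
        GenericPackageUpload response = api.packagesUploadGeneric("example-org", "example-repo", data);
        System.out.println("Created package: " + response.getSlugPerm());
    }
}
```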
diff --git a/bindings/java/src/src/test/java/io/cloudsmith/api/apis/ReposApiTest.java b/bindings/java/src/src/test/java/io/cloudsmith/api/apis/ReposApiTest.java
index 0367c1fe..9a435d9c 100644
--- a/bindings/java/src/src/test/java/io/cloudsmith/api/apis/ReposApiTest.java
+++ b/bindings/java/src/src/test/java/io/cloudsmith/api/apis/ReposApiTest.java
@@ -35,6 +35,9 @@
import io.cloudsmith.api.models.DockerUpstreamRequest;
import io.cloudsmith.api.models.DockerUpstreamRequestPatch;
import io.cloudsmith.api.models.ErrorDetail;
+import io.cloudsmith.api.models.GenericUpstream;
+import io.cloudsmith.api.models.GenericUpstreamRequest;
+import io.cloudsmith.api.models.GenericUpstreamRequestPatch;
import io.cloudsmith.api.models.GoUpstream;
import io.cloudsmith.api.models.GoUpstreamRequest;
import io.cloudsmith.api.models.GoUpstreamRequestPatch;
@@ -1363,6 +1366,117 @@ public void reposUpstreamDockerUpdateTest() throws Exception {
// TODO: test validations
}
+ /**
+ * Create a Generic upstream config for this repository.
+ *
+ * Create a Generic upstream config for this repository.
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void reposUpstreamGenericCreateTest() throws Exception {
+ String owner = null;
+ String identifier = null;
+ GenericUpstreamRequest data = null;
+ GenericUpstream response = api.reposUpstreamGenericCreate(owner, identifier, data);
+
+ // TODO: test validations
+ }
+
+ /**
+ * Delete a Generic upstream config for this repository.
+ *
+ * Delete a Generic upstream config for this repository.
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void reposUpstreamGenericDeleteTest() throws Exception {
+ String owner = null;
+ String identifier = null;
+ String slugPerm = null;
+ api.reposUpstreamGenericDelete(owner, identifier, slugPerm);
+
+ // TODO: test validations
+ }
+
+ /**
+ * List Generic upstream configs for this repository.
+ *
+ * List Generic upstream configs for this repository.
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void reposUpstreamGenericListTest() throws Exception {
+ String owner = null;
+ String identifier = null;
+ java.math.BigInteger page = null;
+ java.math.BigInteger pageSize = null;
+        List<GenericUpstream> response = api.reposUpstreamGenericList(owner, identifier, page, pageSize);
+
+ // TODO: test validations
+ }
+
+ /**
+ * Partially update a Generic upstream config for this repository.
+ *
+ * Partially update a Generic upstream config for this repository.
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void reposUpstreamGenericPartialUpdateTest() throws Exception {
+ String owner = null;
+ String identifier = null;
+ String slugPerm = null;
+ GenericUpstreamRequestPatch data = null;
+ GenericUpstream response = api.reposUpstreamGenericPartialUpdate(owner, identifier, slugPerm, data);
+
+ // TODO: test validations
+ }
+
+ /**
+ * Retrieve a Generic upstream config for this repository.
+ *
+ * Retrieve a Generic upstream config for this repository.
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void reposUpstreamGenericReadTest() throws Exception {
+ String owner = null;
+ String identifier = null;
+ String slugPerm = null;
+ GenericUpstream response = api.reposUpstreamGenericRead(owner, identifier, slugPerm);
+
+ // TODO: test validations
+ }
+
+ /**
+ * Update a Generic upstream config for this repository.
+ *
+ * Update a Generic upstream config for this repository.
+ *
+ * @throws Exception
+ * if the Api call fails
+ */
+ @Test
+ public void reposUpstreamGenericUpdateTest() throws Exception {
+ String owner = null;
+ String identifier = null;
+ String slugPerm = null;
+ GenericUpstreamRequest data = null;
+ GenericUpstream response = api.reposUpstreamGenericUpdate(owner, identifier, slugPerm, data);
+
+ // TODO: test validations
+ }
+
/**
* Create a Go upstream config for this repository.
*
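The ReposApi stubs above cover the full CRUD surface for Generic upstream configs. A short sketch of the create call with concrete values follows; the GenericUpstreamRequest fields are assumed to mirror the other upstream request models (name, upstreamUrl), and the owner/repository identifiers are placeholders.

```java
import io.cloudsmith.api.ApiException;
import io.cloudsmith.api.apis.ReposApi;
import io.cloudsmith.api.models.GenericUpstream;
import io.cloudsmith.api.models.GenericUpstreamRequest;

public class GenericUpstreamCreateExample {
    public static void main(String[] args) throws ApiException {
        ReposApi api = new ReposApi(); // assumes the default ApiClient already has auth configured

        // Assumed fields, mirroring MavenUpstreamRequest and the other upstream request models.
        GenericUpstreamRequest data = new GenericUpstreamRequest()
                .name("internal-artifacts")
                .upstreamUrl("https://artifacts.example.com/generic/");

        GenericUpstream created = api.reposUpstreamGenericCreate("example-org", "example-repo", data);
        System.out.println("Upstream slug_perm: " + created.getSlugPerm());
    }
}
```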
diff --git a/bindings/python/src/README.md b/bindings/python/src/README.md
index fcd46b78..43992f8e 100644
--- a/bindings/python/src/README.md
+++ b/bindings/python/src/README.md
@@ -208,6 +208,7 @@ Class | Method | HTTP request | Description
*PackagesApi* | [**packages_upload_dart**](docs/PackagesApi.md#packages_upload_dart) | **POST** /packages/{owner}/{repo}/upload/dart/ | Create a new Dart package
*PackagesApi* | [**packages_upload_deb**](docs/PackagesApi.md#packages_upload_deb) | **POST** /packages/{owner}/{repo}/upload/deb/ | Create a new Debian package
*PackagesApi* | [**packages_upload_docker**](docs/PackagesApi.md#packages_upload_docker) | **POST** /packages/{owner}/{repo}/upload/docker/ | Create a new Docker package
+*PackagesApi* | [**packages_upload_generic**](docs/PackagesApi.md#packages_upload_generic) | **POST** /packages/{owner}/{repo}/upload/generic/ | Create a new Generic package
*PackagesApi* | [**packages_upload_go**](docs/PackagesApi.md#packages_upload_go) | **POST** /packages/{owner}/{repo}/upload/go/ | Create a new Go package
*PackagesApi* | [**packages_upload_helm**](docs/PackagesApi.md#packages_upload_helm) | **POST** /packages/{owner}/{repo}/upload/helm/ | Create a new Helm package
*PackagesApi* | [**packages_upload_hex**](docs/PackagesApi.md#packages_upload_hex) | **POST** /packages/{owner}/{repo}/upload/hex/ | Create a new Hex package
@@ -234,6 +235,7 @@ Class | Method | HTTP request | Description
*PackagesApi* | [**packages_validate_upload_dart**](docs/PackagesApi.md#packages_validate_upload_dart) | **POST** /packages/{owner}/{repo}/validate-upload/dart/ | Validate parameters for create Dart package
*PackagesApi* | [**packages_validate_upload_deb**](docs/PackagesApi.md#packages_validate_upload_deb) | **POST** /packages/{owner}/{repo}/validate-upload/deb/ | Validate parameters for create Debian package
*PackagesApi* | [**packages_validate_upload_docker**](docs/PackagesApi.md#packages_validate_upload_docker) | **POST** /packages/{owner}/{repo}/validate-upload/docker/ | Validate parameters for create Docker package
+*PackagesApi* | [**packages_validate_upload_generic**](docs/PackagesApi.md#packages_validate_upload_generic) | **POST** /packages/{owner}/{repo}/validate-upload/generic/ | Validate parameters for create Generic package
*PackagesApi* | [**packages_validate_upload_go**](docs/PackagesApi.md#packages_validate_upload_go) | **POST** /packages/{owner}/{repo}/validate-upload/go/ | Validate parameters for create Go package
*PackagesApi* | [**packages_validate_upload_helm**](docs/PackagesApi.md#packages_validate_upload_helm) | **POST** /packages/{owner}/{repo}/validate-upload/helm/ | Validate parameters for create Helm package
*PackagesApi* | [**packages_validate_upload_hex**](docs/PackagesApi.md#packages_validate_upload_hex) | **POST** /packages/{owner}/{repo}/validate-upload/hex/ | Validate parameters for create Hex package
@@ -324,6 +326,12 @@ Class | Method | HTTP request | Description
*ReposApi* | [**repos_upstream_docker_partial_update**](docs/ReposApi.md#repos_upstream_docker_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Partially update a Docker upstream config for this repository.
*ReposApi* | [**repos_upstream_docker_read**](docs/ReposApi.md#repos_upstream_docker_read) | **GET** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Retrieve a Docker upstream config for this repository.
*ReposApi* | [**repos_upstream_docker_update**](docs/ReposApi.md#repos_upstream_docker_update) | **PUT** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Update a Docker upstream config for this repository.
+*ReposApi* | [**repos_upstream_generic_create**](docs/ReposApi.md#repos_upstream_generic_create) | **POST** /repos/{owner}/{identifier}/upstream/generic/ | Create a Generic upstream config for this repository.
+*ReposApi* | [**repos_upstream_generic_delete**](docs/ReposApi.md#repos_upstream_generic_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Delete a Generic upstream config for this repository.
+*ReposApi* | [**repos_upstream_generic_list**](docs/ReposApi.md#repos_upstream_generic_list) | **GET** /repos/{owner}/{identifier}/upstream/generic/ | List Generic upstream configs for this repository.
+*ReposApi* | [**repos_upstream_generic_partial_update**](docs/ReposApi.md#repos_upstream_generic_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Partially update a Generic upstream config for this repository.
+*ReposApi* | [**repos_upstream_generic_read**](docs/ReposApi.md#repos_upstream_generic_read) | **GET** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Retrieve a Generic upstream config for this repository.
+*ReposApi* | [**repos_upstream_generic_update**](docs/ReposApi.md#repos_upstream_generic_update) | **PUT** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Update a Generic upstream config for this repository.
*ReposApi* | [**repos_upstream_go_create**](docs/ReposApi.md#repos_upstream_go_create) | **POST** /repos/{owner}/{identifier}/upstream/go/ | Create a Go upstream config for this repository.
*ReposApi* | [**repos_upstream_go_delete**](docs/ReposApi.md#repos_upstream_go_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/go/{slug_perm}/ | Delete a Go upstream config for this repository.
*ReposApi* | [**repos_upstream_go_list**](docs/ReposApi.md#repos_upstream_go_list) | **GET** /repos/{owner}/{identifier}/upstream/go/ | List Go upstream configs for this repository.
@@ -476,6 +484,11 @@ Class | Method | HTTP request | Description
- [Format](docs/Format.md)
- [FormatSupport](docs/FormatSupport.md)
- [FormatSupportUpstream](docs/FormatSupportUpstream.md)
+ - [GenericPackageUpload](docs/GenericPackageUpload.md)
+ - [GenericPackageUploadRequest](docs/GenericPackageUploadRequest.md)
+ - [GenericUpstream](docs/GenericUpstream.md)
+ - [GenericUpstreamRequest](docs/GenericUpstreamRequest.md)
+ - [GenericUpstreamRequestPatch](docs/GenericUpstreamRequestPatch.md)
- [GeoIpLocation](docs/GeoIpLocation.md)
- [GoPackageUpload](docs/GoPackageUpload.md)
- [GoPackageUploadRequest](docs/GoPackageUploadRequest.md)
diff --git a/bindings/python/src/cloudsmith_api/__init__.py b/bindings/python/src/cloudsmith_api/__init__.py
index bb841ad2..19b975fd 100644
--- a/bindings/python/src/cloudsmith_api/__init__.py
+++ b/bindings/python/src/cloudsmith_api/__init__.py
@@ -102,6 +102,11 @@
from cloudsmith_api.models.format import Format
from cloudsmith_api.models.format_support import FormatSupport
from cloudsmith_api.models.format_support_upstream import FormatSupportUpstream
+from cloudsmith_api.models.generic_package_upload import GenericPackageUpload
+from cloudsmith_api.models.generic_package_upload_request import GenericPackageUploadRequest
+from cloudsmith_api.models.generic_upstream import GenericUpstream
+from cloudsmith_api.models.generic_upstream_request import GenericUpstreamRequest
+from cloudsmith_api.models.generic_upstream_request_patch import GenericUpstreamRequestPatch
from cloudsmith_api.models.geo_ip_location import GeoIpLocation
from cloudsmith_api.models.go_package_upload import GoPackageUpload
from cloudsmith_api.models.go_package_upload_request import GoPackageUploadRequest
diff --git a/bindings/python/src/cloudsmith_api/api/packages_api.py b/bindings/python/src/cloudsmith_api/api/packages_api.py
index f5efce8f..8c91bc5d 100644
--- a/bindings/python/src/cloudsmith_api/api/packages_api.py
+++ b/bindings/python/src/cloudsmith_api/api/packages_api.py
@@ -2682,6 +2682,117 @@ def packages_upload_docker_with_http_info(self, owner, repo, **kwargs): # noqa:
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
+ def packages_upload_generic(self, owner, repo, **kwargs): # noqa: E501
+ """Create a new Generic package # noqa: E501
+
+ Create a new Generic package # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.packages_upload_generic(owner, repo, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str repo: (required)
+ :param GenericPackageUploadRequest data:
+ :return: GenericPackageUpload
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.packages_upload_generic_with_http_info(owner, repo, **kwargs) # noqa: E501
+ else:
+ (data) = self.packages_upload_generic_with_http_info(owner, repo, **kwargs) # noqa: E501
+ return data
+
+ def packages_upload_generic_with_http_info(self, owner, repo, **kwargs): # noqa: E501
+ """Create a new Generic package # noqa: E501
+
+ Create a new Generic package # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.packages_upload_generic_with_http_info(owner, repo, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str repo: (required)
+ :param GenericPackageUploadRequest data:
+ :return: GenericPackageUpload
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'repo', 'data'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method packages_upload_generic" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `packages_upload_generic`") # noqa: E501
+ # verify the required parameter 'repo' is set
+ if self.api_client.client_side_validation and ('repo' not in params or
+ params['repo'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `repo` when calling `packages_upload_generic`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'repo' in params:
+ path_params['repo'] = params['repo'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ if 'data' in params:
+ body_params = params['data']
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/packages/{owner}/{repo}/upload/generic/', 'POST',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type='GenericPackageUpload', # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
def packages_upload_go(self, owner, repo, **kwargs): # noqa: E501
"""Create a new Go package # noqa: E501
@@ -5568,6 +5679,117 @@ def packages_validate_upload_docker_with_http_info(self, owner, repo, **kwargs):
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
+ def packages_validate_upload_generic(self, owner, repo, **kwargs): # noqa: E501
+ """Validate parameters for create Generic package # noqa: E501
+
+ Validate parameters for create Generic package # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.packages_validate_upload_generic(owner, repo, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str repo: (required)
+ :param GenericPackageUploadRequest data:
+ :return: None
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.packages_validate_upload_generic_with_http_info(owner, repo, **kwargs) # noqa: E501
+ else:
+ (data) = self.packages_validate_upload_generic_with_http_info(owner, repo, **kwargs) # noqa: E501
+ return data
+
+ def packages_validate_upload_generic_with_http_info(self, owner, repo, **kwargs): # noqa: E501
+ """Validate parameters for create Generic package # noqa: E501
+
+ Validate parameters for create Generic package # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.packages_validate_upload_generic_with_http_info(owner, repo, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str repo: (required)
+ :param GenericPackageUploadRequest data:
+ :return: None
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'repo', 'data'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method packages_validate_upload_generic" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `packages_validate_upload_generic`") # noqa: E501
+ # verify the required parameter 'repo' is set
+ if self.api_client.client_side_validation and ('repo' not in params or
+ params['repo'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `repo` when calling `packages_validate_upload_generic`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'repo' in params:
+ path_params['repo'] = params['repo'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ if 'data' in params:
+ body_params = params['data']
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/packages/{owner}/{repo}/validate-upload/generic/', 'POST',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type=None, # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
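# --- Usage sketch (editorial note, not part of the generated patch) ----------
# packages_validate_upload_generic takes the same GenericPackageUploadRequest
# body as packages_upload_generic and returns None on success, so it can serve
# as a dry run; an ApiException is raised when the parameters are rejected.
# Request field names are assumptions, as in the upload sketch above.
import cloudsmith_api
from cloudsmith_api.rest import ApiException

packages_api = cloudsmith_api.PackagesApi()  # uses the default client Configuration
request = cloudsmith_api.GenericPackageUploadRequest(
    package_file='FILE_IDENTIFIER', name='my-artifact', version='1.0.0')  # assumed fields
try:
    packages_api.packages_validate_upload_generic('OWNER', 'REPO', data=request)
except ApiException as exc:
    print('Parameters rejected: %s' % exc.body)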
def packages_validate_upload_go(self, owner, repo, **kwargs): # noqa: E501
"""Validate parameters for create Go package # noqa: E501
diff --git a/bindings/python/src/cloudsmith_api/api/repos_api.py b/bindings/python/src/cloudsmith_api/api/repos_api.py
index 5a2f2e24..067ba27e 100644
--- a/bindings/python/src/cloudsmith_api/api/repos_api.py
+++ b/bindings/python/src/cloudsmith_api/api/repos_api.py
@@ -7836,6 +7836,700 @@ def repos_upstream_docker_update_with_http_info(self, owner, identifier, slug_pe
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
+ def repos_upstream_generic_create(self, owner, identifier, **kwargs): # noqa: E501
+ """Create a Generic upstream config for this repository. # noqa: E501
+
+ Create a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_create(owner, identifier, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param GenericUpstreamRequest data:
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.repos_upstream_generic_create_with_http_info(owner, identifier, **kwargs) # noqa: E501
+ else:
+ (data) = self.repos_upstream_generic_create_with_http_info(owner, identifier, **kwargs) # noqa: E501
+ return data
+
+ def repos_upstream_generic_create_with_http_info(self, owner, identifier, **kwargs): # noqa: E501
+ """Create a Generic upstream config for this repository. # noqa: E501
+
+ Create a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_create_with_http_info(owner, identifier, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param GenericUpstreamRequest data:
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'identifier', 'data'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method repos_upstream_generic_create" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `repos_upstream_generic_create`") # noqa: E501
+ # verify the required parameter 'identifier' is set
+ if self.api_client.client_side_validation and ('identifier' not in params or
+ params['identifier'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `identifier` when calling `repos_upstream_generic_create`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'identifier' in params:
+ path_params['identifier'] = params['identifier'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ if 'data' in params:
+ body_params = params['data']
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/repos/{owner}/{identifier}/upstream/generic/', 'POST',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type='GenericUpstream', # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
+ def repos_upstream_generic_delete(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Delete a Generic upstream config for this repository. # noqa: E501
+
+ Delete a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_delete(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :return: None
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.repos_upstream_generic_delete_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ else:
+ (data) = self.repos_upstream_generic_delete_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ return data
+
+ def repos_upstream_generic_delete_with_http_info(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Delete a Generic upstream config for this repository. # noqa: E501
+
+ Delete a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_delete_with_http_info(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :return: None
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'identifier', 'slug_perm'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method repos_upstream_generic_delete" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `repos_upstream_generic_delete`") # noqa: E501
+ # verify the required parameter 'identifier' is set
+ if self.api_client.client_side_validation and ('identifier' not in params or
+ params['identifier'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `identifier` when calling `repos_upstream_generic_delete`") # noqa: E501
+ # verify the required parameter 'slug_perm' is set
+ if self.api_client.client_side_validation and ('slug_perm' not in params or
+ params['slug_perm'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `slug_perm` when calling `repos_upstream_generic_delete`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'identifier' in params:
+ path_params['identifier'] = params['identifier'] # noqa: E501
+ if 'slug_perm' in params:
+ path_params['slug_perm'] = params['slug_perm'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/', 'DELETE',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type=None, # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
+ def repos_upstream_generic_list(self, owner, identifier, **kwargs): # noqa: E501
+ """List Generic upstream configs for this repository. # noqa: E501
+
+ List Generic upstream configs for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_list(owner, identifier, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param int page: A page number within the paginated result set.
+ :param int page_size: Number of results to return per page.
+ :return: list[GenericUpstream]
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.repos_upstream_generic_list_with_http_info(owner, identifier, **kwargs) # noqa: E501
+ else:
+ (data) = self.repos_upstream_generic_list_with_http_info(owner, identifier, **kwargs) # noqa: E501
+ return data
+
+ def repos_upstream_generic_list_with_http_info(self, owner, identifier, **kwargs): # noqa: E501
+ """List Generic upstream configs for this repository. # noqa: E501
+
+ List Generic upstream configs for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_list_with_http_info(owner, identifier, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param int page: A page number within the paginated result set.
+ :param int page_size: Number of results to return per page.
+ :return: list[GenericUpstream]
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'identifier', 'page', 'page_size'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method repos_upstream_generic_list" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `repos_upstream_generic_list`") # noqa: E501
+ # verify the required parameter 'identifier' is set
+ if self.api_client.client_side_validation and ('identifier' not in params or
+ params['identifier'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `identifier` when calling `repos_upstream_generic_list`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'identifier' in params:
+ path_params['identifier'] = params['identifier'] # noqa: E501
+
+ query_params = []
+ if 'page' in params:
+ query_params.append(('page', params['page'])) # noqa: E501
+ if 'page_size' in params:
+ query_params.append(('page_size', params['page_size'])) # noqa: E501
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/repos/{owner}/{identifier}/upstream/generic/', 'GET',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type='list[GenericUpstream]', # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
+ def repos_upstream_generic_partial_update(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Partially update a Generic upstream config for this repository. # noqa: E501
+
+ Partially update a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_partial_update(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :param GenericUpstreamRequestPatch data:
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.repos_upstream_generic_partial_update_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ else:
+ (data) = self.repos_upstream_generic_partial_update_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ return data
+
+ def repos_upstream_generic_partial_update_with_http_info(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Partially update a Generic upstream config for this repository. # noqa: E501
+
+ Partially update a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_partial_update_with_http_info(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :param GenericUpstreamRequestPatch data:
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'identifier', 'slug_perm', 'data'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method repos_upstream_generic_partial_update" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `repos_upstream_generic_partial_update`") # noqa: E501
+ # verify the required parameter 'identifier' is set
+ if self.api_client.client_side_validation and ('identifier' not in params or
+ params['identifier'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `identifier` when calling `repos_upstream_generic_partial_update`") # noqa: E501
+ # verify the required parameter 'slug_perm' is set
+ if self.api_client.client_side_validation and ('slug_perm' not in params or
+ params['slug_perm'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `slug_perm` when calling `repos_upstream_generic_partial_update`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'identifier' in params:
+ path_params['identifier'] = params['identifier'] # noqa: E501
+ if 'slug_perm' in params:
+ path_params['slug_perm'] = params['slug_perm'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ if 'data' in params:
+ body_params = params['data']
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/', 'PATCH',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type='GenericUpstream', # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
+ def repos_upstream_generic_read(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Retrieve a Generic upstream config for this repository. # noqa: E501
+
+ Retrieve a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_read(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.repos_upstream_generic_read_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ else:
+ (data) = self.repos_upstream_generic_read_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ return data
+
+ def repos_upstream_generic_read_with_http_info(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Retrieve a Generic upstream config for this repository. # noqa: E501
+
+ Retrieve a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_read_with_http_info(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'identifier', 'slug_perm'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method repos_upstream_generic_read" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `repos_upstream_generic_read`") # noqa: E501
+ # verify the required parameter 'identifier' is set
+ if self.api_client.client_side_validation and ('identifier' not in params or
+ params['identifier'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `identifier` when calling `repos_upstream_generic_read`") # noqa: E501
+ # verify the required parameter 'slug_perm' is set
+ if self.api_client.client_side_validation and ('slug_perm' not in params or
+ params['slug_perm'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `slug_perm` when calling `repos_upstream_generic_read`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'identifier' in params:
+ path_params['identifier'] = params['identifier'] # noqa: E501
+ if 'slug_perm' in params:
+ path_params['slug_perm'] = params['slug_perm'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/', 'GET',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type='GenericUpstream', # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
+ def repos_upstream_generic_update(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Update a Generic upstream config for this repository. # noqa: E501
+
+ Update a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_update(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :param GenericUpstreamRequest data:
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+ kwargs['_return_http_data_only'] = True
+ if kwargs.get('async_req'):
+ return self.repos_upstream_generic_update_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ else:
+ (data) = self.repos_upstream_generic_update_with_http_info(owner, identifier, slug_perm, **kwargs) # noqa: E501
+ return data
+
+ def repos_upstream_generic_update_with_http_info(self, owner, identifier, slug_perm, **kwargs): # noqa: E501
+ """Update a Generic upstream config for this repository. # noqa: E501
+
+ Update a Generic upstream config for this repository. # noqa: E501
+ This method makes a synchronous HTTP request by default. To make an
+ asynchronous HTTP request, please pass async_req=True
+ >>> thread = api.repos_upstream_generic_update_with_http_info(owner, identifier, slug_perm, async_req=True)
+ >>> result = thread.get()
+
+ :param async_req bool
+ :param str owner: (required)
+ :param str identifier: (required)
+ :param str slug_perm: (required)
+ :param GenericUpstreamRequest data:
+ :return: GenericUpstream
+ If the method is called asynchronously,
+ returns the request thread.
+ """
+
+ all_params = ['owner', 'identifier', 'slug_perm', 'data'] # noqa: E501
+ all_params.append('async_req')
+ all_params.append('_return_http_data_only')
+ all_params.append('_preload_content')
+ all_params.append('_request_timeout')
+
+ params = locals()
+ for key, val in six.iteritems(params['kwargs']):
+ if key not in all_params:
+ raise TypeError(
+ "Got an unexpected keyword argument '%s'"
+ " to method repos_upstream_generic_update" % key
+ )
+ params[key] = val
+ del params['kwargs']
+ # verify the required parameter 'owner' is set
+ if self.api_client.client_side_validation and ('owner' not in params or
+ params['owner'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `owner` when calling `repos_upstream_generic_update`") # noqa: E501
+ # verify the required parameter 'identifier' is set
+ if self.api_client.client_side_validation and ('identifier' not in params or
+ params['identifier'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `identifier` when calling `repos_upstream_generic_update`") # noqa: E501
+ # verify the required parameter 'slug_perm' is set
+ if self.api_client.client_side_validation and ('slug_perm' not in params or
+ params['slug_perm'] is None): # noqa: E501
+ raise ValueError("Missing the required parameter `slug_perm` when calling `repos_upstream_generic_update`") # noqa: E501
+
+ collection_formats = {}
+
+ path_params = {}
+ if 'owner' in params:
+ path_params['owner'] = params['owner'] # noqa: E501
+ if 'identifier' in params:
+ path_params['identifier'] = params['identifier'] # noqa: E501
+ if 'slug_perm' in params:
+ path_params['slug_perm'] = params['slug_perm'] # noqa: E501
+
+ query_params = []
+
+ header_params = {}
+
+ form_params = []
+ local_var_files = {}
+
+ body_params = None
+ if 'data' in params:
+ body_params = params['data']
+ # HTTP header `Accept`
+ header_params['Accept'] = self.api_client.select_header_accept(
+ ['application/json']) # noqa: E501
+
+ # HTTP header `Content-Type`
+ header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
+ ['application/json']) # noqa: E501
+
+ # Authentication setting
+ auth_settings = ['apikey', 'basic'] # noqa: E501
+
+ return self.api_client.call_api(
+ '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/', 'PUT',
+ path_params,
+ query_params,
+ header_params,
+ body=body_params,
+ post_params=form_params,
+ files=local_var_files,
+ response_type='GenericUpstream', # noqa: E501
+ auth_settings=auth_settings,
+ async_req=params.get('async_req'),
+ _return_http_data_only=params.get('_return_http_data_only'),
+ _preload_content=params.get('_preload_content', True),
+ _request_timeout=params.get('_request_timeout'),
+ collection_formats=collection_formats)
+
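# --- Usage sketch (editorial note, not part of the generated patch) ----------
# Basic lifecycle for the new Generic upstream endpoints on ReposApi: create,
# list, partially update and delete. The GenericUpstreamRequest /
# GenericUpstreamRequestPatch field names (name, upstream_url) and the
# slug_perm attribute on the returned config are assumptions; consult the
# generated models for the authoritative fields.
import cloudsmith_api

config = cloudsmith_api.Configuration()
config.api_key['X-Api-Key'] = 'YOUR_API_KEY'
repos_api = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(config))

upstream = repos_api.repos_upstream_generic_create(
    'OWNER', 'REPO',
    data=cloudsmith_api.GenericUpstreamRequest(
        name='example-mirror',                 # assumed field name
        upstream_url='https://example.com/'))  # assumed field name

# List existing Generic upstream configs (paginated).
for cfg in repos_api.repos_upstream_generic_list('OWNER', 'REPO', page=1, page_size=50):
    print(cfg)

# Patch and then remove the config created above, addressed by slug_perm.
repos_api.repos_upstream_generic_partial_update(
    'OWNER', 'REPO', upstream.slug_perm,
    data=cloudsmith_api.GenericUpstreamRequestPatch(name='renamed-mirror'))  # assumed field
repos_api.repos_upstream_generic_delete('OWNER', 'REPO', upstream.slug_perm)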
def repos_upstream_go_create(self, owner, identifier, **kwargs): # noqa: E501
"""Create a Go upstream config for this repository. # noqa: E501
diff --git a/bindings/python/src/cloudsmith_api/models/__init__.py b/bindings/python/src/cloudsmith_api/models/__init__.py
index 275dbfa0..19dd0996 100644
--- a/bindings/python/src/cloudsmith_api/models/__init__.py
+++ b/bindings/python/src/cloudsmith_api/models/__init__.py
@@ -76,6 +76,11 @@
from cloudsmith_api.models.format import Format
from cloudsmith_api.models.format_support import FormatSupport
from cloudsmith_api.models.format_support_upstream import FormatSupportUpstream
+from cloudsmith_api.models.generic_package_upload import GenericPackageUpload
+from cloudsmith_api.models.generic_package_upload_request import GenericPackageUploadRequest
+from cloudsmith_api.models.generic_upstream import GenericUpstream
+from cloudsmith_api.models.generic_upstream_request import GenericUpstreamRequest
+from cloudsmith_api.models.generic_upstream_request_patch import GenericUpstreamRequestPatch
from cloudsmith_api.models.geo_ip_location import GeoIpLocation
from cloudsmith_api.models.go_package_upload import GoPackageUpload
from cloudsmith_api.models.go_package_upload_request import GoPackageUploadRequest
diff --git a/bindings/python/src/cloudsmith_api/models/format_support.py b/bindings/python/src/cloudsmith_api/models/format_support.py
index f0e95678..122ed5d3 100644
--- a/bindings/python/src/cloudsmith_api/models/format_support.py
+++ b/bindings/python/src/cloudsmith_api/models/format_support.py
@@ -36,6 +36,7 @@ class FormatSupport(object):
'dependencies': 'bool',
'distributions': 'bool',
'file_lists': 'bool',
+ 'filepaths': 'bool',
'metadata': 'bool',
'upstreams': 'FormatSupportUpstream',
'versioning': 'bool'
@@ -45,12 +46,13 @@ class FormatSupport(object):
'dependencies': 'dependencies',
'distributions': 'distributions',
'file_lists': 'file_lists',
+ 'filepaths': 'filepaths',
'metadata': 'metadata',
'upstreams': 'upstreams',
'versioning': 'versioning'
}
- def __init__(self, dependencies=None, distributions=None, file_lists=None, metadata=None, upstreams=None, versioning=None, _configuration=None): # noqa: E501
+ def __init__(self, dependencies=None, distributions=None, file_lists=None, filepaths=None, metadata=None, upstreams=None, versioning=None, _configuration=None): # noqa: E501
"""FormatSupport - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -59,6 +61,7 @@ def __init__(self, dependencies=None, distributions=None, file_lists=None, metad
self._dependencies = None
self._distributions = None
self._file_lists = None
+ self._filepaths = None
self._metadata = None
self._upstreams = None
self._versioning = None
@@ -67,6 +70,7 @@ def __init__(self, dependencies=None, distributions=None, file_lists=None, metad
self.dependencies = dependencies
self.distributions = distributions
self.file_lists = file_lists
+ self.filepaths = filepaths
self.metadata = metadata
self.upstreams = upstreams
self.versioning = versioning
@@ -146,6 +150,31 @@ def file_lists(self, file_lists):
self._file_lists = file_lists
+ @property
+ def filepaths(self):
+ """Gets the filepaths of this FormatSupport.
+
+ If true, the package format supports filepaths.
+
+ :return: The filepaths of this FormatSupport.
+ :rtype: bool
+ """
+ return self._filepaths
+
+ @filepaths.setter
+ def filepaths(self, filepaths):
+ """Sets the filepaths of this FormatSupport.
+
+ If true, the package format supports filepaths.
+
+ :param filepaths: The filepaths of this FormatSupport.
+ :type: bool
+ """
+ if self._configuration.client_side_validation and filepaths is None:
+ raise ValueError("Invalid value for `filepaths`, must not be `None`") # noqa: E501
+
+ self._filepaths = filepaths
+
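# --- Usage sketch (editorial note, not part of the generated patch) ----------
# Reading the new `filepaths` capability flag. Assumption: Format objects
# returned by FormatsApi expose their FormatSupport instance under a `supports`
# attribute and carry a `slug`, as for the existing capability flags.
import cloudsmith_api

config = cloudsmith_api.Configuration()
config.api_key['X-Api-Key'] = 'YOUR_API_KEY'
formats_api = cloudsmith_api.FormatsApi(cloudsmith_api.ApiClient(config))
for fmt in formats_api.formats_list():
    print(fmt.slug, fmt.supports.filepaths)  # `slug` / `supports` are assumed attribute names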
@property
def metadata(self):
"""Gets the metadata of this FormatSupport.
diff --git a/bindings/python/src/cloudsmith_api/models/generic_package_upload.py b/bindings/python/src/cloudsmith_api/models/generic_package_upload.py
new file mode 100644
index 00000000..ae8530de
--- /dev/null
+++ b/bindings/python/src/cloudsmith_api/models/generic_package_upload.py
@@ -0,0 +1,2370 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+import pprint
+import re # noqa: F401
+
+import six
+
+from cloudsmith_api.configuration import Configuration
+
+
+class GenericPackageUpload(object):
+ """NOTE: This class is auto generated by the swagger code generator program.
+
+ Do not edit the class manually.
+ """
+
+ """
+ Attributes:
+ swagger_types (dict): The key is attribute name
+ and the value is attribute type.
+ attribute_map (dict): The key is attribute name
+ and the value is json key in definition.
+ """
+ swagger_types = {
+ 'architectures': 'list[Architecture]',
+ 'cdn_url': 'str',
+ 'checksum_md5': 'str',
+ 'checksum_sha1': 'str',
+ 'checksum_sha256': 'str',
+ 'checksum_sha512': 'str',
+ 'dependencies_checksum_md5': 'str',
+ 'dependencies_url': 'str',
+ 'description': 'str',
+ 'display_name': 'str',
+ 'distro': 'Distribution',
+ 'distro_version': 'DistributionVersion',
+ 'downloads': 'int',
+ 'epoch': 'int',
+ 'extension': 'str',
+ 'filename': 'str',
+ 'files': 'list[PackageFile]',
+ 'format': 'str',
+ 'format_url': 'str',
+ 'freeable_storage': 'int',
+ 'fully_qualified_name': 'str',
+ 'identifier_perm': 'str',
+ 'identifiers': 'dict(str, str)',
+ 'indexed': 'bool',
+ 'is_cancellable': 'bool',
+ 'is_copyable': 'bool',
+ 'is_deleteable': 'bool',
+ 'is_downloadable': 'bool',
+ 'is_moveable': 'bool',
+ 'is_quarantinable': 'bool',
+ 'is_quarantined': 'bool',
+ 'is_resyncable': 'bool',
+ 'is_security_scannable': 'bool',
+ 'is_sync_awaiting': 'bool',
+ 'is_sync_completed': 'bool',
+ 'is_sync_failed': 'bool',
+ 'is_sync_in_flight': 'bool',
+ 'is_sync_in_progress': 'bool',
+ 'license': 'str',
+ 'name': 'str',
+ 'namespace': 'str',
+ 'namespace_url': 'str',
+ 'num_files': 'int',
+ 'origin_repository': 'str',
+ 'origin_repository_url': 'str',
+ 'package_type': 'int',
+ 'policy_violated': 'bool',
+ 'raw_license': 'str',
+ 'release': 'str',
+ 'repository': 'str',
+ 'repository_url': 'str',
+ 'security_scan_completed_at': 'datetime',
+ 'security_scan_started_at': 'datetime',
+ 'security_scan_status': 'str',
+ 'security_scan_status_updated_at': 'datetime',
+ 'self_html_url': 'str',
+ 'self_url': 'str',
+ 'signature_url': 'str',
+ 'size': 'int',
+ 'slug': 'str',
+ 'slug_perm': 'str',
+ 'spdx_license': 'str',
+ 'stage': 'int',
+ 'stage_str': 'str',
+ 'stage_updated_at': 'datetime',
+ 'status': 'int',
+ 'status_reason': 'str',
+ 'status_str': 'str',
+ 'status_updated_at': 'datetime',
+ 'status_url': 'str',
+ 'subtype': 'str',
+ 'summary': 'str',
+ 'sync_finished_at': 'datetime',
+ 'sync_progress': 'int',
+ 'tags_automatic': 'Tags',
+ 'tags_immutable': 'Tags',
+ 'type_display': 'str',
+ 'uploaded_at': 'datetime',
+ 'uploader': 'str',
+ 'uploader_url': 'str',
+ 'version': 'str',
+ 'version_orig': 'str',
+ 'vulnerability_scan_results_url': 'str'
+ }
+
+ attribute_map = {
+ 'architectures': 'architectures',
+ 'cdn_url': 'cdn_url',
+ 'checksum_md5': 'checksum_md5',
+ 'checksum_sha1': 'checksum_sha1',
+ 'checksum_sha256': 'checksum_sha256',
+ 'checksum_sha512': 'checksum_sha512',
+ 'dependencies_checksum_md5': 'dependencies_checksum_md5',
+ 'dependencies_url': 'dependencies_url',
+ 'description': 'description',
+ 'display_name': 'display_name',
+ 'distro': 'distro',
+ 'distro_version': 'distro_version',
+ 'downloads': 'downloads',
+ 'epoch': 'epoch',
+ 'extension': 'extension',
+ 'filename': 'filename',
+ 'files': 'files',
+ 'format': 'format',
+ 'format_url': 'format_url',
+ 'freeable_storage': 'freeable_storage',
+ 'fully_qualified_name': 'fully_qualified_name',
+ 'identifier_perm': 'identifier_perm',
+ 'identifiers': 'identifiers',
+ 'indexed': 'indexed',
+ 'is_cancellable': 'is_cancellable',
+ 'is_copyable': 'is_copyable',
+ 'is_deleteable': 'is_deleteable',
+ 'is_downloadable': 'is_downloadable',
+ 'is_moveable': 'is_moveable',
+ 'is_quarantinable': 'is_quarantinable',
+ 'is_quarantined': 'is_quarantined',
+ 'is_resyncable': 'is_resyncable',
+ 'is_security_scannable': 'is_security_scannable',
+ 'is_sync_awaiting': 'is_sync_awaiting',
+ 'is_sync_completed': 'is_sync_completed',
+ 'is_sync_failed': 'is_sync_failed',
+ 'is_sync_in_flight': 'is_sync_in_flight',
+ 'is_sync_in_progress': 'is_sync_in_progress',
+ 'license': 'license',
+ 'name': 'name',
+ 'namespace': 'namespace',
+ 'namespace_url': 'namespace_url',
+ 'num_files': 'num_files',
+ 'origin_repository': 'origin_repository',
+ 'origin_repository_url': 'origin_repository_url',
+ 'package_type': 'package_type',
+ 'policy_violated': 'policy_violated',
+ 'raw_license': 'raw_license',
+ 'release': 'release',
+ 'repository': 'repository',
+ 'repository_url': 'repository_url',
+ 'security_scan_completed_at': 'security_scan_completed_at',
+ 'security_scan_started_at': 'security_scan_started_at',
+ 'security_scan_status': 'security_scan_status',
+ 'security_scan_status_updated_at': 'security_scan_status_updated_at',
+ 'self_html_url': 'self_html_url',
+ 'self_url': 'self_url',
+ 'signature_url': 'signature_url',
+ 'size': 'size',
+ 'slug': 'slug',
+ 'slug_perm': 'slug_perm',
+ 'spdx_license': 'spdx_license',
+ 'stage': 'stage',
+ 'stage_str': 'stage_str',
+ 'stage_updated_at': 'stage_updated_at',
+ 'status': 'status',
+ 'status_reason': 'status_reason',
+ 'status_str': 'status_str',
+ 'status_updated_at': 'status_updated_at',
+ 'status_url': 'status_url',
+ 'subtype': 'subtype',
+ 'summary': 'summary',
+ 'sync_finished_at': 'sync_finished_at',
+ 'sync_progress': 'sync_progress',
+ 'tags_automatic': 'tags_automatic',
+ 'tags_immutable': 'tags_immutable',
+ 'type_display': 'type_display',
+ 'uploaded_at': 'uploaded_at',
+ 'uploader': 'uploader',
+ 'uploader_url': 'uploader_url',
+ 'version': 'version',
+ 'version_orig': 'version_orig',
+ 'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
+ }
+
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ """GenericPackageUpload - a model defined in Swagger""" # noqa: E501
+ if _configuration is None:
+ _configuration = Configuration()
+ self._configuration = _configuration
+
+ self._architectures = None
+ self._cdn_url = None
+ self._checksum_md5 = None
+ self._checksum_sha1 = None
+ self._checksum_sha256 = None
+ self._checksum_sha512 = None
+ self._dependencies_checksum_md5 = None
+ self._dependencies_url = None
+ self._description = None
+ self._display_name = None
+ self._distro = None
+ self._distro_version = None
+ self._downloads = None
+ self._epoch = None
+ self._extension = None
+ self._filename = None
+ self._files = None
+ self._format = None
+ self._format_url = None
+ self._freeable_storage = None
+ self._fully_qualified_name = None
+ self._identifier_perm = None
+ self._identifiers = None
+ self._indexed = None
+ self._is_cancellable = None
+ self._is_copyable = None
+ self._is_deleteable = None
+ self._is_downloadable = None
+ self._is_moveable = None
+ self._is_quarantinable = None
+ self._is_quarantined = None
+ self._is_resyncable = None
+ self._is_security_scannable = None
+ self._is_sync_awaiting = None
+ self._is_sync_completed = None
+ self._is_sync_failed = None
+ self._is_sync_in_flight = None
+ self._is_sync_in_progress = None
+ self._license = None
+ self._name = None
+ self._namespace = None
+ self._namespace_url = None
+ self._num_files = None
+ self._origin_repository = None
+ self._origin_repository_url = None
+ self._package_type = None
+ self._policy_violated = None
+ self._raw_license = None
+ self._release = None
+ self._repository = None
+ self._repository_url = None
+ self._security_scan_completed_at = None
+ self._security_scan_started_at = None
+ self._security_scan_status = None
+ self._security_scan_status_updated_at = None
+ self._self_html_url = None
+ self._self_url = None
+ self._signature_url = None
+ self._size = None
+ self._slug = None
+ self._slug_perm = None
+ self._spdx_license = None
+ self._stage = None
+ self._stage_str = None
+ self._stage_updated_at = None
+ self._status = None
+ self._status_reason = None
+ self._status_str = None
+ self._status_updated_at = None
+ self._status_url = None
+ self._subtype = None
+ self._summary = None
+ self._sync_finished_at = None
+ self._sync_progress = None
+ self._tags_automatic = None
+ self._tags_immutable = None
+ self._type_display = None
+ self._uploaded_at = None
+ self._uploader = None
+ self._uploader_url = None
+ self._version = None
+ self._version_orig = None
+ self._vulnerability_scan_results_url = None
+ self.discriminator = None
+
+ if architectures is not None:
+ self.architectures = architectures
+ if cdn_url is not None:
+ self.cdn_url = cdn_url
+ if checksum_md5 is not None:
+ self.checksum_md5 = checksum_md5
+ if checksum_sha1 is not None:
+ self.checksum_sha1 = checksum_sha1
+ if checksum_sha256 is not None:
+ self.checksum_sha256 = checksum_sha256
+ if checksum_sha512 is not None:
+ self.checksum_sha512 = checksum_sha512
+ if dependencies_checksum_md5 is not None:
+ self.dependencies_checksum_md5 = dependencies_checksum_md5
+ if dependencies_url is not None:
+ self.dependencies_url = dependencies_url
+ if description is not None:
+ self.description = description
+ if display_name is not None:
+ self.display_name = display_name
+ if distro is not None:
+ self.distro = distro
+ if distro_version is not None:
+ self.distro_version = distro_version
+ if downloads is not None:
+ self.downloads = downloads
+ if epoch is not None:
+ self.epoch = epoch
+ if extension is not None:
+ self.extension = extension
+ if filename is not None:
+ self.filename = filename
+ if files is not None:
+ self.files = files
+ if format is not None:
+ self.format = format
+ if format_url is not None:
+ self.format_url = format_url
+ if freeable_storage is not None:
+ self.freeable_storage = freeable_storage
+ if fully_qualified_name is not None:
+ self.fully_qualified_name = fully_qualified_name
+ if identifier_perm is not None:
+ self.identifier_perm = identifier_perm
+ if identifiers is not None:
+ self.identifiers = identifiers
+ if indexed is not None:
+ self.indexed = indexed
+ if is_cancellable is not None:
+ self.is_cancellable = is_cancellable
+ if is_copyable is not None:
+ self.is_copyable = is_copyable
+ if is_deleteable is not None:
+ self.is_deleteable = is_deleteable
+ if is_downloadable is not None:
+ self.is_downloadable = is_downloadable
+ if is_moveable is not None:
+ self.is_moveable = is_moveable
+ if is_quarantinable is not None:
+ self.is_quarantinable = is_quarantinable
+ if is_quarantined is not None:
+ self.is_quarantined = is_quarantined
+ if is_resyncable is not None:
+ self.is_resyncable = is_resyncable
+ if is_security_scannable is not None:
+ self.is_security_scannable = is_security_scannable
+ if is_sync_awaiting is not None:
+ self.is_sync_awaiting = is_sync_awaiting
+ if is_sync_completed is not None:
+ self.is_sync_completed = is_sync_completed
+ if is_sync_failed is not None:
+ self.is_sync_failed = is_sync_failed
+ if is_sync_in_flight is not None:
+ self.is_sync_in_flight = is_sync_in_flight
+ if is_sync_in_progress is not None:
+ self.is_sync_in_progress = is_sync_in_progress
+ if license is not None:
+ self.license = license
+ if name is not None:
+ self.name = name
+ if namespace is not None:
+ self.namespace = namespace
+ if namespace_url is not None:
+ self.namespace_url = namespace_url
+ if num_files is not None:
+ self.num_files = num_files
+ if origin_repository is not None:
+ self.origin_repository = origin_repository
+ if origin_repository_url is not None:
+ self.origin_repository_url = origin_repository_url
+ if package_type is not None:
+ self.package_type = package_type
+ if policy_violated is not None:
+ self.policy_violated = policy_violated
+ if raw_license is not None:
+ self.raw_license = raw_license
+ if release is not None:
+ self.release = release
+ if repository is not None:
+ self.repository = repository
+ if repository_url is not None:
+ self.repository_url = repository_url
+ if security_scan_completed_at is not None:
+ self.security_scan_completed_at = security_scan_completed_at
+ if security_scan_started_at is not None:
+ self.security_scan_started_at = security_scan_started_at
+ if security_scan_status is not None:
+ self.security_scan_status = security_scan_status
+ if security_scan_status_updated_at is not None:
+ self.security_scan_status_updated_at = security_scan_status_updated_at
+ if self_html_url is not None:
+ self.self_html_url = self_html_url
+ if self_url is not None:
+ self.self_url = self_url
+ if signature_url is not None:
+ self.signature_url = signature_url
+ if size is not None:
+ self.size = size
+ if slug is not None:
+ self.slug = slug
+ if slug_perm is not None:
+ self.slug_perm = slug_perm
+ if spdx_license is not None:
+ self.spdx_license = spdx_license
+ if stage is not None:
+ self.stage = stage
+ if stage_str is not None:
+ self.stage_str = stage_str
+ if stage_updated_at is not None:
+ self.stage_updated_at = stage_updated_at
+ if status is not None:
+ self.status = status
+ if status_reason is not None:
+ self.status_reason = status_reason
+ if status_str is not None:
+ self.status_str = status_str
+ if status_updated_at is not None:
+ self.status_updated_at = status_updated_at
+ if status_url is not None:
+ self.status_url = status_url
+ if subtype is not None:
+ self.subtype = subtype
+ if summary is not None:
+ self.summary = summary
+ if sync_finished_at is not None:
+ self.sync_finished_at = sync_finished_at
+ if sync_progress is not None:
+ self.sync_progress = sync_progress
+ if tags_automatic is not None:
+ self.tags_automatic = tags_automatic
+ if tags_immutable is not None:
+ self.tags_immutable = tags_immutable
+ if type_display is not None:
+ self.type_display = type_display
+ if uploaded_at is not None:
+ self.uploaded_at = uploaded_at
+ if uploader is not None:
+ self.uploader = uploader
+ if uploader_url is not None:
+ self.uploader_url = uploader_url
+ if version is not None:
+ self.version = version
+ if version_orig is not None:
+ self.version_orig = version_orig
+ if vulnerability_scan_results_url is not None:
+ self.vulnerability_scan_results_url = vulnerability_scan_results_url
+
+ @property
+ def architectures(self):
+ """Gets the architectures of this GenericPackageUpload.
+
+
+ :return: The architectures of this GenericPackageUpload.
+ :rtype: list[Architecture]
+ """
+ return self._architectures
+
+ @architectures.setter
+ def architectures(self, architectures):
+ """Sets the architectures of this GenericPackageUpload.
+
+
+ :param architectures: The architectures of this GenericPackageUpload.
+ :type: list[Architecture]
+ """
+
+ self._architectures = architectures
+
+ @property
+ def cdn_url(self):
+ """Gets the cdn_url of this GenericPackageUpload.
+
+
+ :return: The cdn_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._cdn_url
+
+ @cdn_url.setter
+ def cdn_url(self, cdn_url):
+ """Sets the cdn_url of this GenericPackageUpload.
+
+
+ :param cdn_url: The cdn_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._cdn_url = cdn_url
+
+ @property
+ def checksum_md5(self):
+ """Gets the checksum_md5 of this GenericPackageUpload.
+
+
+ :return: The checksum_md5 of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._checksum_md5
+
+ @checksum_md5.setter
+ def checksum_md5(self, checksum_md5):
+ """Sets the checksum_md5 of this GenericPackageUpload.
+
+
+ :param checksum_md5: The checksum_md5 of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._checksum_md5 = checksum_md5
+
+ @property
+ def checksum_sha1(self):
+ """Gets the checksum_sha1 of this GenericPackageUpload.
+
+
+ :return: The checksum_sha1 of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._checksum_sha1
+
+ @checksum_sha1.setter
+ def checksum_sha1(self, checksum_sha1):
+ """Sets the checksum_sha1 of this GenericPackageUpload.
+
+
+ :param checksum_sha1: The checksum_sha1 of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._checksum_sha1 = checksum_sha1
+
+ @property
+ def checksum_sha256(self):
+ """Gets the checksum_sha256 of this GenericPackageUpload.
+
+
+ :return: The checksum_sha256 of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._checksum_sha256
+
+ @checksum_sha256.setter
+ def checksum_sha256(self, checksum_sha256):
+ """Sets the checksum_sha256 of this GenericPackageUpload.
+
+
+ :param checksum_sha256: The checksum_sha256 of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._checksum_sha256 = checksum_sha256
+
+ @property
+ def checksum_sha512(self):
+ """Gets the checksum_sha512 of this GenericPackageUpload.
+
+
+ :return: The checksum_sha512 of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._checksum_sha512
+
+ @checksum_sha512.setter
+ def checksum_sha512(self, checksum_sha512):
+ """Sets the checksum_sha512 of this GenericPackageUpload.
+
+
+ :param checksum_sha512: The checksum_sha512 of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._checksum_sha512 = checksum_sha512
+
+ @property
+ def dependencies_checksum_md5(self):
+ """Gets the dependencies_checksum_md5 of this GenericPackageUpload.
+
+ A checksum of all of the package's dependencies.
+
+ :return: The dependencies_checksum_md5 of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._dependencies_checksum_md5
+
+ @dependencies_checksum_md5.setter
+ def dependencies_checksum_md5(self, dependencies_checksum_md5):
+ """Sets the dependencies_checksum_md5 of this GenericPackageUpload.
+
+ A checksum of all of the package's dependencies.
+
+ :param dependencies_checksum_md5: The dependencies_checksum_md5 of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._dependencies_checksum_md5 = dependencies_checksum_md5
+
+ @property
+ def dependencies_url(self):
+ """Gets the dependencies_url of this GenericPackageUpload.
+
+
+ :return: The dependencies_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._dependencies_url
+
+ @dependencies_url.setter
+ def dependencies_url(self, dependencies_url):
+ """Sets the dependencies_url of this GenericPackageUpload.
+
+
+ :param dependencies_url: The dependencies_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._dependencies_url = dependencies_url
+
+ @property
+ def description(self):
+ """Gets the description of this GenericPackageUpload.
+
+ A textual description of this package.
+
+ :return: The description of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._description
+
+ @description.setter
+ def description(self, description):
+ """Sets the description of this GenericPackageUpload.
+
+ A textual description of this package.
+
+ :param description: The description of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._description = description
+
+ @property
+ def display_name(self):
+ """Gets the display_name of this GenericPackageUpload.
+
+
+ :return: The display_name of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._display_name
+
+ @display_name.setter
+ def display_name(self, display_name):
+ """Sets the display_name of this GenericPackageUpload.
+
+
+ :param display_name: The display_name of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._display_name = display_name
+
+ @property
+ def distro(self):
+ """Gets the distro of this GenericPackageUpload.
+
+
+ :return: The distro of this GenericPackageUpload.
+ :rtype: Distribution
+ """
+ return self._distro
+
+ @distro.setter
+ def distro(self, distro):
+ """Sets the distro of this GenericPackageUpload.
+
+
+ :param distro: The distro of this GenericPackageUpload.
+ :type: Distribution
+ """
+
+ self._distro = distro
+
+ @property
+ def distro_version(self):
+ """Gets the distro_version of this GenericPackageUpload.
+
+
+ :return: The distro_version of this GenericPackageUpload.
+ :rtype: DistributionVersion
+ """
+ return self._distro_version
+
+ @distro_version.setter
+ def distro_version(self, distro_version):
+ """Sets the distro_version of this GenericPackageUpload.
+
+
+ :param distro_version: The distro_version of this GenericPackageUpload.
+ :type: DistributionVersion
+ """
+
+ self._distro_version = distro_version
+
+ @property
+ def downloads(self):
+ """Gets the downloads of this GenericPackageUpload.
+
+
+ :return: The downloads of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._downloads
+
+ @downloads.setter
+ def downloads(self, downloads):
+ """Sets the downloads of this GenericPackageUpload.
+
+
+ :param downloads: The downloads of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._downloads = downloads
+
+ @property
+ def epoch(self):
+ """Gets the epoch of this GenericPackageUpload.
+
+ The epoch of the package version (if any).
+
+ :return: The epoch of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._epoch
+
+ @epoch.setter
+ def epoch(self, epoch):
+ """Sets the epoch of this GenericPackageUpload.
+
+ The epoch of the package version (if any).
+
+ :param epoch: The epoch of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._epoch = epoch
+
+ @property
+ def extension(self):
+ """Gets the extension of this GenericPackageUpload.
+
+
+ :return: The extension of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._extension
+
+ @extension.setter
+ def extension(self, extension):
+ """Sets the extension of this GenericPackageUpload.
+
+
+ :param extension: The extension of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._extension = extension
+
+ @property
+ def filename(self):
+ """Gets the filename of this GenericPackageUpload.
+
+
+ :return: The filename of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._filename
+
+ @filename.setter
+ def filename(self, filename):
+ """Sets the filename of this GenericPackageUpload.
+
+
+ :param filename: The filename of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filename is not None and len(filename) < 1):
+ raise ValueError("Invalid value for `filename`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filename = filename
+
+ @property
+ def files(self):
+ """Gets the files of this GenericPackageUpload.
+
+
+ :return: The files of this GenericPackageUpload.
+ :rtype: list[PackageFile]
+ """
+ return self._files
+
+ @files.setter
+ def files(self, files):
+ """Sets the files of this GenericPackageUpload.
+
+
+ :param files: The files of this GenericPackageUpload.
+ :type: list[PackageFile]
+ """
+
+ self._files = files
+
+ @property
+ def format(self):
+ """Gets the format of this GenericPackageUpload.
+
+
+ :return: The format of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._format
+
+ @format.setter
+ def format(self, format):
+ """Sets the format of this GenericPackageUpload.
+
+
+ :param format: The format of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ format is not None and len(format) < 1):
+ raise ValueError("Invalid value for `format`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._format = format
+
+ @property
+ def format_url(self):
+ """Gets the format_url of this GenericPackageUpload.
+
+
+ :return: The format_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._format_url
+
+ @format_url.setter
+ def format_url(self, format_url):
+ """Sets the format_url of this GenericPackageUpload.
+
+
+ :param format_url: The format_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._format_url = format_url
+
+ @property
+ def freeable_storage(self):
+ """Gets the freeable_storage of this GenericPackageUpload.
+
+        Amount of storage that will be freed if this package is deleted.
+
+ :return: The freeable_storage of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._freeable_storage
+
+ @freeable_storage.setter
+ def freeable_storage(self, freeable_storage):
+ """Sets the freeable_storage of this GenericPackageUpload.
+
+        Amount of storage that will be freed if this package is deleted.
+
+ :param freeable_storage: The freeable_storage of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._freeable_storage = freeable_storage
+
+ @property
+ def fully_qualified_name(self):
+ """Gets the fully_qualified_name of this GenericPackageUpload.
+
+
+ :return: The fully_qualified_name of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._fully_qualified_name
+
+ @fully_qualified_name.setter
+ def fully_qualified_name(self, fully_qualified_name):
+ """Sets the fully_qualified_name of this GenericPackageUpload.
+
+
+ :param fully_qualified_name: The fully_qualified_name of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ fully_qualified_name is not None and len(fully_qualified_name) < 1):
+ raise ValueError("Invalid value for `fully_qualified_name`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._fully_qualified_name = fully_qualified_name
+
+ @property
+ def identifier_perm(self):
+ """Gets the identifier_perm of this GenericPackageUpload.
+
+ Unique and permanent identifier for the package.
+
+ :return: The identifier_perm of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._identifier_perm
+
+ @identifier_perm.setter
+ def identifier_perm(self, identifier_perm):
+ """Sets the identifier_perm of this GenericPackageUpload.
+
+ Unique and permanent identifier for the package.
+
+ :param identifier_perm: The identifier_perm of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ identifier_perm is not None and len(identifier_perm) < 1):
+ raise ValueError("Invalid value for `identifier_perm`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._identifier_perm = identifier_perm
+
+ @property
+ def identifiers(self):
+ """Gets the identifiers of this GenericPackageUpload.
+
+ Return a map of identifier field names and their values.
+
+ :return: The identifiers of this GenericPackageUpload.
+ :rtype: dict(str, str)
+ """
+ return self._identifiers
+
+ @identifiers.setter
+ def identifiers(self, identifiers):
+ """Sets the identifiers of this GenericPackageUpload.
+
+ Return a map of identifier field names and their values.
+
+ :param identifiers: The identifiers of this GenericPackageUpload.
+ :type: dict(str, str)
+ """
+
+ self._identifiers = identifiers
+
+ @property
+ def indexed(self):
+ """Gets the indexed of this GenericPackageUpload.
+
+
+ :return: The indexed of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._indexed
+
+ @indexed.setter
+ def indexed(self, indexed):
+ """Sets the indexed of this GenericPackageUpload.
+
+
+ :param indexed: The indexed of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._indexed = indexed
+
+ @property
+ def is_cancellable(self):
+ """Gets the is_cancellable of this GenericPackageUpload.
+
+
+ :return: The is_cancellable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_cancellable
+
+ @is_cancellable.setter
+ def is_cancellable(self, is_cancellable):
+ """Sets the is_cancellable of this GenericPackageUpload.
+
+
+ :param is_cancellable: The is_cancellable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_cancellable = is_cancellable
+
+ @property
+ def is_copyable(self):
+ """Gets the is_copyable of this GenericPackageUpload.
+
+
+ :return: The is_copyable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_copyable
+
+ @is_copyable.setter
+ def is_copyable(self, is_copyable):
+ """Sets the is_copyable of this GenericPackageUpload.
+
+
+ :param is_copyable: The is_copyable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_copyable = is_copyable
+
+ @property
+ def is_deleteable(self):
+ """Gets the is_deleteable of this GenericPackageUpload.
+
+
+ :return: The is_deleteable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_deleteable
+
+ @is_deleteable.setter
+ def is_deleteable(self, is_deleteable):
+ """Sets the is_deleteable of this GenericPackageUpload.
+
+
+ :param is_deleteable: The is_deleteable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_deleteable = is_deleteable
+
+ @property
+ def is_downloadable(self):
+ """Gets the is_downloadable of this GenericPackageUpload.
+
+
+ :return: The is_downloadable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_downloadable
+
+ @is_downloadable.setter
+ def is_downloadable(self, is_downloadable):
+ """Sets the is_downloadable of this GenericPackageUpload.
+
+
+ :param is_downloadable: The is_downloadable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_downloadable = is_downloadable
+
+ @property
+ def is_moveable(self):
+ """Gets the is_moveable of this GenericPackageUpload.
+
+
+ :return: The is_moveable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_moveable
+
+ @is_moveable.setter
+ def is_moveable(self, is_moveable):
+ """Sets the is_moveable of this GenericPackageUpload.
+
+
+ :param is_moveable: The is_moveable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_moveable = is_moveable
+
+ @property
+ def is_quarantinable(self):
+ """Gets the is_quarantinable of this GenericPackageUpload.
+
+
+ :return: The is_quarantinable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_quarantinable
+
+ @is_quarantinable.setter
+ def is_quarantinable(self, is_quarantinable):
+ """Sets the is_quarantinable of this GenericPackageUpload.
+
+
+ :param is_quarantinable: The is_quarantinable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_quarantinable = is_quarantinable
+
+ @property
+ def is_quarantined(self):
+ """Gets the is_quarantined of this GenericPackageUpload.
+
+
+ :return: The is_quarantined of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_quarantined
+
+ @is_quarantined.setter
+ def is_quarantined(self, is_quarantined):
+ """Sets the is_quarantined of this GenericPackageUpload.
+
+
+ :param is_quarantined: The is_quarantined of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_quarantined = is_quarantined
+
+ @property
+ def is_resyncable(self):
+ """Gets the is_resyncable of this GenericPackageUpload.
+
+
+ :return: The is_resyncable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_resyncable
+
+ @is_resyncable.setter
+ def is_resyncable(self, is_resyncable):
+ """Sets the is_resyncable of this GenericPackageUpload.
+
+
+ :param is_resyncable: The is_resyncable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_resyncable = is_resyncable
+
+ @property
+ def is_security_scannable(self):
+ """Gets the is_security_scannable of this GenericPackageUpload.
+
+
+ :return: The is_security_scannable of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_security_scannable
+
+ @is_security_scannable.setter
+ def is_security_scannable(self, is_security_scannable):
+ """Sets the is_security_scannable of this GenericPackageUpload.
+
+
+ :param is_security_scannable: The is_security_scannable of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_security_scannable = is_security_scannable
+
+ @property
+ def is_sync_awaiting(self):
+ """Gets the is_sync_awaiting of this GenericPackageUpload.
+
+
+ :return: The is_sync_awaiting of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_sync_awaiting
+
+ @is_sync_awaiting.setter
+ def is_sync_awaiting(self, is_sync_awaiting):
+ """Sets the is_sync_awaiting of this GenericPackageUpload.
+
+
+ :param is_sync_awaiting: The is_sync_awaiting of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_sync_awaiting = is_sync_awaiting
+
+ @property
+ def is_sync_completed(self):
+ """Gets the is_sync_completed of this GenericPackageUpload.
+
+
+ :return: The is_sync_completed of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_sync_completed
+
+ @is_sync_completed.setter
+ def is_sync_completed(self, is_sync_completed):
+ """Sets the is_sync_completed of this GenericPackageUpload.
+
+
+ :param is_sync_completed: The is_sync_completed of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_sync_completed = is_sync_completed
+
+ @property
+ def is_sync_failed(self):
+ """Gets the is_sync_failed of this GenericPackageUpload.
+
+
+ :return: The is_sync_failed of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_sync_failed
+
+ @is_sync_failed.setter
+ def is_sync_failed(self, is_sync_failed):
+ """Sets the is_sync_failed of this GenericPackageUpload.
+
+
+ :param is_sync_failed: The is_sync_failed of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_sync_failed = is_sync_failed
+
+ @property
+ def is_sync_in_flight(self):
+ """Gets the is_sync_in_flight of this GenericPackageUpload.
+
+
+ :return: The is_sync_in_flight of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_sync_in_flight
+
+ @is_sync_in_flight.setter
+ def is_sync_in_flight(self, is_sync_in_flight):
+ """Sets the is_sync_in_flight of this GenericPackageUpload.
+
+
+ :param is_sync_in_flight: The is_sync_in_flight of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_sync_in_flight = is_sync_in_flight
+
+ @property
+ def is_sync_in_progress(self):
+ """Gets the is_sync_in_progress of this GenericPackageUpload.
+
+
+ :return: The is_sync_in_progress of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._is_sync_in_progress
+
+ @is_sync_in_progress.setter
+ def is_sync_in_progress(self, is_sync_in_progress):
+ """Sets the is_sync_in_progress of this GenericPackageUpload.
+
+
+ :param is_sync_in_progress: The is_sync_in_progress of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._is_sync_in_progress = is_sync_in_progress
+
+ @property
+ def license(self):
+ """Gets the license of this GenericPackageUpload.
+
+ The license of this package.
+
+ :return: The license of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._license
+
+ @license.setter
+ def license(self, license):
+ """Sets the license of this GenericPackageUpload.
+
+ The license of this package.
+
+ :param license: The license of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._license = license
+
+ @property
+ def name(self):
+ """Gets the name of this GenericPackageUpload.
+
+ The name of this package.
+
+ :return: The name of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._name
+
+ @name.setter
+ def name(self, name):
+ """Sets the name of this GenericPackageUpload.
+
+ The name of this package.
+
+ :param name: The name of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) > 200):
+ raise ValueError("Invalid value for `name`, length must be less than or equal to `200`") # noqa: E501
+
+ self._name = name
+
+ @property
+ def namespace(self):
+ """Gets the namespace of this GenericPackageUpload.
+
+
+ :return: The namespace of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._namespace
+
+ @namespace.setter
+ def namespace(self, namespace):
+ """Sets the namespace of this GenericPackageUpload.
+
+
+ :param namespace: The namespace of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ namespace is not None and len(namespace) < 1):
+ raise ValueError("Invalid value for `namespace`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._namespace = namespace
+
+ @property
+ def namespace_url(self):
+ """Gets the namespace_url of this GenericPackageUpload.
+
+
+ :return: The namespace_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._namespace_url
+
+ @namespace_url.setter
+ def namespace_url(self, namespace_url):
+ """Sets the namespace_url of this GenericPackageUpload.
+
+
+ :param namespace_url: The namespace_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._namespace_url = namespace_url
+
+ @property
+ def num_files(self):
+ """Gets the num_files of this GenericPackageUpload.
+
+
+ :return: The num_files of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._num_files
+
+ @num_files.setter
+ def num_files(self, num_files):
+ """Sets the num_files of this GenericPackageUpload.
+
+
+ :param num_files: The num_files of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._num_files = num_files
+
+ @property
+ def origin_repository(self):
+ """Gets the origin_repository of this GenericPackageUpload.
+
+
+ :return: The origin_repository of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._origin_repository
+
+ @origin_repository.setter
+ def origin_repository(self, origin_repository):
+ """Sets the origin_repository of this GenericPackageUpload.
+
+
+ :param origin_repository: The origin_repository of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ origin_repository is not None and len(origin_repository) < 1):
+ raise ValueError("Invalid value for `origin_repository`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._origin_repository = origin_repository
+
+ @property
+ def origin_repository_url(self):
+ """Gets the origin_repository_url of this GenericPackageUpload.
+
+
+ :return: The origin_repository_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._origin_repository_url
+
+ @origin_repository_url.setter
+ def origin_repository_url(self, origin_repository_url):
+ """Sets the origin_repository_url of this GenericPackageUpload.
+
+
+ :param origin_repository_url: The origin_repository_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._origin_repository_url = origin_repository_url
+
+ @property
+ def package_type(self):
+ """Gets the package_type of this GenericPackageUpload.
+
+ The type of package contents.
+
+ :return: The package_type of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._package_type
+
+ @package_type.setter
+ def package_type(self, package_type):
+ """Sets the package_type of this GenericPackageUpload.
+
+ The type of package contents.
+
+ :param package_type: The package_type of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._package_type = package_type
+
+ @property
+ def policy_violated(self):
+ """Gets the policy_violated of this GenericPackageUpload.
+
+ Whether or not the package has violated any policy.
+
+ :return: The policy_violated of this GenericPackageUpload.
+ :rtype: bool
+ """
+ return self._policy_violated
+
+ @policy_violated.setter
+ def policy_violated(self, policy_violated):
+ """Sets the policy_violated of this GenericPackageUpload.
+
+ Whether or not the package has violated any policy.
+
+ :param policy_violated: The policy_violated of this GenericPackageUpload.
+ :type: bool
+ """
+
+ self._policy_violated = policy_violated
+
+ @property
+ def raw_license(self):
+ """Gets the raw_license of this GenericPackageUpload.
+
+ The raw license string.
+
+ :return: The raw_license of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._raw_license
+
+ @raw_license.setter
+ def raw_license(self, raw_license):
+ """Sets the raw_license of this GenericPackageUpload.
+
+ The raw license string.
+
+ :param raw_license: The raw_license of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ raw_license is not None and len(raw_license) < 1):
+ raise ValueError("Invalid value for `raw_license`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._raw_license = raw_license
+
+ @property
+ def release(self):
+ """Gets the release of this GenericPackageUpload.
+
+ The release of the package version (if any).
+
+ :return: The release of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._release
+
+ @release.setter
+ def release(self, release):
+ """Sets the release of this GenericPackageUpload.
+
+ The release of the package version (if any).
+
+ :param release: The release of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._release = release
+
+ @property
+ def repository(self):
+ """Gets the repository of this GenericPackageUpload.
+
+
+ :return: The repository of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._repository
+
+ @repository.setter
+ def repository(self, repository):
+ """Sets the repository of this GenericPackageUpload.
+
+
+ :param repository: The repository of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ repository is not None and len(repository) < 1):
+ raise ValueError("Invalid value for `repository`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._repository = repository
+
+ @property
+ def repository_url(self):
+ """Gets the repository_url of this GenericPackageUpload.
+
+
+ :return: The repository_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._repository_url
+
+ @repository_url.setter
+ def repository_url(self, repository_url):
+ """Sets the repository_url of this GenericPackageUpload.
+
+
+ :param repository_url: The repository_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._repository_url = repository_url
+
+ @property
+ def security_scan_completed_at(self):
+ """Gets the security_scan_completed_at of this GenericPackageUpload.
+
+ The datetime the security scanning was completed.
+
+ :return: The security_scan_completed_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._security_scan_completed_at
+
+ @security_scan_completed_at.setter
+ def security_scan_completed_at(self, security_scan_completed_at):
+ """Sets the security_scan_completed_at of this GenericPackageUpload.
+
+ The datetime the security scanning was completed.
+
+ :param security_scan_completed_at: The security_scan_completed_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._security_scan_completed_at = security_scan_completed_at
+
+ @property
+ def security_scan_started_at(self):
+ """Gets the security_scan_started_at of this GenericPackageUpload.
+
+ The datetime the security scanning was started.
+
+ :return: The security_scan_started_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._security_scan_started_at
+
+ @security_scan_started_at.setter
+ def security_scan_started_at(self, security_scan_started_at):
+ """Sets the security_scan_started_at of this GenericPackageUpload.
+
+ The datetime the security scanning was started.
+
+ :param security_scan_started_at: The security_scan_started_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._security_scan_started_at = security_scan_started_at
+
+ @property
+ def security_scan_status(self):
+ """Gets the security_scan_status of this GenericPackageUpload.
+
+
+ :return: The security_scan_status of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._security_scan_status
+
+ @security_scan_status.setter
+ def security_scan_status(self, security_scan_status):
+ """Sets the security_scan_status of this GenericPackageUpload.
+
+
+ :param security_scan_status: The security_scan_status of this GenericPackageUpload.
+ :type: str
+ """
+ allowed_values = ["Awaiting Security Scan", "Security Scanning in Progress", "Scan Detected Vulnerabilities", "Scan Detected No Vulnerabilities", "Security Scanning Disabled", "Security Scanning Failed", "Security Scanning Skipped", "Security Scanning Not Supported"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ security_scan_status not in allowed_values):
+ raise ValueError(
+ "Invalid value for `security_scan_status` ({0}), must be one of {1}" # noqa: E501
+ .format(security_scan_status, allowed_values)
+ )
+
+ self._security_scan_status = security_scan_status
+
+ @property
+ def security_scan_status_updated_at(self):
+ """Gets the security_scan_status_updated_at of this GenericPackageUpload.
+
+ The datetime the security scanning status was updated.
+
+ :return: The security_scan_status_updated_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._security_scan_status_updated_at
+
+ @security_scan_status_updated_at.setter
+ def security_scan_status_updated_at(self, security_scan_status_updated_at):
+ """Sets the security_scan_status_updated_at of this GenericPackageUpload.
+
+ The datetime the security scanning status was updated.
+
+ :param security_scan_status_updated_at: The security_scan_status_updated_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._security_scan_status_updated_at = security_scan_status_updated_at
+
+ @property
+ def self_html_url(self):
+ """Gets the self_html_url of this GenericPackageUpload.
+
+
+ :return: The self_html_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._self_html_url
+
+ @self_html_url.setter
+ def self_html_url(self, self_html_url):
+ """Sets the self_html_url of this GenericPackageUpload.
+
+
+ :param self_html_url: The self_html_url of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ self_html_url is not None and len(self_html_url) < 1):
+ raise ValueError("Invalid value for `self_html_url`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._self_html_url = self_html_url
+
+ @property
+ def self_url(self):
+ """Gets the self_url of this GenericPackageUpload.
+
+
+ :return: The self_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._self_url
+
+ @self_url.setter
+ def self_url(self, self_url):
+ """Sets the self_url of this GenericPackageUpload.
+
+
+ :param self_url: The self_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._self_url = self_url
+
+ @property
+ def signature_url(self):
+ """Gets the signature_url of this GenericPackageUpload.
+
+
+ :return: The signature_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._signature_url
+
+ @signature_url.setter
+ def signature_url(self, signature_url):
+ """Sets the signature_url of this GenericPackageUpload.
+
+
+ :param signature_url: The signature_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._signature_url = signature_url
+
+ @property
+ def size(self):
+ """Gets the size of this GenericPackageUpload.
+
+ The calculated size of the package.
+
+ :return: The size of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._size
+
+ @size.setter
+ def size(self, size):
+ """Sets the size of this GenericPackageUpload.
+
+ The calculated size of the package.
+
+ :param size: The size of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._size = size
+
+ @property
+ def slug(self):
+ """Gets the slug of this GenericPackageUpload.
+
+ The public unique identifier for the package.
+
+ :return: The slug of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._slug
+
+ @slug.setter
+ def slug(self, slug):
+ """Sets the slug of this GenericPackageUpload.
+
+ The public unique identifier for the package.
+
+ :param slug: The slug of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ slug is not None and len(slug) < 1):
+ raise ValueError("Invalid value for `slug`, length must be greater than or equal to `1`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ slug is not None and not re.search('^[-a-zA-Z0-9_]+$', slug)): # noqa: E501
+ raise ValueError(r"Invalid value for `slug`, must be a follow pattern or equal to `/^[-a-zA-Z0-9_]+$/`") # noqa: E501
+
+ self._slug = slug
+
+ @property
+ def slug_perm(self):
+ """Gets the slug_perm of this GenericPackageUpload.
+
+
+ :return: The slug_perm of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._slug_perm
+
+ @slug_perm.setter
+ def slug_perm(self, slug_perm):
+ """Sets the slug_perm of this GenericPackageUpload.
+
+
+ :param slug_perm: The slug_perm of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ slug_perm is not None and len(slug_perm) < 1):
+ raise ValueError("Invalid value for `slug_perm`, length must be greater than or equal to `1`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ slug_perm is not None and not re.search('^[-a-zA-Z0-9_]+$', slug_perm)): # noqa: E501
+ raise ValueError(r"Invalid value for `slug_perm`, must be a follow pattern or equal to `/^[-a-zA-Z0-9_]+$/`") # noqa: E501
+
+ self._slug_perm = slug_perm
+
+ @property
+ def spdx_license(self):
+ """Gets the spdx_license of this GenericPackageUpload.
+
+ The SPDX license identifier for this package.
+
+ :return: The spdx_license of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._spdx_license
+
+ @spdx_license.setter
+ def spdx_license(self, spdx_license):
+ """Sets the spdx_license of this GenericPackageUpload.
+
+ The SPDX license identifier for this package.
+
+ :param spdx_license: The spdx_license of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ spdx_license is not None and len(spdx_license) < 1):
+ raise ValueError("Invalid value for `spdx_license`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._spdx_license = spdx_license
+
+ @property
+ def stage(self):
+ """Gets the stage of this GenericPackageUpload.
+
+ The synchronisation (in progress) stage of the package.
+
+ :return: The stage of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._stage
+
+ @stage.setter
+ def stage(self, stage):
+ """Sets the stage of this GenericPackageUpload.
+
+ The synchronisation (in progress) stage of the package.
+
+ :param stage: The stage of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._stage = stage
+
+ @property
+ def stage_str(self):
+ """Gets the stage_str of this GenericPackageUpload.
+
+
+ :return: The stage_str of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._stage_str
+
+ @stage_str.setter
+ def stage_str(self, stage_str):
+ """Sets the stage_str of this GenericPackageUpload.
+
+
+ :param stage_str: The stage_str of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._stage_str = stage_str
+
+ @property
+ def stage_updated_at(self):
+ """Gets the stage_updated_at of this GenericPackageUpload.
+
+ The datetime the package stage was updated at.
+
+ :return: The stage_updated_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._stage_updated_at
+
+ @stage_updated_at.setter
+ def stage_updated_at(self, stage_updated_at):
+ """Sets the stage_updated_at of this GenericPackageUpload.
+
+ The datetime the package stage was updated at.
+
+ :param stage_updated_at: The stage_updated_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._stage_updated_at = stage_updated_at
+
+ @property
+ def status(self):
+ """Gets the status of this GenericPackageUpload.
+
+ The synchronisation status of the package.
+
+ :return: The status of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._status
+
+ @status.setter
+ def status(self, status):
+ """Sets the status of this GenericPackageUpload.
+
+ The synchronisation status of the package.
+
+ :param status: The status of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._status = status
+
+ @property
+ def status_reason(self):
+ """Gets the status_reason of this GenericPackageUpload.
+
+        A textual description of the synchronisation status reason (if any).
+
+ :return: The status_reason of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._status_reason
+
+ @status_reason.setter
+ def status_reason(self, status_reason):
+ """Sets the status_reason of this GenericPackageUpload.
+
+        A textual description of the synchronisation status reason (if any).
+
+ :param status_reason: The status_reason of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._status_reason = status_reason
+
+ @property
+ def status_str(self):
+ """Gets the status_str of this GenericPackageUpload.
+
+
+ :return: The status_str of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._status_str
+
+ @status_str.setter
+ def status_str(self, status_str):
+ """Sets the status_str of this GenericPackageUpload.
+
+
+ :param status_str: The status_str of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._status_str = status_str
+
+ @property
+ def status_updated_at(self):
+ """Gets the status_updated_at of this GenericPackageUpload.
+
+ The datetime the package status was updated at.
+
+ :return: The status_updated_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._status_updated_at
+
+ @status_updated_at.setter
+ def status_updated_at(self, status_updated_at):
+ """Sets the status_updated_at of this GenericPackageUpload.
+
+ The datetime the package status was updated at.
+
+ :param status_updated_at: The status_updated_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._status_updated_at = status_updated_at
+
+ @property
+ def status_url(self):
+ """Gets the status_url of this GenericPackageUpload.
+
+
+ :return: The status_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._status_url
+
+ @status_url.setter
+ def status_url(self, status_url):
+ """Sets the status_url of this GenericPackageUpload.
+
+
+ :param status_url: The status_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._status_url = status_url
+
+ @property
+ def subtype(self):
+ """Gets the subtype of this GenericPackageUpload.
+
+
+ :return: The subtype of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._subtype
+
+ @subtype.setter
+ def subtype(self, subtype):
+ """Sets the subtype of this GenericPackageUpload.
+
+
+ :param subtype: The subtype of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._subtype = subtype
+
+ @property
+ def summary(self):
+ """Gets the summary of this GenericPackageUpload.
+
+ A one-liner synopsis of this package.
+
+ :return: The summary of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._summary
+
+ @summary.setter
+ def summary(self, summary):
+ """Sets the summary of this GenericPackageUpload.
+
+ A one-liner synopsis of this package.
+
+ :param summary: The summary of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._summary = summary
+
+ @property
+ def sync_finished_at(self):
+ """Gets the sync_finished_at of this GenericPackageUpload.
+
+ The datetime the package sync was finished at.
+
+ :return: The sync_finished_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._sync_finished_at
+
+ @sync_finished_at.setter
+ def sync_finished_at(self, sync_finished_at):
+ """Sets the sync_finished_at of this GenericPackageUpload.
+
+ The datetime the package sync was finished at.
+
+ :param sync_finished_at: The sync_finished_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._sync_finished_at = sync_finished_at
+
+ @property
+ def sync_progress(self):
+ """Gets the sync_progress of this GenericPackageUpload.
+
+ Synchronisation progress (from 0-100)
+
+ :return: The sync_progress of this GenericPackageUpload.
+ :rtype: int
+ """
+ return self._sync_progress
+
+ @sync_progress.setter
+ def sync_progress(self, sync_progress):
+ """Sets the sync_progress of this GenericPackageUpload.
+
+ Synchronisation progress (from 0-100)
+
+ :param sync_progress: The sync_progress of this GenericPackageUpload.
+ :type: int
+ """
+
+ self._sync_progress = sync_progress
+
+ @property
+ def tags_automatic(self):
+ """Gets the tags_automatic of this GenericPackageUpload.
+
+
+ :return: The tags_automatic of this GenericPackageUpload.
+ :rtype: Tags
+ """
+ return self._tags_automatic
+
+ @tags_automatic.setter
+ def tags_automatic(self, tags_automatic):
+ """Sets the tags_automatic of this GenericPackageUpload.
+
+
+ :param tags_automatic: The tags_automatic of this GenericPackageUpload.
+ :type: Tags
+ """
+
+ self._tags_automatic = tags_automatic
+
+ @property
+ def tags_immutable(self):
+ """Gets the tags_immutable of this GenericPackageUpload.
+
+
+ :return: The tags_immutable of this GenericPackageUpload.
+ :rtype: Tags
+ """
+ return self._tags_immutable
+
+ @tags_immutable.setter
+ def tags_immutable(self, tags_immutable):
+ """Sets the tags_immutable of this GenericPackageUpload.
+
+
+ :param tags_immutable: The tags_immutable of this GenericPackageUpload.
+ :type: Tags
+ """
+
+ self._tags_immutable = tags_immutable
+
+ @property
+ def type_display(self):
+ """Gets the type_display of this GenericPackageUpload.
+
+
+ :return: The type_display of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._type_display
+
+ @type_display.setter
+ def type_display(self, type_display):
+ """Sets the type_display of this GenericPackageUpload.
+
+
+ :param type_display: The type_display of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._type_display = type_display
+
+ @property
+ def uploaded_at(self):
+ """Gets the uploaded_at of this GenericPackageUpload.
+
+ The date this package was uploaded.
+
+ :return: The uploaded_at of this GenericPackageUpload.
+ :rtype: datetime
+ """
+ return self._uploaded_at
+
+ @uploaded_at.setter
+ def uploaded_at(self, uploaded_at):
+ """Sets the uploaded_at of this GenericPackageUpload.
+
+ The date this package was uploaded.
+
+ :param uploaded_at: The uploaded_at of this GenericPackageUpload.
+ :type: datetime
+ """
+
+ self._uploaded_at = uploaded_at
+
+ @property
+ def uploader(self):
+ """Gets the uploader of this GenericPackageUpload.
+
+
+ :return: The uploader of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._uploader
+
+ @uploader.setter
+ def uploader(self, uploader):
+ """Sets the uploader of this GenericPackageUpload.
+
+
+ :param uploader: The uploader of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ uploader is not None and len(uploader) < 1):
+ raise ValueError("Invalid value for `uploader`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._uploader = uploader
+
+ @property
+ def uploader_url(self):
+ """Gets the uploader_url of this GenericPackageUpload.
+
+
+ :return: The uploader_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._uploader_url
+
+ @uploader_url.setter
+ def uploader_url(self, uploader_url):
+ """Sets the uploader_url of this GenericPackageUpload.
+
+
+ :param uploader_url: The uploader_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._uploader_url = uploader_url
+
+ @property
+ def version(self):
+ """Gets the version of this GenericPackageUpload.
+
+ The raw version for this package.
+
+ :return: The version of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._version
+
+ @version.setter
+ def version(self, version):
+ """Sets the version of this GenericPackageUpload.
+
+ The raw version for this package.
+
+ :param version: The version of this GenericPackageUpload.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ version is not None and len(version) > 255):
+ raise ValueError("Invalid value for `version`, length must be less than or equal to `255`") # noqa: E501
+
+ self._version = version
+
+ @property
+ def version_orig(self):
+ """Gets the version_orig of this GenericPackageUpload.
+
+
+ :return: The version_orig of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._version_orig
+
+ @version_orig.setter
+ def version_orig(self, version_orig):
+ """Sets the version_orig of this GenericPackageUpload.
+
+
+ :param version_orig: The version_orig of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._version_orig = version_orig
+
+ @property
+ def vulnerability_scan_results_url(self):
+ """Gets the vulnerability_scan_results_url of this GenericPackageUpload.
+
+
+ :return: The vulnerability_scan_results_url of this GenericPackageUpload.
+ :rtype: str
+ """
+ return self._vulnerability_scan_results_url
+
+ @vulnerability_scan_results_url.setter
+ def vulnerability_scan_results_url(self, vulnerability_scan_results_url):
+ """Sets the vulnerability_scan_results_url of this GenericPackageUpload.
+
+
+ :param vulnerability_scan_results_url: The vulnerability_scan_results_url of this GenericPackageUpload.
+ :type: str
+ """
+
+ self._vulnerability_scan_results_url = vulnerability_scan_results_url
+
+ def to_dict(self):
+ """Returns the model properties as a dict"""
+ result = {}
+
+ for attr, _ in six.iteritems(self.swagger_types):
+ value = getattr(self, attr)
+ if isinstance(value, list):
+ result[attr] = list(map(
+ lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
+ value
+ ))
+ elif hasattr(value, "to_dict"):
+ result[attr] = value.to_dict()
+ elif isinstance(value, dict):
+ result[attr] = dict(map(
+ lambda item: (item[0], item[1].to_dict())
+ if hasattr(item[1], "to_dict") else item,
+ value.items()
+ ))
+ else:
+ result[attr] = value
+ if issubclass(GenericPackageUpload, dict):
+ for key, value in self.items():
+ result[key] = value
+
+ return result
+
+ def to_str(self):
+ """Returns the string representation of the model"""
+ return pprint.pformat(self.to_dict())
+
+ def __repr__(self):
+ """For `print` and `pprint`"""
+ return self.to_str()
+
+ def __eq__(self, other):
+ """Returns true if both objects are equal"""
+ if not isinstance(other, GenericPackageUpload):
+ return False
+
+ return self.to_dict() == other.to_dict()
+
+ def __ne__(self, other):
+ """Returns true if both objects are not equal"""
+ if not isinstance(other, GenericPackageUpload):
+ return True
+
+ return self.to_dict() != other.to_dict()
+
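A minimal usage sketch for the `GenericPackageUpload` model added above. The constructor is outside this hunk, so the sketch assumes the usual swagger-codegen pattern of keyword arguments defaulting to `None`; the `slug` pattern check and `to_dict()` behaviour come directly from the setters shown.

```python
# Sketch only: assumes the standard swagger-codegen constructor
# (all fields as optional keyword arguments), which is not shown in this hunk.
from cloudsmith_api.configuration import Configuration
from cloudsmith_api.models.generic_package_upload import GenericPackageUpload

config = Configuration()
config.client_side_validation = True  # enables the ValueError checks in the setters

pkg = GenericPackageUpload(_configuration=config)
pkg.name = "my-generic-package"    # validated: length <= 200
pkg.slug = "my-generic-package"    # validated: non-empty, matches ^[-a-zA-Z0-9_]+$

try:
    pkg.slug = "not a valid slug!"  # spaces violate the pattern, so this raises
except ValueError as exc:
    print(exc)

print(pkg.to_dict())  # serialises the attributes declared in swagger_types
```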
diff --git a/bindings/python/src/cloudsmith_api/models/generic_package_upload_request.py b/bindings/python/src/cloudsmith_api/models/generic_package_upload_request.py
new file mode 100644
index 00000000..e498fd66
--- /dev/null
+++ b/bindings/python/src/cloudsmith_api/models/generic_package_upload_request.py
@@ -0,0 +1,289 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+import pprint
+import re # noqa: F401
+
+import six
+
+from cloudsmith_api.configuration import Configuration
+
+
+class GenericPackageUploadRequest(object):
+ """NOTE: This class is auto generated by the swagger code generator program.
+
+ Do not edit the class manually.
+ """
+
+ """
+ Attributes:
+ swagger_types (dict): The key is attribute name
+ and the value is attribute type.
+ attribute_map (dict): The key is attribute name
+ and the value is json key in definition.
+ """
+ swagger_types = {
+ 'filepath': 'str',
+ 'name': 'str',
+ 'package_file': 'str',
+ 'republish': 'bool',
+ 'tags': 'str',
+ 'version': 'str'
+ }
+
+ attribute_map = {
+ 'filepath': 'filepath',
+ 'name': 'name',
+ 'package_file': 'package_file',
+ 'republish': 'republish',
+ 'tags': 'tags',
+ 'version': 'version'
+ }
+
+ def __init__(self, filepath=None, name=None, package_file=None, republish=None, tags=None, version=None, _configuration=None): # noqa: E501
+ """GenericPackageUploadRequest - a model defined in Swagger""" # noqa: E501
+ if _configuration is None:
+ _configuration = Configuration()
+ self._configuration = _configuration
+
+ self._filepath = None
+ self._name = None
+ self._package_file = None
+ self._republish = None
+ self._tags = None
+ self._version = None
+ self.discriminator = None
+
+ self.filepath = filepath
+ if name is not None:
+ self.name = name
+ self.package_file = package_file
+ if republish is not None:
+ self.republish = republish
+ if tags is not None:
+ self.tags = tags
+ if version is not None:
+ self.version = version
+
+ @property
+ def filepath(self):
+ """Gets the filepath of this GenericPackageUploadRequest.
+
+ The full filepath of the package including filename.
+
+ :return: The filepath of this GenericPackageUploadRequest.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this GenericPackageUploadRequest.
+
+ The full filepath of the package including filename.
+
+ :param filepath: The filepath of this GenericPackageUploadRequest.
+ :type: str
+ """
+ if self._configuration.client_side_validation and filepath is None:
+ raise ValueError("Invalid value for `filepath`, must not be `None`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) > 2083):
+ raise ValueError("Invalid value for `filepath`, length must be less than or equal to `2083`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
+ @property
+ def name(self):
+ """Gets the name of this GenericPackageUploadRequest.
+
+ The name of this package.
+
+ :return: The name of this GenericPackageUploadRequest.
+ :rtype: str
+ """
+ return self._name
+
+ @name.setter
+ def name(self, name):
+ """Sets the name of this GenericPackageUploadRequest.
+
+ The name of this package.
+
+ :param name: The name of this GenericPackageUploadRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) > 200):
+ raise ValueError("Invalid value for `name`, length must be less than or equal to `200`") # noqa: E501
+
+ self._name = name
+
+ @property
+ def package_file(self):
+ """Gets the package_file of this GenericPackageUploadRequest.
+
+ The primary file for the package.
+
+ :return: The package_file of this GenericPackageUploadRequest.
+ :rtype: str
+ """
+ return self._package_file
+
+ @package_file.setter
+ def package_file(self, package_file):
+ """Sets the package_file of this GenericPackageUploadRequest.
+
+ The primary file for the package.
+
+ :param package_file: The package_file of this GenericPackageUploadRequest.
+ :type: str
+ """
+ if self._configuration.client_side_validation and package_file is None:
+ raise ValueError("Invalid value for `package_file`, must not be `None`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ package_file is not None and len(package_file) < 1):
+ raise ValueError("Invalid value for `package_file`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._package_file = package_file
+
+ @property
+ def republish(self):
+ """Gets the republish of this GenericPackageUploadRequest.
+
+ If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate.
+
+ :return: The republish of this GenericPackageUploadRequest.
+ :rtype: bool
+ """
+ return self._republish
+
+ @republish.setter
+ def republish(self, republish):
+ """Sets the republish of this GenericPackageUploadRequest.
+
+ If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate.
+
+ :param republish: The republish of this GenericPackageUploadRequest.
+ :type: bool
+ """
+
+ self._republish = republish
+
+ @property
+ def tags(self):
+ """Gets the tags of this GenericPackageUploadRequest.
+
+        A comma-separated list of tags to add to the package.
+
+ :return: The tags of this GenericPackageUploadRequest.
+ :rtype: str
+ """
+ return self._tags
+
+ @tags.setter
+ def tags(self, tags):
+ """Sets the tags of this GenericPackageUploadRequest.
+
+        A comma-separated list of tags to add to the package.
+
+ :param tags: The tags of this GenericPackageUploadRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ tags is not None and len(tags) > 1024):
+ raise ValueError("Invalid value for `tags`, length must be less than or equal to `1024`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ tags is not None and len(tags) < 1):
+ raise ValueError("Invalid value for `tags`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._tags = tags
+
+ @property
+ def version(self):
+ """Gets the version of this GenericPackageUploadRequest.
+
+ The raw version for this package.
+
+ :return: The version of this GenericPackageUploadRequest.
+ :rtype: str
+ """
+ return self._version
+
+ @version.setter
+ def version(self, version):
+ """Sets the version of this GenericPackageUploadRequest.
+
+ The raw version for this package.
+
+ :param version: The version of this GenericPackageUploadRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ version is not None and len(version) > 255):
+ raise ValueError("Invalid value for `version`, length must be less than or equal to `255`") # noqa: E501
+
+ self._version = version
+
+ def to_dict(self):
+ """Returns the model properties as a dict"""
+ result = {}
+
+ for attr, _ in six.iteritems(self.swagger_types):
+ value = getattr(self, attr)
+ if isinstance(value, list):
+ result[attr] = list(map(
+ lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
+ value
+ ))
+ elif hasattr(value, "to_dict"):
+ result[attr] = value.to_dict()
+ elif isinstance(value, dict):
+ result[attr] = dict(map(
+ lambda item: (item[0], item[1].to_dict())
+ if hasattr(item[1], "to_dict") else item,
+ value.items()
+ ))
+ else:
+ result[attr] = value
+ if issubclass(GenericPackageUploadRequest, dict):
+ for key, value in self.items():
+ result[key] = value
+
+ return result
+
+ def to_str(self):
+ """Returns the string representation of the model"""
+ return pprint.pformat(self.to_dict())
+
+ def __repr__(self):
+ """For `print` and `pprint`"""
+ return self.to_str()
+
+ def __eq__(self, other):
+ """Returns true if both objects are equal"""
+ if not isinstance(other, GenericPackageUploadRequest):
+ return False
+
+ return self.to_dict() == other.to_dict()
+
+ def __ne__(self, other):
+ """Returns true if both objects are not equal"""
+ if not isinstance(other, GenericPackageUploadRequest):
+ return True
+
+ return self.to_dict() != other.to_dict()
+
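The request model above is fully visible in this hunk, so a sketch of building the upload body is better grounded; the values are placeholders, and how the object is ultimately passed to the packages API is not covered here.

```python
# Sketch only: placeholder values; required fields and length limits mirror
# the setters in GenericPackageUploadRequest above.
from cloudsmith_api.models.generic_package_upload_request import (
    GenericPackageUploadRequest,
)

request = GenericPackageUploadRequest(
    filepath="dists/my-tool-1.2.3.tar.gz",       # required, 1..2083 characters
    package_file="file-identifier-from-upload",  # required, non-empty (placeholder)
    name="my-tool",                              # optional, <= 200 characters
    version="1.2.3",                             # optional, <= 255 characters
    tags="cli,linux",                            # optional, 1..1024 characters
    republish=False,                             # optional; True overwrites matching packages
)

print(request.to_dict())
```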
diff --git a/bindings/python/src/cloudsmith_api/models/generic_upstream.py b/bindings/python/src/cloudsmith_api/models/generic_upstream.py
new file mode 100644
index 00000000..1c92f708
--- /dev/null
+++ b/bindings/python/src/cloudsmith_api/models/generic_upstream.py
@@ -0,0 +1,900 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+import pprint
+import re # noqa: F401
+
+import six
+
+from cloudsmith_api.configuration import Configuration
+
+
+class GenericUpstream(object):
+ """NOTE: This class is auto generated by the swagger code generator program.
+
+ Do not edit the class manually.
+ """
+
+ """
+ Attributes:
+ swagger_types (dict): The key is attribute name
+ and the value is attribute type.
+ attribute_map (dict): The key is attribute name
+ and the value is json key in definition.
+ """
+ swagger_types = {
+ 'auth_mode': 'str',
+ 'auth_secret': 'str',
+ 'auth_username': 'str',
+ 'available': 'str',
+ 'can_reindex': 'str',
+ 'created_at': 'datetime',
+ 'disable_reason': 'str',
+ 'disable_reason_text': 'str',
+ 'extra_header_1': 'str',
+ 'extra_header_2': 'str',
+ 'extra_value_1': 'str',
+ 'extra_value_2': 'str',
+ 'has_failed_signature_verification': 'str',
+ 'index_package_count': 'str',
+ 'index_status': 'str',
+ 'is_active': 'bool',
+ 'last_indexed': 'str',
+ 'mode': 'str',
+ 'name': 'str',
+ 'pending_validation': 'bool',
+ 'priority': 'int',
+ 'slug_perm': 'str',
+ 'updated_at': 'datetime',
+ 'upstream_prefix': 'str',
+ 'upstream_url': 'str',
+ 'verify_ssl': 'bool'
+ }
+
+ attribute_map = {
+ 'auth_mode': 'auth_mode',
+ 'auth_secret': 'auth_secret',
+ 'auth_username': 'auth_username',
+ 'available': 'available',
+ 'can_reindex': 'can_reindex',
+ 'created_at': 'created_at',
+ 'disable_reason': 'disable_reason',
+ 'disable_reason_text': 'disable_reason_text',
+ 'extra_header_1': 'extra_header_1',
+ 'extra_header_2': 'extra_header_2',
+ 'extra_value_1': 'extra_value_1',
+ 'extra_value_2': 'extra_value_2',
+ 'has_failed_signature_verification': 'has_failed_signature_verification',
+ 'index_package_count': 'index_package_count',
+ 'index_status': 'index_status',
+ 'is_active': 'is_active',
+ 'last_indexed': 'last_indexed',
+ 'mode': 'mode',
+ 'name': 'name',
+ 'pending_validation': 'pending_validation',
+ 'priority': 'priority',
+ 'slug_perm': 'slug_perm',
+ 'updated_at': 'updated_at',
+ 'upstream_prefix': 'upstream_prefix',
+ 'upstream_url': 'upstream_url',
+ 'verify_ssl': 'verify_ssl'
+ }
+
+ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, available=None, can_reindex=None, created_at=None, disable_reason='N/A', disable_reason_text=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, has_failed_signature_verification=None, index_package_count=None, index_status=None, is_active=None, last_indexed=None, mode='Proxy Only', name=None, pending_validation=None, priority=None, slug_perm=None, updated_at=None, upstream_prefix=None, upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
+ """GenericUpstream - a model defined in Swagger""" # noqa: E501
+ if _configuration is None:
+ _configuration = Configuration()
+ self._configuration = _configuration
+
+ self._auth_mode = None
+ self._auth_secret = None
+ self._auth_username = None
+ self._available = None
+ self._can_reindex = None
+ self._created_at = None
+ self._disable_reason = None
+ self._disable_reason_text = None
+ self._extra_header_1 = None
+ self._extra_header_2 = None
+ self._extra_value_1 = None
+ self._extra_value_2 = None
+ self._has_failed_signature_verification = None
+ self._index_package_count = None
+ self._index_status = None
+ self._is_active = None
+ self._last_indexed = None
+ self._mode = None
+ self._name = None
+ self._pending_validation = None
+ self._priority = None
+ self._slug_perm = None
+ self._updated_at = None
+ self._upstream_prefix = None
+ self._upstream_url = None
+ self._verify_ssl = None
+ self.discriminator = None
+
+ if auth_mode is not None:
+ self.auth_mode = auth_mode
+ if auth_secret is not None:
+ self.auth_secret = auth_secret
+ if auth_username is not None:
+ self.auth_username = auth_username
+ if available is not None:
+ self.available = available
+ if can_reindex is not None:
+ self.can_reindex = can_reindex
+ if created_at is not None:
+ self.created_at = created_at
+ if disable_reason is not None:
+ self.disable_reason = disable_reason
+ if disable_reason_text is not None:
+ self.disable_reason_text = disable_reason_text
+ if extra_header_1 is not None:
+ self.extra_header_1 = extra_header_1
+ if extra_header_2 is not None:
+ self.extra_header_2 = extra_header_2
+ if extra_value_1 is not None:
+ self.extra_value_1 = extra_value_1
+ if extra_value_2 is not None:
+ self.extra_value_2 = extra_value_2
+ if has_failed_signature_verification is not None:
+ self.has_failed_signature_verification = has_failed_signature_verification
+ if index_package_count is not None:
+ self.index_package_count = index_package_count
+ if index_status is not None:
+ self.index_status = index_status
+ if is_active is not None:
+ self.is_active = is_active
+ if last_indexed is not None:
+ self.last_indexed = last_indexed
+ if mode is not None:
+ self.mode = mode
+ self.name = name
+ if pending_validation is not None:
+ self.pending_validation = pending_validation
+ if priority is not None:
+ self.priority = priority
+ if slug_perm is not None:
+ self.slug_perm = slug_perm
+ if updated_at is not None:
+ self.updated_at = updated_at
+ if upstream_prefix is not None:
+ self.upstream_prefix = upstream_prefix
+ self.upstream_url = upstream_url
+ if verify_ssl is not None:
+ self.verify_ssl = verify_ssl
+
+ @property
+ def auth_mode(self):
+ """Gets the auth_mode of this GenericUpstream.
+
+ The authentication mode to use when accessing this upstream.
+
+ :return: The auth_mode of this GenericUpstream.
+ :rtype: str
+ """
+ return self._auth_mode
+
+ @auth_mode.setter
+ def auth_mode(self, auth_mode):
+ """Sets the auth_mode of this GenericUpstream.
+
+ The authentication mode to use when accessing this upstream.
+
+ :param auth_mode: The auth_mode of this GenericUpstream.
+ :type: str
+ """
+ allowed_values = ["None", "Username and Password", "Token"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ auth_mode not in allowed_values):
+ raise ValueError(
+ "Invalid value for `auth_mode` ({0}), must be one of {1}" # noqa: E501
+ .format(auth_mode, allowed_values)
+ )
+
+ self._auth_mode = auth_mode
+
+ @property
+ def auth_secret(self):
+ """Gets the auth_secret of this GenericUpstream.
+
+ Secret to provide with requests to upstream.
+
+ :return: The auth_secret of this GenericUpstream.
+ :rtype: str
+ """
+ return self._auth_secret
+
+ @auth_secret.setter
+ def auth_secret(self, auth_secret):
+ """Sets the auth_secret of this GenericUpstream.
+
+ Secret to provide with requests to upstream.
+
+ :param auth_secret: The auth_secret of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ auth_secret is not None and len(auth_secret) > 4096):
+ raise ValueError("Invalid value for `auth_secret`, length must be less than or equal to `4096`") # noqa: E501
+
+ self._auth_secret = auth_secret
+
+ @property
+ def auth_username(self):
+ """Gets the auth_username of this GenericUpstream.
+
+ Username to provide with requests to upstream.
+
+ :return: The auth_username of this GenericUpstream.
+ :rtype: str
+ """
+ return self._auth_username
+
+ @auth_username.setter
+ def auth_username(self, auth_username):
+ """Sets the auth_username of this GenericUpstream.
+
+ Username to provide with requests to upstream.
+
+ :param auth_username: The auth_username of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ auth_username is not None and len(auth_username) > 64):
+ raise ValueError("Invalid value for `auth_username`, length must be less than or equal to `64`") # noqa: E501
+
+ self._auth_username = auth_username
+
+ @property
+ def available(self):
+ """Gets the available of this GenericUpstream.
+
+
+ :return: The available of this GenericUpstream.
+ :rtype: str
+ """
+ return self._available
+
+ @available.setter
+ def available(self, available):
+ """Sets the available of this GenericUpstream.
+
+
+ :param available: The available of this GenericUpstream.
+ :type: str
+ """
+
+ self._available = available
+
+ @property
+ def can_reindex(self):
+ """Gets the can_reindex of this GenericUpstream.
+
+
+ :return: The can_reindex of this GenericUpstream.
+ :rtype: str
+ """
+ return self._can_reindex
+
+ @can_reindex.setter
+ def can_reindex(self, can_reindex):
+ """Sets the can_reindex of this GenericUpstream.
+
+
+ :param can_reindex: The can_reindex of this GenericUpstream.
+ :type: str
+ """
+
+ self._can_reindex = can_reindex
+
+ @property
+ def created_at(self):
+ """Gets the created_at of this GenericUpstream.
+
+ The datetime the upstream source was created.
+
+ :return: The created_at of this GenericUpstream.
+ :rtype: datetime
+ """
+ return self._created_at
+
+ @created_at.setter
+ def created_at(self, created_at):
+ """Sets the created_at of this GenericUpstream.
+
+ The datetime the upstream source was created.
+
+ :param created_at: The created_at of this GenericUpstream.
+ :type: datetime
+ """
+
+ self._created_at = created_at
+
+ @property
+ def disable_reason(self):
+ """Gets the disable_reason of this GenericUpstream.
+
+
+ :return: The disable_reason of this GenericUpstream.
+ :rtype: str
+ """
+ return self._disable_reason
+
+ @disable_reason.setter
+ def disable_reason(self, disable_reason):
+ """Sets the disable_reason of this GenericUpstream.
+
+
+ :param disable_reason: The disable_reason of this GenericUpstream.
+ :type: str
+ """
+ allowed_values = ["N/A", "Upstream points to its own repository", "Missing upstream source", "Upstream was disabled by request of user"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ disable_reason not in allowed_values):
+ raise ValueError(
+ "Invalid value for `disable_reason` ({0}), must be one of {1}" # noqa: E501
+ .format(disable_reason, allowed_values)
+ )
+
+ self._disable_reason = disable_reason
+
+ @property
+ def disable_reason_text(self):
+ """Gets the disable_reason_text of this GenericUpstream.
+
+ Human-readable explanation of why this upstream is disabled
+
+ :return: The disable_reason_text of this GenericUpstream.
+ :rtype: str
+ """
+ return self._disable_reason_text
+
+ @disable_reason_text.setter
+ def disable_reason_text(self, disable_reason_text):
+ """Sets the disable_reason_text of this GenericUpstream.
+
+ Human-readable explanation of why this upstream is disabled
+
+ :param disable_reason_text: The disable_reason_text of this GenericUpstream.
+ :type: str
+ """
+
+ self._disable_reason_text = disable_reason_text
+
+ @property
+ def extra_header_1(self):
+ """Gets the extra_header_1 of this GenericUpstream.
+
+ The key for extra header #1 to send to upstream.
+
+ :return: The extra_header_1 of this GenericUpstream.
+ :rtype: str
+ """
+ return self._extra_header_1
+
+ @extra_header_1.setter
+ def extra_header_1(self, extra_header_1):
+ """Sets the extra_header_1 of this GenericUpstream.
+
+ The key for extra header #1 to send to upstream.
+
+ :param extra_header_1: The extra_header_1 of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_header_1 is not None and len(extra_header_1) > 64):
+ raise ValueError("Invalid value for `extra_header_1`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_header_1 is not None and not re.search('^[-\\w]+$', extra_header_1)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_header_1`, must match the pattern `/^[-\\w]+$/`") # noqa: E501
+
+ self._extra_header_1 = extra_header_1
+
+ @property
+ def extra_header_2(self):
+ """Gets the extra_header_2 of this GenericUpstream.
+
+ The key for extra header #2 to send to upstream.
+
+ :return: The extra_header_2 of this GenericUpstream.
+ :rtype: str
+ """
+ return self._extra_header_2
+
+ @extra_header_2.setter
+ def extra_header_2(self, extra_header_2):
+ """Sets the extra_header_2 of this GenericUpstream.
+
+ The key for extra header #2 to send to upstream.
+
+ :param extra_header_2: The extra_header_2 of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_header_2 is not None and len(extra_header_2) > 64):
+ raise ValueError("Invalid value for `extra_header_2`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_header_2 is not None and not re.search('^[-\\w]+$', extra_header_2)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_header_2`, must match the pattern `/^[-\\w]+$/`") # noqa: E501
+
+ self._extra_header_2 = extra_header_2
+
+ @property
+ def extra_value_1(self):
+ """Gets the extra_value_1 of this GenericUpstream.
+
+ The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :return: The extra_value_1 of this GenericUpstream.
+ :rtype: str
+ """
+ return self._extra_value_1
+
+ @extra_value_1.setter
+ def extra_value_1(self, extra_value_1):
+ """Sets the extra_value_1 of this GenericUpstream.
+
+ The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :param extra_value_1: The extra_value_1 of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_value_1 is not None and len(extra_value_1) > 128):
+ raise ValueError("Invalid value for `extra_value_1`, length must be less than or equal to `128`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_value_1 is not None and not re.search('^[^\\n\\r]+$', extra_value_1)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_value_1`, must match the pattern `/^[^\\n\\r]+$/`") # noqa: E501
+
+ self._extra_value_1 = extra_value_1
+
+ @property
+ def extra_value_2(self):
+ """Gets the extra_value_2 of this GenericUpstream.
+
+ The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :return: The extra_value_2 of this GenericUpstream.
+ :rtype: str
+ """
+ return self._extra_value_2
+
+ @extra_value_2.setter
+ def extra_value_2(self, extra_value_2):
+ """Sets the extra_value_2 of this GenericUpstream.
+
+ The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :param extra_value_2: The extra_value_2 of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_value_2 is not None and len(extra_value_2) > 128):
+ raise ValueError("Invalid value for `extra_value_2`, length must be less than or equal to `128`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_value_2 is not None and not re.search('^[^\\n\\r]+$', extra_value_2)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_value_2`, must match the pattern `/^[^\\n\\r]+$/`") # noqa: E501
+
+ self._extra_value_2 = extra_value_2
+
+ @property
+ def has_failed_signature_verification(self):
+ """Gets the has_failed_signature_verification of this GenericUpstream.
+
+
+ :return: The has_failed_signature_verification of this GenericUpstream.
+ :rtype: str
+ """
+ return self._has_failed_signature_verification
+
+ @has_failed_signature_verification.setter
+ def has_failed_signature_verification(self, has_failed_signature_verification):
+ """Sets the has_failed_signature_verification of this GenericUpstream.
+
+
+ :param has_failed_signature_verification: The has_failed_signature_verification of this GenericUpstream.
+ :type: str
+ """
+
+ self._has_failed_signature_verification = has_failed_signature_verification
+
+ @property
+ def index_package_count(self):
+ """Gets the index_package_count of this GenericUpstream.
+
+ The number of packages available in this upstream source
+
+ :return: The index_package_count of this GenericUpstream.
+ :rtype: str
+ """
+ return self._index_package_count
+
+ @index_package_count.setter
+ def index_package_count(self, index_package_count):
+ """Sets the index_package_count of this GenericUpstream.
+
+ The number of packages available in this upstream source
+
+ :param index_package_count: The index_package_count of this GenericUpstream.
+ :type: str
+ """
+
+ self._index_package_count = index_package_count
+
+ @property
+ def index_status(self):
+ """Gets the index_status of this GenericUpstream.
+
+ The current indexing status of this upstream source
+
+ :return: The index_status of this GenericUpstream.
+ :rtype: str
+ """
+ return self._index_status
+
+ @index_status.setter
+ def index_status(self, index_status):
+ """Sets the index_status of this GenericUpstream.
+
+ The current indexing status of this upstream source
+
+ :param index_status: The index_status of this GenericUpstream.
+ :type: str
+ """
+
+ self._index_status = index_status
+
+ @property
+ def is_active(self):
+ """Gets the is_active of this GenericUpstream.
+
+ Whether or not this upstream is active and ready for requests.
+
+ :return: The is_active of this GenericUpstream.
+ :rtype: bool
+ """
+ return self._is_active
+
+ @is_active.setter
+ def is_active(self, is_active):
+ """Sets the is_active of this GenericUpstream.
+
+ Whether or not this upstream is active and ready for requests.
+
+ :param is_active: The is_active of this GenericUpstream.
+ :type: bool
+ """
+
+ self._is_active = is_active
+
+ @property
+ def last_indexed(self):
+ """Gets the last_indexed of this GenericUpstream.
+
+ The last time this upstream source was indexed
+
+ :return: The last_indexed of this GenericUpstream.
+ :rtype: str
+ """
+ return self._last_indexed
+
+ @last_indexed.setter
+ def last_indexed(self, last_indexed):
+ """Sets the last_indexed of this GenericUpstream.
+
+ The last time this upstream source was indexed
+
+ :param last_indexed: The last_indexed of this GenericUpstream.
+ :type: str
+ """
+
+ self._last_indexed = last_indexed
+
+ @property
+ def mode(self):
+ """Gets the mode of this GenericUpstream.
+
+ The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+
+ :return: The mode of this GenericUpstream.
+ :rtype: str
+ """
+ return self._mode
+
+ @mode.setter
+ def mode(self, mode):
+ """Sets the mode of this GenericUpstream.
+
+ The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+
+ :param mode: The mode of this GenericUpstream.
+ :type: str
+ """
+ allowed_values = ["Proxy Only", "Cache and Proxy"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ mode not in allowed_values):
+ raise ValueError(
+ "Invalid value for `mode` ({0}), must be one of {1}" # noqa: E501
+ .format(mode, allowed_values)
+ )
+
+ self._mode = mode
+
+ @property
+ def name(self):
+ """Gets the name of this GenericUpstream.
+
+ A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+
+ :return: The name of this GenericUpstream.
+ :rtype: str
+ """
+ return self._name
+
+ @name.setter
+ def name(self, name):
+ """Sets the name of this GenericUpstream.
+
+ A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+
+ :param name: The name of this GenericUpstream.
+ :type: str
+ """
+ if self._configuration.client_side_validation and name is None:
+ raise ValueError("Invalid value for `name`, must not be `None`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) > 64):
+ raise ValueError("Invalid value for `name`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) < 1):
+ raise ValueError("Invalid value for `name`, length must be greater than or equal to `1`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and not re.search('^\\w[\\w \\-\'\\.\\/()]+$', name)): # noqa: E501
+            raise ValueError(r"Invalid value for `name`, must match the pattern `/^\\w[\\w \\-'\\.\/()]+$/`") # noqa: E501
+
+ self._name = name
+
+ @property
+ def pending_validation(self):
+ """Gets the pending_validation of this GenericUpstream.
+
+ When true, this upstream source is pending validation.
+
+ :return: The pending_validation of this GenericUpstream.
+ :rtype: bool
+ """
+ return self._pending_validation
+
+ @pending_validation.setter
+ def pending_validation(self, pending_validation):
+ """Sets the pending_validation of this GenericUpstream.
+
+ When true, this upstream source is pending validation.
+
+ :param pending_validation: The pending_validation of this GenericUpstream.
+ :type: bool
+ """
+
+ self._pending_validation = pending_validation
+
+ @property
+ def priority(self):
+ """Gets the priority of this GenericUpstream.
+
+ Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+
+ :return: The priority of this GenericUpstream.
+ :rtype: int
+ """
+ return self._priority
+
+ @priority.setter
+ def priority(self, priority):
+ """Sets the priority of this GenericUpstream.
+
+ Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+
+ :param priority: The priority of this GenericUpstream.
+ :type: int
+ """
+ if (self._configuration.client_side_validation and
+ priority is not None and priority > 32767): # noqa: E501
+ raise ValueError("Invalid value for `priority`, must be a value less than or equal to `32767`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ priority is not None and priority < 1): # noqa: E501
+ raise ValueError("Invalid value for `priority`, must be a value greater than or equal to `1`") # noqa: E501
+
+ self._priority = priority
+
+ @property
+ def slug_perm(self):
+ """Gets the slug_perm of this GenericUpstream.
+
+
+ :return: The slug_perm of this GenericUpstream.
+ :rtype: str
+ """
+ return self._slug_perm
+
+ @slug_perm.setter
+ def slug_perm(self, slug_perm):
+ """Sets the slug_perm of this GenericUpstream.
+
+
+ :param slug_perm: The slug_perm of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ slug_perm is not None and len(slug_perm) < 1):
+ raise ValueError("Invalid value for `slug_perm`, length must be greater than or equal to `1`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ slug_perm is not None and not re.search('^[-a-zA-Z0-9_]+$', slug_perm)): # noqa: E501
+            raise ValueError(r"Invalid value for `slug_perm`, must match the pattern `/^[-a-zA-Z0-9_]+$/`") # noqa: E501
+
+ self._slug_perm = slug_perm
+
+ @property
+ def updated_at(self):
+ """Gets the updated_at of this GenericUpstream.
+
+
+ :return: The updated_at of this GenericUpstream.
+ :rtype: datetime
+ """
+ return self._updated_at
+
+ @updated_at.setter
+ def updated_at(self, updated_at):
+ """Sets the updated_at of this GenericUpstream.
+
+
+ :param updated_at: The updated_at of this GenericUpstream.
+ :type: datetime
+ """
+
+ self._updated_at = updated_at
+
+ @property
+ def upstream_prefix(self):
+ """Gets the upstream_prefix of this GenericUpstream.
+
+ A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+
+ :return: The upstream_prefix of this GenericUpstream.
+ :rtype: str
+ """
+ return self._upstream_prefix
+
+ @upstream_prefix.setter
+ def upstream_prefix(self, upstream_prefix):
+ """Sets the upstream_prefix of this GenericUpstream.
+
+ A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+
+ :param upstream_prefix: The upstream_prefix of this GenericUpstream.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ upstream_prefix is not None and len(upstream_prefix) > 64):
+ raise ValueError("Invalid value for `upstream_prefix`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_prefix is not None and len(upstream_prefix) < 1):
+ raise ValueError("Invalid value for `upstream_prefix`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._upstream_prefix = upstream_prefix
+
+ @property
+ def upstream_url(self):
+ """Gets the upstream_url of this GenericUpstream.
+
+ The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+
+ :return: The upstream_url of this GenericUpstream.
+ :rtype: str
+ """
+ return self._upstream_url
+
+ @upstream_url.setter
+ def upstream_url(self, upstream_url):
+ """Sets the upstream_url of this GenericUpstream.
+
+ The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+
+ :param upstream_url: The upstream_url of this GenericUpstream.
+ :type: str
+ """
+ if self._configuration.client_side_validation and upstream_url is None:
+ raise ValueError("Invalid value for `upstream_url`, must not be `None`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_url is not None and len(upstream_url) > 200):
+ raise ValueError("Invalid value for `upstream_url`, length must be less than or equal to `200`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_url is not None and len(upstream_url) < 1):
+ raise ValueError("Invalid value for `upstream_url`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._upstream_url = upstream_url
+
+ @property
+ def verify_ssl(self):
+ """Gets the verify_ssl of this GenericUpstream.
+
+ If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+
+ :return: The verify_ssl of this GenericUpstream.
+ :rtype: bool
+ """
+ return self._verify_ssl
+
+ @verify_ssl.setter
+ def verify_ssl(self, verify_ssl):
+ """Sets the verify_ssl of this GenericUpstream.
+
+ If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+
+ :param verify_ssl: The verify_ssl of this GenericUpstream.
+ :type: bool
+ """
+
+ self._verify_ssl = verify_ssl
+
+ def to_dict(self):
+ """Returns the model properties as a dict"""
+ result = {}
+
+ for attr, _ in six.iteritems(self.swagger_types):
+ value = getattr(self, attr)
+ if isinstance(value, list):
+ result[attr] = list(map(
+ lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
+ value
+ ))
+ elif hasattr(value, "to_dict"):
+ result[attr] = value.to_dict()
+ elif isinstance(value, dict):
+ result[attr] = dict(map(
+ lambda item: (item[0], item[1].to_dict())
+ if hasattr(item[1], "to_dict") else item,
+ value.items()
+ ))
+ else:
+ result[attr] = value
+ if issubclass(GenericUpstream, dict):
+ for key, value in self.items():
+ result[key] = value
+
+ return result
+
+ def to_str(self):
+ """Returns the string representation of the model"""
+ return pprint.pformat(self.to_dict())
+
+ def __repr__(self):
+ """For `print` and `pprint`"""
+ return self.to_str()
+
+ def __eq__(self, other):
+ """Returns true if both objects are equal"""
+ if not isinstance(other, GenericUpstream):
+ return False
+
+ return self.to_dict() == other.to_dict()
+
+ def __ne__(self, other):
+ """Returns true if both objects are not equal"""
+ if not isinstance(other, GenericUpstream):
+ return True
+
+ return self.to_dict() != other.to_dict()
+
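As a quick orientation for reviewers, here is a minimal usage sketch of the generated `GenericUpstream` response model; it is not part of the generated bindings. It assumes the module path mirrors the sibling files in this diff and that `GenericUpstream` exposes the usual swagger-codegen keyword-argument constructor, with `name` and `upstream_url` as the required fields validated above.

```python
# Sketch only (hypothetical usage, not generated code): assumes the usual
# swagger-codegen keyword-argument constructor for GenericUpstream.
from cloudsmith_api.models.generic_upstream import GenericUpstream

a = GenericUpstream(name="Mirror A", upstream_url="https://mirror.example.com/files")
b = GenericUpstream(name="Mirror A", upstream_url="https://mirror.example.com/files")

# __eq__ delegates to to_dict(), so two separately constructed models with the
# same field values compare equal, and __repr__ pretty-prints the same dict.
assert a == b
assert a.to_dict()["upstream_url"] == "https://mirror.example.com/files"
print(a)
```

Equality and `__repr__` are purely structural here: both funnel through `to_dict()`, so unset optional fields simply appear as `None` in the output.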
diff --git a/bindings/python/src/cloudsmith_api/models/generic_upstream_request.py b/bindings/python/src/cloudsmith_api/models/generic_upstream_request.py
new file mode 100644
index 00000000..7a5bbb71
--- /dev/null
+++ b/bindings/python/src/cloudsmith_api/models/generic_upstream_request.py
@@ -0,0 +1,563 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+import pprint
+import re # noqa: F401
+
+import six
+
+from cloudsmith_api.configuration import Configuration
+
+
+class GenericUpstreamRequest(object):
+ """NOTE: This class is auto generated by the swagger code generator program.
+
+ Do not edit the class manually.
+ """
+
+ """
+ Attributes:
+ swagger_types (dict): The key is attribute name
+ and the value is attribute type.
+ attribute_map (dict): The key is attribute name
+ and the value is json key in definition.
+ """
+ swagger_types = {
+ 'auth_mode': 'str',
+ 'auth_secret': 'str',
+ 'auth_username': 'str',
+ 'extra_header_1': 'str',
+ 'extra_header_2': 'str',
+ 'extra_value_1': 'str',
+ 'extra_value_2': 'str',
+ 'is_active': 'bool',
+ 'mode': 'str',
+ 'name': 'str',
+ 'priority': 'int',
+ 'upstream_prefix': 'str',
+ 'upstream_url': 'str',
+ 'verify_ssl': 'bool'
+ }
+
+ attribute_map = {
+ 'auth_mode': 'auth_mode',
+ 'auth_secret': 'auth_secret',
+ 'auth_username': 'auth_username',
+ 'extra_header_1': 'extra_header_1',
+ 'extra_header_2': 'extra_header_2',
+ 'extra_value_1': 'extra_value_1',
+ 'extra_value_2': 'extra_value_2',
+ 'is_active': 'is_active',
+ 'mode': 'mode',
+ 'name': 'name',
+ 'priority': 'priority',
+ 'upstream_prefix': 'upstream_prefix',
+ 'upstream_url': 'upstream_url',
+ 'verify_ssl': 'verify_ssl'
+ }
+
+ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, is_active=None, mode='Proxy Only', name=None, priority=None, upstream_prefix=None, upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
+ """GenericUpstreamRequest - a model defined in Swagger""" # noqa: E501
+ if _configuration is None:
+ _configuration = Configuration()
+ self._configuration = _configuration
+
+ self._auth_mode = None
+ self._auth_secret = None
+ self._auth_username = None
+ self._extra_header_1 = None
+ self._extra_header_2 = None
+ self._extra_value_1 = None
+ self._extra_value_2 = None
+ self._is_active = None
+ self._mode = None
+ self._name = None
+ self._priority = None
+ self._upstream_prefix = None
+ self._upstream_url = None
+ self._verify_ssl = None
+ self.discriminator = None
+
+ if auth_mode is not None:
+ self.auth_mode = auth_mode
+ if auth_secret is not None:
+ self.auth_secret = auth_secret
+ if auth_username is not None:
+ self.auth_username = auth_username
+ if extra_header_1 is not None:
+ self.extra_header_1 = extra_header_1
+ if extra_header_2 is not None:
+ self.extra_header_2 = extra_header_2
+ if extra_value_1 is not None:
+ self.extra_value_1 = extra_value_1
+ if extra_value_2 is not None:
+ self.extra_value_2 = extra_value_2
+ if is_active is not None:
+ self.is_active = is_active
+ if mode is not None:
+ self.mode = mode
+ self.name = name
+ if priority is not None:
+ self.priority = priority
+ if upstream_prefix is not None:
+ self.upstream_prefix = upstream_prefix
+ self.upstream_url = upstream_url
+ if verify_ssl is not None:
+ self.verify_ssl = verify_ssl
+
+ @property
+ def auth_mode(self):
+ """Gets the auth_mode of this GenericUpstreamRequest.
+
+ The authentication mode to use when accessing this upstream.
+
+ :return: The auth_mode of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._auth_mode
+
+ @auth_mode.setter
+ def auth_mode(self, auth_mode):
+ """Sets the auth_mode of this GenericUpstreamRequest.
+
+ The authentication mode to use when accessing this upstream.
+
+ :param auth_mode: The auth_mode of this GenericUpstreamRequest.
+ :type: str
+ """
+ allowed_values = ["None", "Username and Password", "Token"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ auth_mode not in allowed_values):
+ raise ValueError(
+ "Invalid value for `auth_mode` ({0}), must be one of {1}" # noqa: E501
+ .format(auth_mode, allowed_values)
+ )
+
+ self._auth_mode = auth_mode
+
+ @property
+ def auth_secret(self):
+ """Gets the auth_secret of this GenericUpstreamRequest.
+
+ Secret to provide with requests to upstream.
+
+ :return: The auth_secret of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._auth_secret
+
+ @auth_secret.setter
+ def auth_secret(self, auth_secret):
+ """Sets the auth_secret of this GenericUpstreamRequest.
+
+ Secret to provide with requests to upstream.
+
+ :param auth_secret: The auth_secret of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ auth_secret is not None and len(auth_secret) > 4096):
+ raise ValueError("Invalid value for `auth_secret`, length must be less than or equal to `4096`") # noqa: E501
+
+ self._auth_secret = auth_secret
+
+ @property
+ def auth_username(self):
+ """Gets the auth_username of this GenericUpstreamRequest.
+
+ Username to provide with requests to upstream.
+
+ :return: The auth_username of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._auth_username
+
+ @auth_username.setter
+ def auth_username(self, auth_username):
+ """Sets the auth_username of this GenericUpstreamRequest.
+
+ Username to provide with requests to upstream.
+
+ :param auth_username: The auth_username of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ auth_username is not None and len(auth_username) > 64):
+ raise ValueError("Invalid value for `auth_username`, length must be less than or equal to `64`") # noqa: E501
+
+ self._auth_username = auth_username
+
+ @property
+ def extra_header_1(self):
+ """Gets the extra_header_1 of this GenericUpstreamRequest.
+
+ The key for extra header #1 to send to upstream.
+
+ :return: The extra_header_1 of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._extra_header_1
+
+ @extra_header_1.setter
+ def extra_header_1(self, extra_header_1):
+ """Sets the extra_header_1 of this GenericUpstreamRequest.
+
+ The key for extra header #1 to send to upstream.
+
+ :param extra_header_1: The extra_header_1 of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_header_1 is not None and len(extra_header_1) > 64):
+ raise ValueError("Invalid value for `extra_header_1`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_header_1 is not None and not re.search('^[-\\w]+$', extra_header_1)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_header_1`, must match the pattern `/^[-\\w]+$/`") # noqa: E501
+
+ self._extra_header_1 = extra_header_1
+
+ @property
+ def extra_header_2(self):
+ """Gets the extra_header_2 of this GenericUpstreamRequest.
+
+ The key for extra header #2 to send to upstream.
+
+ :return: The extra_header_2 of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._extra_header_2
+
+ @extra_header_2.setter
+ def extra_header_2(self, extra_header_2):
+ """Sets the extra_header_2 of this GenericUpstreamRequest.
+
+ The key for extra header #2 to send to upstream.
+
+ :param extra_header_2: The extra_header_2 of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_header_2 is not None and len(extra_header_2) > 64):
+ raise ValueError("Invalid value for `extra_header_2`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_header_2 is not None and not re.search('^[-\\w]+$', extra_header_2)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_header_2`, must match the pattern `/^[-\\w]+$/`") # noqa: E501
+
+ self._extra_header_2 = extra_header_2
+
+ @property
+ def extra_value_1(self):
+ """Gets the extra_value_1 of this GenericUpstreamRequest.
+
+ The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :return: The extra_value_1 of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._extra_value_1
+
+ @extra_value_1.setter
+ def extra_value_1(self, extra_value_1):
+ """Sets the extra_value_1 of this GenericUpstreamRequest.
+
+ The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :param extra_value_1: The extra_value_1 of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_value_1 is not None and len(extra_value_1) > 128):
+ raise ValueError("Invalid value for `extra_value_1`, length must be less than or equal to `128`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_value_1 is not None and not re.search('^[^\\n\\r]+$', extra_value_1)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_value_1`, must match the pattern `/^[^\\n\\r]+$/`") # noqa: E501
+
+ self._extra_value_1 = extra_value_1
+
+ @property
+ def extra_value_2(self):
+ """Gets the extra_value_2 of this GenericUpstreamRequest.
+
+ The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :return: The extra_value_2 of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._extra_value_2
+
+ @extra_value_2.setter
+ def extra_value_2(self, extra_value_2):
+ """Sets the extra_value_2 of this GenericUpstreamRequest.
+
+ The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :param extra_value_2: The extra_value_2 of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_value_2 is not None and len(extra_value_2) > 128):
+ raise ValueError("Invalid value for `extra_value_2`, length must be less than or equal to `128`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_value_2 is not None and not re.search('^[^\\n\\r]+$', extra_value_2)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_value_2`, must match the pattern `/^[^\\n\\r]+$/`") # noqa: E501
+
+ self._extra_value_2 = extra_value_2
+
+ @property
+ def is_active(self):
+ """Gets the is_active of this GenericUpstreamRequest.
+
+ Whether or not this upstream is active and ready for requests.
+
+ :return: The is_active of this GenericUpstreamRequest.
+ :rtype: bool
+ """
+ return self._is_active
+
+ @is_active.setter
+ def is_active(self, is_active):
+ """Sets the is_active of this GenericUpstreamRequest.
+
+ Whether or not this upstream is active and ready for requests.
+
+ :param is_active: The is_active of this GenericUpstreamRequest.
+ :type: bool
+ """
+
+ self._is_active = is_active
+
+ @property
+ def mode(self):
+ """Gets the mode of this GenericUpstreamRequest.
+
+ The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+
+ :return: The mode of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._mode
+
+ @mode.setter
+ def mode(self, mode):
+ """Sets the mode of this GenericUpstreamRequest.
+
+ The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+
+ :param mode: The mode of this GenericUpstreamRequest.
+ :type: str
+ """
+ allowed_values = ["Proxy Only", "Cache and Proxy"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ mode not in allowed_values):
+ raise ValueError(
+ "Invalid value for `mode` ({0}), must be one of {1}" # noqa: E501
+ .format(mode, allowed_values)
+ )
+
+ self._mode = mode
+
+ @property
+ def name(self):
+ """Gets the name of this GenericUpstreamRequest.
+
+ A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+
+ :return: The name of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._name
+
+ @name.setter
+ def name(self, name):
+ """Sets the name of this GenericUpstreamRequest.
+
+ A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+
+ :param name: The name of this GenericUpstreamRequest.
+ :type: str
+ """
+ if self._configuration.client_side_validation and name is None:
+ raise ValueError("Invalid value for `name`, must not be `None`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) > 64):
+ raise ValueError("Invalid value for `name`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) < 1):
+ raise ValueError("Invalid value for `name`, length must be greater than or equal to `1`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and not re.search('^\\w[\\w \\-\'\\.\\/()]+$', name)): # noqa: E501
+            raise ValueError(r"Invalid value for `name`, must match the pattern `/^\\w[\\w \\-'\\.\/()]+$/`") # noqa: E501
+
+ self._name = name
+
+ @property
+ def priority(self):
+ """Gets the priority of this GenericUpstreamRequest.
+
+ Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+
+ :return: The priority of this GenericUpstreamRequest.
+ :rtype: int
+ """
+ return self._priority
+
+ @priority.setter
+ def priority(self, priority):
+ """Sets the priority of this GenericUpstreamRequest.
+
+ Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+
+ :param priority: The priority of this GenericUpstreamRequest.
+ :type: int
+ """
+ if (self._configuration.client_side_validation and
+ priority is not None and priority > 32767): # noqa: E501
+ raise ValueError("Invalid value for `priority`, must be a value less than or equal to `32767`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ priority is not None and priority < 1): # noqa: E501
+ raise ValueError("Invalid value for `priority`, must be a value greater than or equal to `1`") # noqa: E501
+
+ self._priority = priority
+
+ @property
+ def upstream_prefix(self):
+ """Gets the upstream_prefix of this GenericUpstreamRequest.
+
+ A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+
+ :return: The upstream_prefix of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._upstream_prefix
+
+ @upstream_prefix.setter
+ def upstream_prefix(self, upstream_prefix):
+ """Sets the upstream_prefix of this GenericUpstreamRequest.
+
+ A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+
+ :param upstream_prefix: The upstream_prefix of this GenericUpstreamRequest.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ upstream_prefix is not None and len(upstream_prefix) > 64):
+ raise ValueError("Invalid value for `upstream_prefix`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_prefix is not None and len(upstream_prefix) < 1):
+ raise ValueError("Invalid value for `upstream_prefix`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._upstream_prefix = upstream_prefix
+
+ @property
+ def upstream_url(self):
+ """Gets the upstream_url of this GenericUpstreamRequest.
+
+ The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+
+ :return: The upstream_url of this GenericUpstreamRequest.
+ :rtype: str
+ """
+ return self._upstream_url
+
+ @upstream_url.setter
+ def upstream_url(self, upstream_url):
+ """Sets the upstream_url of this GenericUpstreamRequest.
+
+ The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+
+ :param upstream_url: The upstream_url of this GenericUpstreamRequest.
+ :type: str
+ """
+ if self._configuration.client_side_validation and upstream_url is None:
+ raise ValueError("Invalid value for `upstream_url`, must not be `None`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_url is not None and len(upstream_url) > 200):
+ raise ValueError("Invalid value for `upstream_url`, length must be less than or equal to `200`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_url is not None and len(upstream_url) < 1):
+ raise ValueError("Invalid value for `upstream_url`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._upstream_url = upstream_url
+
+ @property
+ def verify_ssl(self):
+ """Gets the verify_ssl of this GenericUpstreamRequest.
+
+ If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+
+ :return: The verify_ssl of this GenericUpstreamRequest.
+ :rtype: bool
+ """
+ return self._verify_ssl
+
+ @verify_ssl.setter
+ def verify_ssl(self, verify_ssl):
+ """Sets the verify_ssl of this GenericUpstreamRequest.
+
+ If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+
+ :param verify_ssl: The verify_ssl of this GenericUpstreamRequest.
+ :type: bool
+ """
+
+ self._verify_ssl = verify_ssl
+
+ def to_dict(self):
+ """Returns the model properties as a dict"""
+ result = {}
+
+ for attr, _ in six.iteritems(self.swagger_types):
+ value = getattr(self, attr)
+ if isinstance(value, list):
+ result[attr] = list(map(
+ lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
+ value
+ ))
+ elif hasattr(value, "to_dict"):
+ result[attr] = value.to_dict()
+ elif isinstance(value, dict):
+ result[attr] = dict(map(
+ lambda item: (item[0], item[1].to_dict())
+ if hasattr(item[1], "to_dict") else item,
+ value.items()
+ ))
+ else:
+ result[attr] = value
+ if issubclass(GenericUpstreamRequest, dict):
+ for key, value in self.items():
+ result[key] = value
+
+ return result
+
+ def to_str(self):
+ """Returns the string representation of the model"""
+ return pprint.pformat(self.to_dict())
+
+ def __repr__(self):
+ """For `print` and `pprint`"""
+ return self.to_str()
+
+ def __eq__(self, other):
+ """Returns true if both objects are equal"""
+ if not isinstance(other, GenericUpstreamRequest):
+ return False
+
+ return self.to_dict() == other.to_dict()
+
+ def __ne__(self, other):
+ """Returns true if both objects are not equal"""
+ if not isinstance(other, GenericUpstreamRequest):
+ return True
+
+ return self.to_dict() != other.to_dict()
+
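The same pattern applies to the request model just added. The sketch below, which is not part of the generated output, shows how the client-side validation wired into these setters behaves when building a `GenericUpstreamRequest`. It assumes the bindings are installed as `cloudsmith_api` and that `Configuration.client_side_validation` is a plain writable attribute, as in other swagger-codegen Python clients; the API call that would consume this request is not shown, since its Python method lives outside this file.

```python
# Sketch only (hypothetical usage, not generated code).
from cloudsmith_api.configuration import Configuration
from cloudsmith_api.models.generic_upstream_request import GenericUpstreamRequest

config = Configuration()
config.client_side_validation = True  # assumed writable; enables the setter checks above

# name and upstream_url are required: their setters raise ValueError on None.
request = GenericUpstreamRequest(
    name="Example Generic Upstream",
    upstream_url="https://files.example.com/artifacts",
    mode="Cache and Proxy",      # must be "Proxy Only" or "Cache and Proxy"
    upstream_prefix="example",   # 1..64 characters
    verify_ssl=True,
    _configuration=config,
)
print(request.to_dict())

# Out-of-range values are rejected locally, before any HTTP request is made.
try:
    request.priority = 0  # valid range is 1..32767
except ValueError as exc:
    print(exc)
```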
diff --git a/bindings/python/src/cloudsmith_api/models/generic_upstream_request_patch.py b/bindings/python/src/cloudsmith_api/models/generic_upstream_request_patch.py
new file mode 100644
index 00000000..976bf534
--- /dev/null
+++ b/bindings/python/src/cloudsmith_api/models/generic_upstream_request_patch.py
@@ -0,0 +1,561 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+import pprint
+import re # noqa: F401
+
+import six
+
+from cloudsmith_api.configuration import Configuration
+
+
+class GenericUpstreamRequestPatch(object):
+ """NOTE: This class is auto generated by the swagger code generator program.
+
+ Do not edit the class manually.
+ """
+
+ """
+ Attributes:
+ swagger_types (dict): The key is attribute name
+ and the value is attribute type.
+ attribute_map (dict): The key is attribute name
+ and the value is json key in definition.
+ """
+ swagger_types = {
+ 'auth_mode': 'str',
+ 'auth_secret': 'str',
+ 'auth_username': 'str',
+ 'extra_header_1': 'str',
+ 'extra_header_2': 'str',
+ 'extra_value_1': 'str',
+ 'extra_value_2': 'str',
+ 'is_active': 'bool',
+ 'mode': 'str',
+ 'name': 'str',
+ 'priority': 'int',
+ 'upstream_prefix': 'str',
+ 'upstream_url': 'str',
+ 'verify_ssl': 'bool'
+ }
+
+ attribute_map = {
+ 'auth_mode': 'auth_mode',
+ 'auth_secret': 'auth_secret',
+ 'auth_username': 'auth_username',
+ 'extra_header_1': 'extra_header_1',
+ 'extra_header_2': 'extra_header_2',
+ 'extra_value_1': 'extra_value_1',
+ 'extra_value_2': 'extra_value_2',
+ 'is_active': 'is_active',
+ 'mode': 'mode',
+ 'name': 'name',
+ 'priority': 'priority',
+ 'upstream_prefix': 'upstream_prefix',
+ 'upstream_url': 'upstream_url',
+ 'verify_ssl': 'verify_ssl'
+ }
+
+ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, is_active=None, mode='Proxy Only', name=None, priority=None, upstream_prefix=None, upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
+ """GenericUpstreamRequestPatch - a model defined in Swagger""" # noqa: E501
+ if _configuration is None:
+ _configuration = Configuration()
+ self._configuration = _configuration
+
+ self._auth_mode = None
+ self._auth_secret = None
+ self._auth_username = None
+ self._extra_header_1 = None
+ self._extra_header_2 = None
+ self._extra_value_1 = None
+ self._extra_value_2 = None
+ self._is_active = None
+ self._mode = None
+ self._name = None
+ self._priority = None
+ self._upstream_prefix = None
+ self._upstream_url = None
+ self._verify_ssl = None
+ self.discriminator = None
+
+ if auth_mode is not None:
+ self.auth_mode = auth_mode
+ if auth_secret is not None:
+ self.auth_secret = auth_secret
+ if auth_username is not None:
+ self.auth_username = auth_username
+ if extra_header_1 is not None:
+ self.extra_header_1 = extra_header_1
+ if extra_header_2 is not None:
+ self.extra_header_2 = extra_header_2
+ if extra_value_1 is not None:
+ self.extra_value_1 = extra_value_1
+ if extra_value_2 is not None:
+ self.extra_value_2 = extra_value_2
+ if is_active is not None:
+ self.is_active = is_active
+ if mode is not None:
+ self.mode = mode
+ if name is not None:
+ self.name = name
+ if priority is not None:
+ self.priority = priority
+ if upstream_prefix is not None:
+ self.upstream_prefix = upstream_prefix
+ if upstream_url is not None:
+ self.upstream_url = upstream_url
+ if verify_ssl is not None:
+ self.verify_ssl = verify_ssl
+
+ @property
+ def auth_mode(self):
+ """Gets the auth_mode of this GenericUpstreamRequestPatch.
+
+ The authentication mode to use when accessing this upstream.
+
+ :return: The auth_mode of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._auth_mode
+
+ @auth_mode.setter
+ def auth_mode(self, auth_mode):
+ """Sets the auth_mode of this GenericUpstreamRequestPatch.
+
+ The authentication mode to use when accessing this upstream.
+
+ :param auth_mode: The auth_mode of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ allowed_values = ["None", "Username and Password", "Token"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ auth_mode not in allowed_values):
+ raise ValueError(
+ "Invalid value for `auth_mode` ({0}), must be one of {1}" # noqa: E501
+ .format(auth_mode, allowed_values)
+ )
+
+ self._auth_mode = auth_mode
+
+ @property
+ def auth_secret(self):
+ """Gets the auth_secret of this GenericUpstreamRequestPatch.
+
+ Secret to provide with requests to upstream.
+
+ :return: The auth_secret of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._auth_secret
+
+ @auth_secret.setter
+ def auth_secret(self, auth_secret):
+ """Sets the auth_secret of this GenericUpstreamRequestPatch.
+
+ Secret to provide with requests to upstream.
+
+ :param auth_secret: The auth_secret of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ auth_secret is not None and len(auth_secret) > 4096):
+ raise ValueError("Invalid value for `auth_secret`, length must be less than or equal to `4096`") # noqa: E501
+
+ self._auth_secret = auth_secret
+
+ @property
+ def auth_username(self):
+ """Gets the auth_username of this GenericUpstreamRequestPatch.
+
+ Username to provide with requests to upstream.
+
+ :return: The auth_username of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._auth_username
+
+ @auth_username.setter
+ def auth_username(self, auth_username):
+ """Sets the auth_username of this GenericUpstreamRequestPatch.
+
+ Username to provide with requests to upstream.
+
+ :param auth_username: The auth_username of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ auth_username is not None and len(auth_username) > 64):
+ raise ValueError("Invalid value for `auth_username`, length must be less than or equal to `64`") # noqa: E501
+
+ self._auth_username = auth_username
+
+ @property
+ def extra_header_1(self):
+ """Gets the extra_header_1 of this GenericUpstreamRequestPatch.
+
+ The key for extra header #1 to send to upstream.
+
+ :return: The extra_header_1 of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._extra_header_1
+
+ @extra_header_1.setter
+ def extra_header_1(self, extra_header_1):
+ """Sets the extra_header_1 of this GenericUpstreamRequestPatch.
+
+ The key for extra header #1 to send to upstream.
+
+ :param extra_header_1: The extra_header_1 of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_header_1 is not None and len(extra_header_1) > 64):
+ raise ValueError("Invalid value for `extra_header_1`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_header_1 is not None and not re.search('^[-\\w]+$', extra_header_1)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_header_1`, must match the pattern `/^[-\\w]+$/`") # noqa: E501
+
+ self._extra_header_1 = extra_header_1
+
+ @property
+ def extra_header_2(self):
+ """Gets the extra_header_2 of this GenericUpstreamRequestPatch.
+
+ The key for extra header #2 to send to upstream.
+
+ :return: The extra_header_2 of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._extra_header_2
+
+ @extra_header_2.setter
+ def extra_header_2(self, extra_header_2):
+ """Sets the extra_header_2 of this GenericUpstreamRequestPatch.
+
+ The key for extra header #2 to send to upstream.
+
+ :param extra_header_2: The extra_header_2 of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_header_2 is not None and len(extra_header_2) > 64):
+ raise ValueError("Invalid value for `extra_header_2`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_header_2 is not None and not re.search('^[-\\w]+$', extra_header_2)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_header_2`, must match the pattern `/^[-\\w]+$/`") # noqa: E501
+
+ self._extra_header_2 = extra_header_2
+
+ @property
+ def extra_value_1(self):
+ """Gets the extra_value_1 of this GenericUpstreamRequestPatch.
+
+ The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :return: The extra_value_1 of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._extra_value_1
+
+ @extra_value_1.setter
+ def extra_value_1(self, extra_value_1):
+ """Sets the extra_value_1 of this GenericUpstreamRequestPatch.
+
+ The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :param extra_value_1: The extra_value_1 of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_value_1 is not None and len(extra_value_1) > 128):
+ raise ValueError("Invalid value for `extra_value_1`, length must be less than or equal to `128`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_value_1 is not None and not re.search('^[^\\n\\r]+$', extra_value_1)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_value_1`, must match the pattern `/^[^\\n\\r]+$/`") # noqa: E501
+
+ self._extra_value_1 = extra_value_1
+
+ @property
+ def extra_value_2(self):
+ """Gets the extra_value_2 of this GenericUpstreamRequestPatch.
+
+ The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :return: The extra_value_2 of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._extra_value_2
+
+ @extra_value_2.setter
+ def extra_value_2(self, extra_value_2):
+ """Sets the extra_value_2 of this GenericUpstreamRequestPatch.
+
+ The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+
+ :param extra_value_2: The extra_value_2 of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ extra_value_2 is not None and len(extra_value_2) > 128):
+ raise ValueError("Invalid value for `extra_value_2`, length must be less than or equal to `128`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ extra_value_2 is not None and not re.search('^[^\\n\\r]+$', extra_value_2)): # noqa: E501
+            raise ValueError(r"Invalid value for `extra_value_2`, must match the pattern `/^[^\\n\\r]+$/`") # noqa: E501
+
+ self._extra_value_2 = extra_value_2
+
+ @property
+ def is_active(self):
+ """Gets the is_active of this GenericUpstreamRequestPatch.
+
+ Whether or not this upstream is active and ready for requests.
+
+ :return: The is_active of this GenericUpstreamRequestPatch.
+ :rtype: bool
+ """
+ return self._is_active
+
+ @is_active.setter
+ def is_active(self, is_active):
+ """Sets the is_active of this GenericUpstreamRequestPatch.
+
+ Whether or not this upstream is active and ready for requests.
+
+ :param is_active: The is_active of this GenericUpstreamRequestPatch.
+ :type: bool
+ """
+
+ self._is_active = is_active
+
+ @property
+ def mode(self):
+ """Gets the mode of this GenericUpstreamRequestPatch.
+
+ The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+
+ :return: The mode of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._mode
+
+ @mode.setter
+ def mode(self, mode):
+ """Sets the mode of this GenericUpstreamRequestPatch.
+
+ The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+
+ :param mode: The mode of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ allowed_values = ["Proxy Only", "Cache and Proxy"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ mode not in allowed_values):
+ raise ValueError(
+ "Invalid value for `mode` ({0}), must be one of {1}" # noqa: E501
+ .format(mode, allowed_values)
+ )
+
+ self._mode = mode
+
+ @property
+ def name(self):
+ """Gets the name of this GenericUpstreamRequestPatch.
+
+ A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+
+ :return: The name of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._name
+
+ @name.setter
+ def name(self, name):
+ """Sets the name of this GenericUpstreamRequestPatch.
+
+ A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+
+ :param name: The name of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) > 64):
+ raise ValueError("Invalid value for `name`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and len(name) < 1):
+ raise ValueError("Invalid value for `name`, length must be greater than or equal to `1`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ name is not None and not re.search('^\\w[\\w \\-\'\\.\\/()]+$', name)): # noqa: E501
+            raise ValueError(r"Invalid value for `name`, must match the pattern `/^\\w[\\w \\-'\\.\/()]+$/`") # noqa: E501
+
+ self._name = name
+
+ @property
+ def priority(self):
+ """Gets the priority of this GenericUpstreamRequestPatch.
+
+ Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+
+ :return: The priority of this GenericUpstreamRequestPatch.
+ :rtype: int
+ """
+ return self._priority
+
+ @priority.setter
+ def priority(self, priority):
+ """Sets the priority of this GenericUpstreamRequestPatch.
+
+ Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+
+ :param priority: The priority of this GenericUpstreamRequestPatch.
+ :type: int
+ """
+ if (self._configuration.client_side_validation and
+ priority is not None and priority > 32767): # noqa: E501
+ raise ValueError("Invalid value for `priority`, must be a value less than or equal to `32767`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ priority is not None and priority < 1): # noqa: E501
+ raise ValueError("Invalid value for `priority`, must be a value greater than or equal to `1`") # noqa: E501
+
+ self._priority = priority
+
+ @property
+ def upstream_prefix(self):
+ """Gets the upstream_prefix of this GenericUpstreamRequestPatch.
+
+ A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+
+ :return: The upstream_prefix of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._upstream_prefix
+
+ @upstream_prefix.setter
+ def upstream_prefix(self, upstream_prefix):
+ """Sets the upstream_prefix of this GenericUpstreamRequestPatch.
+
+ A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+
+ :param upstream_prefix: The upstream_prefix of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ upstream_prefix is not None and len(upstream_prefix) > 64):
+ raise ValueError("Invalid value for `upstream_prefix`, length must be less than or equal to `64`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_prefix is not None and len(upstream_prefix) < 1):
+ raise ValueError("Invalid value for `upstream_prefix`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._upstream_prefix = upstream_prefix
+
+ @property
+ def upstream_url(self):
+ """Gets the upstream_url of this GenericUpstreamRequestPatch.
+
+ The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+
+ :return: The upstream_url of this GenericUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._upstream_url
+
+ @upstream_url.setter
+ def upstream_url(self, upstream_url):
+ """Sets the upstream_url of this GenericUpstreamRequestPatch.
+
+ The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+
+ :param upstream_url: The upstream_url of this GenericUpstreamRequestPatch.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ upstream_url is not None and len(upstream_url) > 200):
+ raise ValueError("Invalid value for `upstream_url`, length must be less than or equal to `200`") # noqa: E501
+ if (self._configuration.client_side_validation and
+ upstream_url is not None and len(upstream_url) < 1):
+ raise ValueError("Invalid value for `upstream_url`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._upstream_url = upstream_url
+
+ @property
+ def verify_ssl(self):
+ """Gets the verify_ssl of this GenericUpstreamRequestPatch.
+
+ If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+
+ :return: The verify_ssl of this GenericUpstreamRequestPatch.
+ :rtype: bool
+ """
+ return self._verify_ssl
+
+ @verify_ssl.setter
+ def verify_ssl(self, verify_ssl):
+ """Sets the verify_ssl of this GenericUpstreamRequestPatch.
+
+ If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+
+ :param verify_ssl: The verify_ssl of this GenericUpstreamRequestPatch.
+ :type: bool
+ """
+
+ self._verify_ssl = verify_ssl
+
+ def to_dict(self):
+ """Returns the model properties as a dict"""
+ result = {}
+
+ for attr, _ in six.iteritems(self.swagger_types):
+ value = getattr(self, attr)
+ if isinstance(value, list):
+ result[attr] = list(map(
+ lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
+ value
+ ))
+ elif hasattr(value, "to_dict"):
+ result[attr] = value.to_dict()
+ elif isinstance(value, dict):
+ result[attr] = dict(map(
+ lambda item: (item[0], item[1].to_dict())
+ if hasattr(item[1], "to_dict") else item,
+ value.items()
+ ))
+ else:
+ result[attr] = value
+ if issubclass(GenericUpstreamRequestPatch, dict):
+ for key, value in self.items():
+ result[key] = value
+
+ return result
+
+ def to_str(self):
+ """Returns the string representation of the model"""
+ return pprint.pformat(self.to_dict())
+
+ def __repr__(self):
+ """For `print` and `pprint`"""
+ return self.to_str()
+
+ def __eq__(self, other):
+ """Returns true if both objects are equal"""
+ if not isinstance(other, GenericUpstreamRequestPatch):
+ return False
+
+ return self.to_dict() == other.to_dict()
+
+ def __ne__(self, other):
+ """Returns true if both objects are not equal"""
+ if not isinstance(other, GenericUpstreamRequestPatch):
+ return True
+
+ return self.to_dict() != other.to_dict()
+
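The new `GenericUpstreamRequestPatch` model above enforces its field constraints client-side (name pattern and 1-64 length, priority 1-32767, prefix and URL lengths). A minimal sketch of how it might be used, assuming the module path follows the same convention as the other generated models in this diff and that client-side validation is enabled; the host name is illustrative only:

```python
from cloudsmith_api.models.generic_upstream_request_patch import GenericUpstreamRequestPatch

patch = GenericUpstreamRequestPatch(
    name="Internal file mirror",       # 1-64 chars, matching ^\w[\w \-'./()]+$
    priority=1,                        # 1..32767; lower values are resolved first
    upstream_prefix="mirror",          # requests containing this prefix are routed here
    upstream_url="https://files.example.com/artifacts",  # hypothetical upstream root
    verify_ssl=True,
)

# Out-of-range values are rejected before any request is sent:
try:
    GenericUpstreamRequestPatch(priority=0)
except ValueError as exc:
    print(exc)  # "Invalid value for `priority`, must be a value greater than or equal to `1`"
```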
diff --git a/bindings/python/src/cloudsmith_api/models/maven_upstream.py b/bindings/python/src/cloudsmith_api/models/maven_upstream.py
index 16cbc836..6dace370 100644
--- a/bindings/python/src/cloudsmith_api/models/maven_upstream.py
+++ b/bindings/python/src/cloudsmith_api/models/maven_upstream.py
@@ -59,6 +59,7 @@ class MavenUpstream(object):
'pending_validation': 'bool',
'priority': 'int',
'slug_perm': 'str',
+ 'trust_level': 'str',
'updated_at': 'datetime',
'upstream_url': 'str',
'verification_status': 'str',
@@ -92,13 +93,14 @@ class MavenUpstream(object):
'pending_validation': 'pending_validation',
'priority': 'priority',
'slug_perm': 'slug_perm',
+ 'trust_level': 'trust_level',
'updated_at': 'updated_at',
'upstream_url': 'upstream_url',
'verification_status': 'verification_status',
'verify_ssl': 'verify_ssl'
}
- def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, available=None, can_reindex=None, created_at=None, disable_reason='N/A', disable_reason_text=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, gpg_key_fingerprint_short=None, gpg_key_inline=None, gpg_key_url=None, gpg_verification='Allow All', has_failed_signature_verification=None, index_package_count=None, index_status=None, is_active=None, last_indexed=None, mode='Proxy Only', name=None, pending_validation=None, priority=None, slug_perm=None, updated_at=None, upstream_url=None, verification_status='Unknown', verify_ssl=None, _configuration=None): # noqa: E501
+ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, available=None, can_reindex=None, created_at=None, disable_reason='N/A', disable_reason_text=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, gpg_key_fingerprint_short=None, gpg_key_inline=None, gpg_key_url=None, gpg_verification='Allow All', has_failed_signature_verification=None, index_package_count=None, index_status=None, is_active=None, last_indexed=None, mode='Proxy Only', name=None, pending_validation=None, priority=None, slug_perm=None, trust_level='Trusted', updated_at=None, upstream_url=None, verification_status='Unknown', verify_ssl=None, _configuration=None): # noqa: E501
"""MavenUpstream - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -130,6 +132,7 @@ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, avail
self._pending_validation = None
self._priority = None
self._slug_perm = None
+ self._trust_level = None
self._updated_at = None
self._upstream_url = None
self._verification_status = None
@@ -187,6 +190,8 @@ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, avail
self.priority = priority
if slug_perm is not None:
self.slug_perm = slug_perm
+ if trust_level is not None:
+ self.trust_level = trust_level
if updated_at is not None:
self.updated_at = updated_at
self.upstream_url = upstream_url
@@ -865,6 +870,36 @@ def slug_perm(self, slug_perm):
self._slug_perm = slug_perm
+ @property
+ def trust_level(self):
+ """Gets the trust_level of this MavenUpstream.
+
+ Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+
+ :return: The trust_level of this MavenUpstream.
+ :rtype: str
+ """
+ return self._trust_level
+
+ @trust_level.setter
+ def trust_level(self, trust_level):
+ """Sets the trust_level of this MavenUpstream.
+
+ Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+
+ :param trust_level: The trust_level of this MavenUpstream.
+ :type: str
+ """
+ allowed_values = ["Trusted", "Untrusted"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ trust_level not in allowed_values):
+ raise ValueError(
+ "Invalid value for `trust_level` ({0}), must be one of {1}" # noqa: E501
+ .format(trust_level, allowed_values)
+ )
+
+ self._trust_level = trust_level
+
@property
def updated_at(self):
"""Gets the updated_at of this MavenUpstream.
diff --git a/bindings/python/src/cloudsmith_api/models/maven_upstream_request.py b/bindings/python/src/cloudsmith_api/models/maven_upstream_request.py
index 321be509..e63483a0 100644
--- a/bindings/python/src/cloudsmith_api/models/maven_upstream_request.py
+++ b/bindings/python/src/cloudsmith_api/models/maven_upstream_request.py
@@ -47,6 +47,7 @@ class MavenUpstreamRequest(object):
'mode': 'str',
'name': 'str',
'priority': 'int',
+ 'trust_level': 'str',
'upstream_url': 'str',
'verify_ssl': 'bool'
}
@@ -66,11 +67,12 @@ class MavenUpstreamRequest(object):
'mode': 'mode',
'name': 'name',
'priority': 'priority',
+ 'trust_level': 'trust_level',
'upstream_url': 'upstream_url',
'verify_ssl': 'verify_ssl'
}
- def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, gpg_key_inline=None, gpg_key_url=None, gpg_verification='Allow All', is_active=None, mode='Proxy Only', name=None, priority=None, upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
+ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, gpg_key_inline=None, gpg_key_url=None, gpg_verification='Allow All', is_active=None, mode='Proxy Only', name=None, priority=None, trust_level='Trusted', upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
"""MavenUpstreamRequest - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -90,6 +92,7 @@ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra
self._mode = None
self._name = None
self._priority = None
+ self._trust_level = None
self._upstream_url = None
self._verify_ssl = None
self.discriminator = None
@@ -121,6 +124,8 @@ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra
self.name = name
if priority is not None:
self.priority = priority
+ if trust_level is not None:
+ self.trust_level = trust_level
self.upstream_url = upstream_url
if verify_ssl is not None:
self.verify_ssl = verify_ssl
@@ -518,6 +523,36 @@ def priority(self, priority):
self._priority = priority
+ @property
+ def trust_level(self):
+ """Gets the trust_level of this MavenUpstreamRequest.
+
+ Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+
+ :return: The trust_level of this MavenUpstreamRequest.
+ :rtype: str
+ """
+ return self._trust_level
+
+ @trust_level.setter
+ def trust_level(self, trust_level):
+ """Sets the trust_level of this MavenUpstreamRequest.
+
+ Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+
+ :param trust_level: The trust_level of this MavenUpstreamRequest.
+ :type: str
+ """
+ allowed_values = ["Trusted", "Untrusted"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ trust_level not in allowed_values):
+ raise ValueError(
+ "Invalid value for `trust_level` ({0}), must be one of {1}" # noqa: E501
+ .format(trust_level, allowed_values)
+ )
+
+ self._trust_level = trust_level
+
@property
def upstream_url(self):
"""Gets the upstream_url of this MavenUpstreamRequest.
diff --git a/bindings/python/src/cloudsmith_api/models/maven_upstream_request_patch.py b/bindings/python/src/cloudsmith_api/models/maven_upstream_request_patch.py
index 80b08818..8c71cad8 100644
--- a/bindings/python/src/cloudsmith_api/models/maven_upstream_request_patch.py
+++ b/bindings/python/src/cloudsmith_api/models/maven_upstream_request_patch.py
@@ -47,6 +47,7 @@ class MavenUpstreamRequestPatch(object):
'mode': 'str',
'name': 'str',
'priority': 'int',
+ 'trust_level': 'str',
'upstream_url': 'str',
'verify_ssl': 'bool'
}
@@ -66,11 +67,12 @@ class MavenUpstreamRequestPatch(object):
'mode': 'mode',
'name': 'name',
'priority': 'priority',
+ 'trust_level': 'trust_level',
'upstream_url': 'upstream_url',
'verify_ssl': 'verify_ssl'
}
- def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, gpg_key_inline=None, gpg_key_url=None, gpg_verification='Allow All', is_active=None, mode='Proxy Only', name=None, priority=None, upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
+ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra_header_1=None, extra_header_2=None, extra_value_1=None, extra_value_2=None, gpg_key_inline=None, gpg_key_url=None, gpg_verification='Allow All', is_active=None, mode='Proxy Only', name=None, priority=None, trust_level='Trusted', upstream_url=None, verify_ssl=None, _configuration=None): # noqa: E501
"""MavenUpstreamRequestPatch - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -90,6 +92,7 @@ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra
self._mode = None
self._name = None
self._priority = None
+ self._trust_level = None
self._upstream_url = None
self._verify_ssl = None
self.discriminator = None
@@ -122,6 +125,8 @@ def __init__(self, auth_mode='None', auth_secret=None, auth_username=None, extra
self.name = name
if priority is not None:
self.priority = priority
+ if trust_level is not None:
+ self.trust_level = trust_level
if upstream_url is not None:
self.upstream_url = upstream_url
if verify_ssl is not None:
@@ -518,6 +523,36 @@ def priority(self, priority):
self._priority = priority
+ @property
+ def trust_level(self):
+ """Gets the trust_level of this MavenUpstreamRequestPatch.
+
+ Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+
+ :return: The trust_level of this MavenUpstreamRequestPatch.
+ :rtype: str
+ """
+ return self._trust_level
+
+ @trust_level.setter
+ def trust_level(self, trust_level):
+ """Sets the trust_level of this MavenUpstreamRequestPatch.
+
+ Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+
+ :param trust_level: The trust_level of this MavenUpstreamRequestPatch.
+ :type: str
+ """
+ allowed_values = ["Trusted", "Untrusted"] # noqa: E501
+ if (self._configuration.client_side_validation and
+ trust_level not in allowed_values):
+ raise ValueError(
+ "Invalid value for `trust_level` ({0}), must be one of {1}" # noqa: E501
+ .format(trust_level, allowed_values)
+ )
+
+ self._trust_level = trust_level
+
@property
def upstream_url(self):
"""Gets the upstream_url of this MavenUpstreamRequestPatch.
diff --git a/bindings/python/src/cloudsmith_api/models/organization_team.py b/bindings/python/src/cloudsmith_api/models/organization_team.py
index 4a4ac652..9aa4fb0c 100644
--- a/bindings/python/src/cloudsmith_api/models/organization_team.py
+++ b/bindings/python/src/cloudsmith_api/models/organization_team.py
@@ -75,6 +75,7 @@ def __init__(self, description=None, name=None, slug=None, slug_perm=None, visib
def description(self):
"""Gets the description of this OrganizationTeam.
+ A detailed description of the team.
:return: The description of this OrganizationTeam.
:rtype: str
@@ -85,16 +86,14 @@ def description(self):
def description(self, description):
"""Sets the description of this OrganizationTeam.
+ A detailed description of the team.
:param description: The description of this OrganizationTeam.
:type: str
"""
if (self._configuration.client_side_validation and
- description is not None and len(description) > 140):
- raise ValueError("Invalid value for `description`, length must be less than or equal to `140`") # noqa: E501
- if (self._configuration.client_side_validation and
- description is not None and len(description) < 1):
- raise ValueError("Invalid value for `description`, length must be greater than or equal to `1`") # noqa: E501
+ description is not None and len(description) > 200):
+ raise ValueError("Invalid value for `description`, length must be less than or equal to `200`") # noqa: E501
self._description = description
diff --git a/bindings/python/src/cloudsmith_api/models/organization_team_request.py b/bindings/python/src/cloudsmith_api/models/organization_team_request.py
index a970e922..9e2aec95 100644
--- a/bindings/python/src/cloudsmith_api/models/organization_team_request.py
+++ b/bindings/python/src/cloudsmith_api/models/organization_team_request.py
@@ -70,6 +70,7 @@ def __init__(self, description=None, name=None, slug=None, visibility='Visible',
def description(self):
"""Gets the description of this OrganizationTeamRequest.
+ A detailed description of the team.
:return: The description of this OrganizationTeamRequest.
:rtype: str
@@ -80,16 +81,14 @@ def description(self):
def description(self, description):
"""Sets the description of this OrganizationTeamRequest.
+ A detailed description of the team.
:param description: The description of this OrganizationTeamRequest.
:type: str
"""
if (self._configuration.client_side_validation and
- description is not None and len(description) > 140):
- raise ValueError("Invalid value for `description`, length must be less than or equal to `140`") # noqa: E501
- if (self._configuration.client_side_validation and
- description is not None and len(description) < 1):
- raise ValueError("Invalid value for `description`, length must be greater than or equal to `1`") # noqa: E501
+ description is not None and len(description) > 200):
+ raise ValueError("Invalid value for `description`, length must be less than or equal to `200`") # noqa: E501
self._description = description
diff --git a/bindings/python/src/cloudsmith_api/models/organization_team_request_patch.py b/bindings/python/src/cloudsmith_api/models/organization_team_request_patch.py
index 72e75139..d8b80aa3 100644
--- a/bindings/python/src/cloudsmith_api/models/organization_team_request_patch.py
+++ b/bindings/python/src/cloudsmith_api/models/organization_team_request_patch.py
@@ -71,6 +71,7 @@ def __init__(self, description=None, name=None, slug=None, visibility='Visible',
def description(self):
"""Gets the description of this OrganizationTeamRequestPatch.
+ A detailed description of the team.
:return: The description of this OrganizationTeamRequestPatch.
:rtype: str
@@ -81,16 +82,14 @@ def description(self):
def description(self, description):
"""Sets the description of this OrganizationTeamRequestPatch.
+ A detailed description of the team.
:param description: The description of this OrganizationTeamRequestPatch.
:type: str
"""
if (self._configuration.client_side_validation and
- description is not None and len(description) > 140):
- raise ValueError("Invalid value for `description`, length must be less than or equal to `140`") # noqa: E501
- if (self._configuration.client_side_validation and
- description is not None and len(description) < 1):
- raise ValueError("Invalid value for `description`, length must be greater than or equal to `1`") # noqa: E501
+ description is not None and len(description) > 200):
+ raise ValueError("Invalid value for `description`, length must be less than or equal to `200`") # noqa: E501
self._description = description
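The team `description` changes above replace the old 1-140 bounds with a single 200-character ceiling, so empty descriptions now pass client-side validation. A small sketch, assuming client-side validation is enabled and using an illustrative team name:

```python
from cloudsmith_api.models.organization_team_request import OrganizationTeamRequest

team = OrganizationTeamRequest(
    name="Platform Engineering",
    description="",  # no longer rejected for being shorter than 1 character
)

try:
    OrganizationTeamRequest(name="Platform Engineering", description="x" * 201)
except ValueError as exc:
    print(exc)  # "Invalid value for `description`, length must be less than or equal to `200`"
```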
diff --git a/bindings/python/src/cloudsmith_api/models/package.py b/bindings/python/src/cloudsmith_api/models/package.py
index 55f07e81..931d46fa 100644
--- a/bindings/python/src/cloudsmith_api/models/package.py
+++ b/bindings/python/src/cloudsmith_api/models/package.py
@@ -49,6 +49,7 @@ class Package(object):
'epoch': 'int',
'extension': 'str',
'filename': 'str',
+ 'filepath': 'str',
'files': 'list[PackageFile]',
'format': 'str',
'format_url': 'str',
@@ -136,6 +137,7 @@ class Package(object):
'epoch': 'epoch',
'extension': 'extension',
'filename': 'filename',
+ 'filepath': 'filepath',
'files': 'files',
'format': 'format',
'format_url': 'format_url',
@@ -206,7 +208,7 @@ class Package(object):
'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
}
- def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, filepath=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
"""Package - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -228,6 +230,7 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self._epoch = None
self._extension = None
self._filename = None
+ self._filepath = None
self._files = None
self._format = None
self._format_url = None
@@ -330,6 +333,8 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self.extension = extension
if filename is not None:
self.filename = filename
+ if filepath is not None:
+ self.filepath = filepath
if files is not None:
self.files = files
if format is not None:
@@ -812,6 +817,32 @@ def filename(self, filename):
self._filename = filename
+ @property
+ def filepath(self):
+ """Gets the filepath of this Package.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :return: The filepath of this Package.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this Package.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :param filepath: The filepath of this Package.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
@property
def files(self):
"""Gets the files of this Package.
diff --git a/bindings/python/src/cloudsmith_api/models/package_copy.py b/bindings/python/src/cloudsmith_api/models/package_copy.py
index 850d2922..5b0d9735 100644
--- a/bindings/python/src/cloudsmith_api/models/package_copy.py
+++ b/bindings/python/src/cloudsmith_api/models/package_copy.py
@@ -49,6 +49,7 @@ class PackageCopy(object):
'epoch': 'int',
'extension': 'str',
'filename': 'str',
+ 'filepath': 'str',
'files': 'list[PackageFile]',
'format': 'str',
'format_url': 'str',
@@ -136,6 +137,7 @@ class PackageCopy(object):
'epoch': 'epoch',
'extension': 'extension',
'filename': 'filename',
+ 'filepath': 'filepath',
'files': 'files',
'format': 'format',
'format_url': 'format_url',
@@ -206,7 +208,7 @@ class PackageCopy(object):
'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
}
- def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, filepath=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
"""PackageCopy - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -228,6 +230,7 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self._epoch = None
self._extension = None
self._filename = None
+ self._filepath = None
self._files = None
self._format = None
self._format_url = None
@@ -330,6 +333,8 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self.extension = extension
if filename is not None:
self.filename = filename
+ if filepath is not None:
+ self.filepath = filepath
if files is not None:
self.files = files
if format is not None:
@@ -812,6 +817,32 @@ def filename(self, filename):
self._filename = filename
+ @property
+ def filepath(self):
+ """Gets the filepath of this PackageCopy.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :return: The filepath of this PackageCopy.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this PackageCopy.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :param filepath: The filepath of this PackageCopy.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
@property
def files(self):
"""Gets the files of this PackageCopy.
diff --git a/bindings/python/src/cloudsmith_api/models/package_copy_request.py b/bindings/python/src/cloudsmith_api/models/package_copy_request.py
index c3c4e223..0dbe90c0 100644
--- a/bindings/python/src/cloudsmith_api/models/package_copy_request.py
+++ b/bindings/python/src/cloudsmith_api/models/package_copy_request.py
@@ -60,6 +60,7 @@ def __init__(self, destination=None, republish=None, _configuration=None): # no
def destination(self):
"""Gets the destination of this PackageCopyRequest.
+ The name of the destination repository without the namespace.
:return: The destination of this PackageCopyRequest.
:rtype: str
@@ -70,6 +71,7 @@ def destination(self):
def destination(self, destination):
"""Sets the destination of this PackageCopyRequest.
+ The name of the destination repository without the namespace.
:param destination: The destination of this PackageCopyRequest.
:type: str
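The clarified `destination` description applies to both the copy and move request models: it is the destination repository name only, without the owner/namespace prefix. A minimal sketch with an illustrative repository name:

```python
from cloudsmith_api.models.package_copy_request import PackageCopyRequest

# "staging" here is just the repository slug, not "owner/staging".
copy_req = PackageCopyRequest(destination="staging", republish=True)
```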
diff --git a/bindings/python/src/cloudsmith_api/models/package_move.py b/bindings/python/src/cloudsmith_api/models/package_move.py
index 267e15be..45c75aa2 100644
--- a/bindings/python/src/cloudsmith_api/models/package_move.py
+++ b/bindings/python/src/cloudsmith_api/models/package_move.py
@@ -49,6 +49,7 @@ class PackageMove(object):
'epoch': 'int',
'extension': 'str',
'filename': 'str',
+ 'filepath': 'str',
'files': 'list[PackageFile]',
'format': 'str',
'format_url': 'str',
@@ -136,6 +137,7 @@ class PackageMove(object):
'epoch': 'epoch',
'extension': 'extension',
'filename': 'filename',
+ 'filepath': 'filepath',
'files': 'files',
'format': 'format',
'format_url': 'format_url',
@@ -206,7 +208,7 @@ class PackageMove(object):
'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
}
- def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, filepath=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
"""PackageMove - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -228,6 +230,7 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self._epoch = None
self._extension = None
self._filename = None
+ self._filepath = None
self._files = None
self._format = None
self._format_url = None
@@ -330,6 +333,8 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self.extension = extension
if filename is not None:
self.filename = filename
+ if filepath is not None:
+ self.filepath = filepath
if files is not None:
self.files = files
if format is not None:
@@ -812,6 +817,32 @@ def filename(self, filename):
self._filename = filename
+ @property
+ def filepath(self):
+ """Gets the filepath of this PackageMove.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :return: The filepath of this PackageMove.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this PackageMove.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :param filepath: The filepath of this PackageMove.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
@property
def files(self):
"""Gets the files of this PackageMove.
diff --git a/bindings/python/src/cloudsmith_api/models/package_move_request.py b/bindings/python/src/cloudsmith_api/models/package_move_request.py
index 43d70c90..287049fa 100644
--- a/bindings/python/src/cloudsmith_api/models/package_move_request.py
+++ b/bindings/python/src/cloudsmith_api/models/package_move_request.py
@@ -55,6 +55,7 @@ def __init__(self, destination=None, _configuration=None): # noqa: E501
def destination(self):
"""Gets the destination of this PackageMoveRequest.
+ The name of the destination repository without the namespace.
:return: The destination of this PackageMoveRequest.
:rtype: str
@@ -65,6 +66,7 @@ def destination(self):
def destination(self, destination):
"""Sets the destination of this PackageMoveRequest.
+ The name of the destination repository without the namespace.
:param destination: The destination of this PackageMoveRequest.
:type: str
diff --git a/bindings/python/src/cloudsmith_api/models/package_quarantine.py b/bindings/python/src/cloudsmith_api/models/package_quarantine.py
index 6e7cba9c..163cb01f 100644
--- a/bindings/python/src/cloudsmith_api/models/package_quarantine.py
+++ b/bindings/python/src/cloudsmith_api/models/package_quarantine.py
@@ -49,6 +49,7 @@ class PackageQuarantine(object):
'epoch': 'int',
'extension': 'str',
'filename': 'str',
+ 'filepath': 'str',
'files': 'list[PackageFile]',
'format': 'str',
'format_url': 'str',
@@ -135,6 +136,7 @@ class PackageQuarantine(object):
'epoch': 'epoch',
'extension': 'extension',
'filename': 'filename',
+ 'filepath': 'filepath',
'files': 'files',
'format': 'format',
'format_url': 'format_url',
@@ -204,7 +206,7 @@ class PackageQuarantine(object):
'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
}
- def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, filepath=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
"""PackageQuarantine - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -226,6 +228,7 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self._epoch = None
self._extension = None
self._filename = None
+ self._filepath = None
self._files = None
self._format = None
self._format_url = None
@@ -327,6 +330,8 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self.extension = extension
if filename is not None:
self.filename = filename
+ if filepath is not None:
+ self.filepath = filepath
if files is not None:
self.files = files
if format is not None:
@@ -807,6 +812,32 @@ def filename(self, filename):
self._filename = filename
+ @property
+ def filepath(self):
+ """Gets the filepath of this PackageQuarantine.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :return: The filepath of this PackageQuarantine.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this PackageQuarantine.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :param filepath: The filepath of this PackageQuarantine.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
@property
def files(self):
"""Gets the files of this PackageQuarantine.
diff --git a/bindings/python/src/cloudsmith_api/models/package_resync.py b/bindings/python/src/cloudsmith_api/models/package_resync.py
index 43e3efcc..a82e9f5b 100644
--- a/bindings/python/src/cloudsmith_api/models/package_resync.py
+++ b/bindings/python/src/cloudsmith_api/models/package_resync.py
@@ -49,6 +49,7 @@ class PackageResync(object):
'epoch': 'int',
'extension': 'str',
'filename': 'str',
+ 'filepath': 'str',
'files': 'list[PackageFile]',
'format': 'str',
'format_url': 'str',
@@ -136,6 +137,7 @@ class PackageResync(object):
'epoch': 'epoch',
'extension': 'extension',
'filename': 'filename',
+ 'filepath': 'filepath',
'files': 'files',
'format': 'format',
'format_url': 'format_url',
@@ -206,7 +208,7 @@ class PackageResync(object):
'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
}
- def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, filepath=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
"""PackageResync - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -228,6 +230,7 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self._epoch = None
self._extension = None
self._filename = None
+ self._filepath = None
self._files = None
self._format = None
self._format_url = None
@@ -330,6 +333,8 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self.extension = extension
if filename is not None:
self.filename = filename
+ if filepath is not None:
+ self.filepath = filepath
if files is not None:
self.files = files
if format is not None:
@@ -812,6 +817,32 @@ def filename(self, filename):
self._filename = filename
+ @property
+ def filepath(self):
+ """Gets the filepath of this PackageResync.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :return: The filepath of this PackageResync.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this PackageResync.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :param filepath: The filepath of this PackageResync.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
@property
def files(self):
"""Gets the files of this PackageResync.
diff --git a/bindings/python/src/cloudsmith_api/models/package_tag.py b/bindings/python/src/cloudsmith_api/models/package_tag.py
index 3a047c57..e3401d51 100644
--- a/bindings/python/src/cloudsmith_api/models/package_tag.py
+++ b/bindings/python/src/cloudsmith_api/models/package_tag.py
@@ -49,6 +49,7 @@ class PackageTag(object):
'epoch': 'int',
'extension': 'str',
'filename': 'str',
+ 'filepath': 'str',
'files': 'list[PackageFile]',
'format': 'str',
'format_url': 'str',
@@ -136,6 +137,7 @@ class PackageTag(object):
'epoch': 'epoch',
'extension': 'extension',
'filename': 'filename',
+ 'filepath': 'filepath',
'files': 'files',
'format': 'format',
'format_url': 'format_url',
@@ -206,7 +208,7 @@ class PackageTag(object):
'vulnerability_scan_results_url': 'vulnerability_scan_results_url'
}
- def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_immutable=False, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
+ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum_sha1=None, checksum_sha256=None, checksum_sha512=None, dependencies_checksum_md5=None, dependencies_url=None, description=None, display_name=None, distro=None, distro_version=None, downloads=None, epoch=None, extension=None, filename=None, filepath=None, files=None, format=None, format_url=None, freeable_storage=None, fully_qualified_name=None, identifier_perm=None, identifiers=None, indexed=None, is_cancellable=None, is_copyable=None, is_deleteable=None, is_downloadable=None, is_immutable=False, is_moveable=None, is_quarantinable=None, is_quarantined=None, is_resyncable=None, is_security_scannable=None, is_sync_awaiting=None, is_sync_completed=None, is_sync_failed=None, is_sync_in_flight=None, is_sync_in_progress=None, license=None, name=None, namespace=None, namespace_url=None, num_files=None, origin_repository=None, origin_repository_url=None, package_type=None, policy_violated=None, raw_license=None, release=None, repository=None, repository_url=None, security_scan_completed_at=None, security_scan_started_at=None, security_scan_status='Awaiting Security Scan', security_scan_status_updated_at=None, self_html_url=None, self_url=None, signature_url=None, size=None, slug=None, slug_perm=None, spdx_license=None, stage=None, stage_str=None, stage_updated_at=None, status=None, status_reason=None, status_str=None, status_updated_at=None, status_url=None, subtype=None, summary=None, sync_finished_at=None, sync_progress=None, tags_automatic=None, tags_immutable=None, type_display=None, uploaded_at=None, uploader=None, uploader_url=None, version=None, version_orig=None, vulnerability_scan_results_url=None, _configuration=None): # noqa: E501
"""PackageTag - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
@@ -228,6 +230,7 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self._epoch = None
self._extension = None
self._filename = None
+ self._filepath = None
self._files = None
self._format = None
self._format_url = None
@@ -330,6 +333,8 @@ def __init__(self, architectures=None, cdn_url=None, checksum_md5=None, checksum
self.extension = extension
if filename is not None:
self.filename = filename
+ if filepath is not None:
+ self.filepath = filepath
if files is not None:
self.files = files
if format is not None:
@@ -812,6 +817,32 @@ def filename(self, filename):
self._filename = filename
+ @property
+ def filepath(self):
+ """Gets the filepath of this PackageTag.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :return: The filepath of this PackageTag.
+ :rtype: str
+ """
+ return self._filepath
+
+ @filepath.setter
+ def filepath(self, filepath):
+ """Sets the filepath of this PackageTag.
+
+ Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+
+ :param filepath: The filepath of this PackageTag.
+ :type: str
+ """
+ if (self._configuration.client_side_validation and
+ filepath is not None and len(filepath) < 1):
+ raise ValueError("Invalid value for `filepath`, length must be greater than or equal to `1`") # noqa: E501
+
+ self._filepath = filepath
+
@property
def files(self):
"""Gets the files of this PackageTag.
diff --git a/bindings/python/src/docs/FormatSupport.md b/bindings/python/src/docs/FormatSupport.md
index 83707677..0722fff4 100644
--- a/bindings/python/src/docs/FormatSupport.md
+++ b/bindings/python/src/docs/FormatSupport.md
@@ -6,6 +6,7 @@ Name | Type | Description | Notes
**dependencies** | **bool** | If true the package format supports dependencies |
**distributions** | **bool** | If true the package format supports distributions |
**file_lists** | **bool** | If true the package format supports file lists |
+**filepaths** | **bool** | If true the package format supports filepaths |
**metadata** | **bool** | If true the package format supports metadata |
**upstreams** | [**FormatSupportUpstream**](FormatSupportUpstream.md) | |
**versioning** | **bool** | If true the package format supports versioning |
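The `filepaths` flag sits alongside the other capability booleans on `FormatSupport`, so clients can feature-detect before attempting a path-style upload. A rough sketch only: it assumes the `FormatsApi.formats_read` operation, and it guards the attribute lookup because the exact field name that exposes `FormatSupport` on the returned `Format` model is an assumption here.

```python
import cloudsmith_api
from cloudsmith_api.rest import ApiException

configuration = cloudsmith_api.Configuration()
configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'

api_instance = cloudsmith_api.FormatsApi(cloudsmith_api.ApiClient(configuration))

try:
    fmt = api_instance.formats_read('generic')  # slug of the package format
    # Assumed attribute name for the FormatSupport object; check the Format docs.
    support = getattr(fmt, 'supports', None)
    if support is not None:
        print('filepaths supported:', support.filepaths)
except ApiException as e:
    print("Exception when calling FormatsApi->formats_read: %s\n" % e)
```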
diff --git a/bindings/python/src/docs/GenericPackageUpload.md b/bindings/python/src/docs/GenericPackageUpload.md
new file mode 100644
index 00000000..e263bedd
--- /dev/null
+++ b/bindings/python/src/docs/GenericPackageUpload.md
@@ -0,0 +1,92 @@
+# GenericPackageUpload
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**architectures** | [**list[Architecture]**](Architecture.md) | | [optional]
+**cdn_url** | **str** | | [optional]
+**checksum_md5** | **str** | | [optional]
+**checksum_sha1** | **str** | | [optional]
+**checksum_sha256** | **str** | | [optional]
+**checksum_sha512** | **str** | | [optional]
+**dependencies_checksum_md5** | **str** | A checksum of all of the package's dependencies. | [optional]
+**dependencies_url** | **str** | | [optional]
+**description** | **str** | A textual description of this package. | [optional]
+**display_name** | **str** | | [optional]
+**distro** | [**Distribution**](Distribution.md) | | [optional]
+**distro_version** | [**DistributionVersion**](DistributionVersion.md) | | [optional]
+**downloads** | **int** | | [optional]
+**epoch** | **int** | The epoch of the package version (if any). | [optional]
+**extension** | **str** | | [optional]
+**filename** | **str** | | [optional]
+**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
+**format** | **str** | | [optional]
+**format_url** | **str** | | [optional]
+**freeable_storage** | **int** | Amount of storage that will be freed if this package is deleted | [optional]
+**fully_qualified_name** | **str** | | [optional]
+**identifier_perm** | **str** | Unique and permanent identifier for the package. | [optional]
+**identifiers** | **dict(str, str)** | Return a map of identifier field names and their values. | [optional]
+**indexed** | **bool** | | [optional]
+**is_cancellable** | **bool** | | [optional]
+**is_copyable** | **bool** | | [optional]
+**is_deleteable** | **bool** | | [optional]
+**is_downloadable** | **bool** | | [optional]
+**is_moveable** | **bool** | | [optional]
+**is_quarantinable** | **bool** | | [optional]
+**is_quarantined** | **bool** | | [optional]
+**is_resyncable** | **bool** | | [optional]
+**is_security_scannable** | **bool** | | [optional]
+**is_sync_awaiting** | **bool** | | [optional]
+**is_sync_completed** | **bool** | | [optional]
+**is_sync_failed** | **bool** | | [optional]
+**is_sync_in_flight** | **bool** | | [optional]
+**is_sync_in_progress** | **bool** | | [optional]
+**license** | **str** | The license of this package. | [optional]
+**name** | **str** | The name of this package. | [optional]
+**namespace** | **str** | | [optional]
+**namespace_url** | **str** | | [optional]
+**num_files** | **int** | | [optional]
+**origin_repository** | **str** | | [optional]
+**origin_repository_url** | **str** | | [optional]
+**package_type** | **int** | The type of package contents. | [optional]
+**policy_violated** | **bool** | Whether or not the package has violated any policy. | [optional]
+**raw_license** | **str** | The raw license string. | [optional]
+**release** | **str** | The release of the package version (if any). | [optional]
+**repository** | **str** | | [optional]
+**repository_url** | **str** | | [optional]
+**security_scan_completed_at** | **datetime** | The datetime the security scanning was completed. | [optional]
+**security_scan_started_at** | **datetime** | The datetime the security scanning was started. | [optional]
+**security_scan_status** | **str** | | [optional] [default to 'Awaiting Security Scan']
+**security_scan_status_updated_at** | **datetime** | The datetime the security scanning status was updated. | [optional]
+**self_html_url** | **str** | | [optional]
+**self_url** | **str** | | [optional]
+**signature_url** | **str** | | [optional]
+**size** | **int** | The calculated size of the package. | [optional]
+**slug** | **str** | The public unique identifier for the package. | [optional]
+**slug_perm** | **str** | | [optional]
+**spdx_license** | **str** | The SPDX license identifier for this package. | [optional]
+**stage** | **int** | The synchronisation (in progress) stage of the package. | [optional]
+**stage_str** | **str** | | [optional]
+**stage_updated_at** | **datetime** | The datetime the package stage was updated at. | [optional]
+**status** | **int** | The synchronisation status of the package. | [optional]
+**status_reason** | **str** | A textual description for the synchronisation status reason (if any). | [optional] 
+**status_str** | **str** | | [optional]
+**status_updated_at** | **datetime** | The datetime the package status was updated at. | [optional]
+**status_url** | **str** | | [optional]
+**subtype** | **str** | | [optional]
+**summary** | **str** | A one-liner synopsis of this package. | [optional]
+**sync_finished_at** | **datetime** | The datetime the package sync was finished at. | [optional]
+**sync_progress** | **int** | Synchronisation progress (from 0-100) | [optional]
+**tags_automatic** | [**Tags**](Tags.md) | | [optional]
+**tags_immutable** | [**Tags**](Tags.md) | | [optional]
+**type_display** | **str** | | [optional]
+**uploaded_at** | **datetime** | The date this package was uploaded. | [optional]
+**uploader** | **str** | | [optional]
+**uploader_url** | **str** | | [optional]
+**version** | **str** | The raw version for this package. | [optional]
+**version_orig** | **str** | | [optional]
+**vulnerability_scan_results_url** | **str** | | [optional]
+
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/bindings/python/src/docs/GenericPackageUploadRequest.md b/bindings/python/src/docs/GenericPackageUploadRequest.md
new file mode 100644
index 00000000..0acaf22a
--- /dev/null
+++ b/bindings/python/src/docs/GenericPackageUploadRequest.md
@@ -0,0 +1,15 @@
+# GenericPackageUploadRequest
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**filepath** | **str** | The full filepath of the package including filename. |
+**name** | **str** | The name of this package. | [optional]
+**package_file** | **str** | The primary file for the package. |
+**republish** | **bool** | If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate. | [optional]
+**tags** | **str** | A comma-separated values list of tags to add to the package. | [optional]
+**version** | **str** | The raw version for this package. | [optional]
+
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
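Tying the two new models together: the request carries the mandatory `filepath` and `package_file` fields, and the call returns a `GenericPackageUpload` whose fields are listed above. A sketch under stated assumptions: the file contents are presumed to have been uploaded beforehand (for example via the Files API), with `package_file` set to the identifier returned by that step; the identifier shown here is a placeholder.

```python
import cloudsmith_api
from cloudsmith_api.rest import ApiException

configuration = cloudsmith_api.Configuration()
configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'

api_instance = cloudsmith_api.PackagesApi(cloudsmith_api.ApiClient(configuration))

data = cloudsmith_api.GenericPackageUploadRequest(
    filepath='bin/utils/tool.tar.gz',  # full path, including filename (required)
    package_file='FILE_IDENTIFIER',    # placeholder: identifier of the already-uploaded file (required)
    version='1.2.3',                   # optional raw version
    tags='utility,cli',                # optional comma-separated tags
    republish=False,                   # flag duplicates rather than overwrite
)

try:
    response = api_instance.packages_upload_generic('OWNER', 'REPO', data=data)
    # A few of the GenericPackageUpload fields documented above:
    print(response.slug_perm, response.filename, response.status_str, response.self_html_url)
except ApiException as e:
    print("Exception when calling PackagesApi->packages_upload_generic: %s\n" % e)
```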
diff --git a/bindings/python/src/docs/GenericUpstream.md b/bindings/python/src/docs/GenericUpstream.md
new file mode 100644
index 00000000..692f7979
--- /dev/null
+++ b/bindings/python/src/docs/GenericUpstream.md
@@ -0,0 +1,35 @@
+# GenericUpstream
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**auth_mode** | **str** | The authentication mode to use when accessing this upstream. | [optional] [default to 'None']
+**auth_secret** | **str** | Secret to provide with requests to upstream. | [optional]
+**auth_username** | **str** | Username to provide with requests to upstream. | [optional]
+**available** | **str** | | [optional]
+**can_reindex** | **str** | | [optional]
+**created_at** | **datetime** | The datetime the upstream source was created. | [optional]
+**disable_reason** | **str** | | [optional] [default to 'N/A']
+**disable_reason_text** | **str** | Human-readable explanation of why this upstream is disabled | [optional]
+**extra_header_1** | **str** | The key for extra header #1 to send to upstream. | [optional]
+**extra_header_2** | **str** | The key for extra header #2 to send to upstream. | [optional]
+**extra_value_1** | **str** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extra_value_2** | **str** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**has_failed_signature_verification** | **str** | | [optional]
+**index_package_count** | **str** | The number of packages available in this upstream source | [optional]
+**index_status** | **str** | The current indexing status of this upstream source | [optional]
+**is_active** | **bool** | Whether or not this upstream is active and ready for requests. | [optional]
+**last_indexed** | **str** | The last time this upstream source was indexed | [optional]
+**mode** | **str** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
+**name** | **str** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
+**pending_validation** | **bool** | When true, this upstream source is pending validation. | [optional]
+**priority** | **int** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**slug_perm** | **str** | | [optional]
+**updated_at** | **datetime** | | [optional]
+**upstream_prefix** | **str** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstream_url** | **str** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
+**verify_ssl** | **bool** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/bindings/python/src/docs/GenericUpstreamRequest.md b/bindings/python/src/docs/GenericUpstreamRequest.md
new file mode 100644
index 00000000..6477a415
--- /dev/null
+++ b/bindings/python/src/docs/GenericUpstreamRequest.md
@@ -0,0 +1,23 @@
+# GenericUpstreamRequest
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**auth_mode** | **str** | The authentication mode to use when accessing this upstream. | [optional] [default to 'None']
+**auth_secret** | **str** | Secret to provide with requests to upstream. | [optional]
+**auth_username** | **str** | Username to provide with requests to upstream. | [optional]
+**extra_header_1** | **str** | The key for extra header #1 to send to upstream. | [optional]
+**extra_header_2** | **str** | The key for extra header #2 to send to upstream. | [optional]
+**extra_value_1** | **str** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extra_value_2** | **str** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**is_active** | **bool** | Whether or not this upstream is active and ready for requests. | [optional]
+**mode** | **str** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
+**name** | **str** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
+**priority** | **int** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**upstream_prefix** | **str** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstream_url** | **str** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
+**verify_ssl** | **bool** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
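Because Generic upstreams are kept separate by `upstream_prefix`, a create request typically sets the prefix alongside the two required fields. A minimal sketch, assuming an existing repository; the prefix, URL, and mode values are illustrative only.

```python
import cloudsmith_api
from cloudsmith_api.rest import ApiException

configuration = cloudsmith_api.Configuration()
configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'

api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))

data = cloudsmith_api.GenericUpstreamRequest(
    name='Example file server',                       # required descriptive name
    upstream_url='https://files.example.com/root/',   # required fully qualified URL
    upstream_prefix='example-files',                  # routes matching requests to this upstream
    mode='Cache and Proxy',                           # assumed mode label; 'Proxy Only' is the default
    verify_ssl=True,
)

try:
    upstream = api_instance.repos_upstream_generic_create('OWNER', 'REPO', data=data)
    print(upstream.slug_perm, upstream.is_active, upstream.index_status)
except ApiException as e:
    print("Exception when calling ReposApi->repos_upstream_generic_create: %s\n" % e)
```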
diff --git a/bindings/python/src/docs/GenericUpstreamRequestPatch.md b/bindings/python/src/docs/GenericUpstreamRequestPatch.md
new file mode 100644
index 00000000..046b3b54
--- /dev/null
+++ b/bindings/python/src/docs/GenericUpstreamRequestPatch.md
@@ -0,0 +1,23 @@
+# GenericUpstreamRequestPatch
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**auth_mode** | **str** | The authentication mode to use when accessing this upstream. | [optional] [default to 'None']
+**auth_secret** | **str** | Secret to provide with requests to upstream. | [optional]
+**auth_username** | **str** | Username to provide with requests to upstream. | [optional]
+**extra_header_1** | **str** | The key for extra header #1 to send to upstream. | [optional]
+**extra_header_2** | **str** | The key for extra header #2 to send to upstream. | [optional]
+**extra_value_1** | **str** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extra_value_2** | **str** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**is_active** | **bool** | Whether or not this upstream is active and ready for requests. | [optional]
+**mode** | **str** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
+**name** | **str** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. | [optional]
+**priority** | **int** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**upstream_prefix** | **str** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstream_url** | **str** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. | [optional]
+**verify_ssl** | **bool** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
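Every field on the patch model is optional, so a partial update only needs the fields being changed. A short sketch, assuming an existing upstream identified by its `slug_perm`, that temporarily disables it:

```python
import cloudsmith_api
from cloudsmith_api.rest import ApiException

configuration = cloudsmith_api.Configuration()
configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'

api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))

# Only the fields being changed need to be supplied on the patch model.
patch = cloudsmith_api.GenericUpstreamRequestPatch(is_active=False)

try:
    upstream = api_instance.repos_upstream_generic_partial_update(
        'OWNER', 'REPO', 'SLUG_PERM', data=patch)
    print(upstream.is_active)  # False once the update is applied
except ApiException as e:
    print("Exception when calling ReposApi->repos_upstream_generic_partial_update: %s\n" % e)
```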
diff --git a/bindings/python/src/docs/MavenUpstream.md b/bindings/python/src/docs/MavenUpstream.md
index a5c1d3fa..34d39a1e 100644
--- a/bindings/python/src/docs/MavenUpstream.md
+++ b/bindings/python/src/docs/MavenUpstream.md
@@ -29,6 +29,7 @@ Name | Type | Description | Notes
**pending_validation** | **bool** | When true, this upstream source is pending validation. | [optional]
**priority** | **int** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
**slug_perm** | **str** | | [optional]
+**trust_level** | **str** | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional] [default to 'Trusted']
**updated_at** | **datetime** | | [optional]
**upstream_url** | **str** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
**verification_status** | **str** | The signature verification status for this upstream. | [optional] [default to 'Unknown']
diff --git a/bindings/python/src/docs/MavenUpstreamRequest.md b/bindings/python/src/docs/MavenUpstreamRequest.md
index e0a47d5b..6d9dc410 100644
--- a/bindings/python/src/docs/MavenUpstreamRequest.md
+++ b/bindings/python/src/docs/MavenUpstreamRequest.md
@@ -17,6 +17,7 @@ Name | Type | Description | Notes
**mode** | **str** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
**name** | **str** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
**priority** | **int** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**trust_level** | **str** | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional] [default to 'Trusted']
**upstream_url** | **str** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
**verify_ssl** | **bool** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
diff --git a/bindings/python/src/docs/MavenUpstreamRequestPatch.md b/bindings/python/src/docs/MavenUpstreamRequestPatch.md
index c26b81f7..1c4c5a1c 100644
--- a/bindings/python/src/docs/MavenUpstreamRequestPatch.md
+++ b/bindings/python/src/docs/MavenUpstreamRequestPatch.md
@@ -17,6 +17,7 @@ Name | Type | Description | Notes
**mode** | **str** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
**name** | **str** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. | [optional]
**priority** | **int** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**trust_level** | **str** | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional] [default to 'Trusted']
**upstream_url** | **str** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. | [optional]
**verify_ssl** | **bool** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
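The new `trust_level` field is a plain string on the Maven upstream request models, so it can be supplied at create time like any other field. A sketch only: the accepted enum values are not listed in these tables, so the value below is an assumption; check the API spec for the exact strings (the generated default is 'Trusted').

```python
import cloudsmith_api
from cloudsmith_api.rest import ApiException

configuration = cloudsmith_api.Configuration()
configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'

api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))

data = cloudsmith_api.MavenUpstreamRequest(
    name='Maven Central',
    upstream_url='https://repo1.maven.org/maven2/',
    trust_level='Untrusted',  # assumed value; the field description recommends the untrusted level
)

try:
    upstream = api_instance.repos_upstream_maven_create('OWNER', 'REPO', data=data)
    print(upstream.trust_level)
except ApiException as e:
    print("Exception when calling ReposApi->repos_upstream_maven_create: %s\n" % e)
```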
diff --git a/bindings/python/src/docs/OrganizationTeam.md b/bindings/python/src/docs/OrganizationTeam.md
index 16a5c054..73e656ef 100644
--- a/bindings/python/src/docs/OrganizationTeam.md
+++ b/bindings/python/src/docs/OrganizationTeam.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **str** | | [optional]
+**description** | **str** | A detailed description of the team. | [optional]
**name** | **str** | A descriptive name for the team. |
**slug** | **str** | | [optional]
**slug_perm** | **str** | | [optional]
diff --git a/bindings/python/src/docs/OrganizationTeamRequest.md b/bindings/python/src/docs/OrganizationTeamRequest.md
index eece20db..c0df5c34 100644
--- a/bindings/python/src/docs/OrganizationTeamRequest.md
+++ b/bindings/python/src/docs/OrganizationTeamRequest.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **str** | | [optional]
+**description** | **str** | A detailed description of the team. | [optional]
**name** | **str** | A descriptive name for the team. |
**slug** | **str** | | [optional]
**visibility** | **str** | | [optional] [default to 'Visible']
diff --git a/bindings/python/src/docs/OrganizationTeamRequestPatch.md b/bindings/python/src/docs/OrganizationTeamRequestPatch.md
index 5d03ec78..404b2c97 100644
--- a/bindings/python/src/docs/OrganizationTeamRequestPatch.md
+++ b/bindings/python/src/docs/OrganizationTeamRequestPatch.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **str** | | [optional]
+**description** | **str** | A detailed description of the team. | [optional]
**name** | **str** | A descriptive name for the team. | [optional]
**slug** | **str** | | [optional]
**visibility** | **str** | | [optional] [default to 'Visible']
diff --git a/bindings/python/src/docs/Package.md b/bindings/python/src/docs/Package.md
index 5f048322..3f9fe5f3 100644
--- a/bindings/python/src/docs/Package.md
+++ b/bindings/python/src/docs/Package.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **int** | The epoch of the package version (if any). | [optional]
**extension** | **str** | | [optional]
**filename** | **str** | | [optional]
+**filepath** | **str** | Full path to the file, including filename e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
**format** | **str** | | [optional]
**format_url** | **str** | | [optional]
diff --git a/bindings/python/src/docs/PackageCopy.md b/bindings/python/src/docs/PackageCopy.md
index 4e4a9350..7968833f 100644
--- a/bindings/python/src/docs/PackageCopy.md
+++ b/bindings/python/src/docs/PackageCopy.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **int** | The epoch of the package version (if any). | [optional]
**extension** | **str** | | [optional]
**filename** | **str** | | [optional]
+**filepath** | **str** | Full path to the file, including filename e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
**format** | **str** | | [optional]
**format_url** | **str** | | [optional]
diff --git a/bindings/python/src/docs/PackageCopyRequest.md b/bindings/python/src/docs/PackageCopyRequest.md
index bcea1acc..43238c76 100644
--- a/bindings/python/src/docs/PackageCopyRequest.md
+++ b/bindings/python/src/docs/PackageCopyRequest.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**destination** | **str** | |
+**destination** | **str** | The name of the destination repository without the namespace. |
**republish** | **bool** | If true, the package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate. | [optional]
[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
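The `destination` field is the repository name only (no namespace), so a copy request needs just that plus the package identifier. A brief sketch, assuming the existing `packages_copy` operation; `packages_move` uses the same `destination` semantics via `PackageMoveRequest`.

```python
import cloudsmith_api
from cloudsmith_api.rest import ApiException

configuration = cloudsmith_api.Configuration()
configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'

api_instance = cloudsmith_api.PackagesApi(cloudsmith_api.ApiClient(configuration))

# Destination is the repository name only, without the owner/namespace.
data = cloudsmith_api.PackageCopyRequest(destination='target-repo', republish=False)

try:
    copied = api_instance.packages_copy('OWNER', 'SOURCE_REPO', 'PACKAGE_IDENTIFIER', data=data)
    print(copied.slug_perm, copied.filepath)
except ApiException as e:
    print("Exception when calling PackagesApi->packages_copy: %s\n" % e)
```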
diff --git a/bindings/python/src/docs/PackageMove.md b/bindings/python/src/docs/PackageMove.md
index 129e6232..bc4a49a1 100644
--- a/bindings/python/src/docs/PackageMove.md
+++ b/bindings/python/src/docs/PackageMove.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **int** | The epoch of the package version (if any). | [optional]
**extension** | **str** | | [optional]
**filename** | **str** | | [optional]
+**filepath** | **str** | Full path to the file, including filename e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
**format** | **str** | | [optional]
**format_url** | **str** | | [optional]
diff --git a/bindings/python/src/docs/PackageMoveRequest.md b/bindings/python/src/docs/PackageMoveRequest.md
index a680bdaa..6965de8d 100644
--- a/bindings/python/src/docs/PackageMoveRequest.md
+++ b/bindings/python/src/docs/PackageMoveRequest.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**destination** | **str** | |
+**destination** | **str** | The name of the destination repository without the namespace. |
[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
diff --git a/bindings/python/src/docs/PackageQuarantine.md b/bindings/python/src/docs/PackageQuarantine.md
index 23131296..ed15f799 100644
--- a/bindings/python/src/docs/PackageQuarantine.md
+++ b/bindings/python/src/docs/PackageQuarantine.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **int** | The epoch of the package version (if any). | [optional]
**extension** | **str** | | [optional]
**filename** | **str** | | [optional]
+**filepath** | **str** | Full path to the file, including filename e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
**format** | **str** | | [optional]
**format_url** | **str** | | [optional]
diff --git a/bindings/python/src/docs/PackageResync.md b/bindings/python/src/docs/PackageResync.md
index 1664460e..82e29c67 100644
--- a/bindings/python/src/docs/PackageResync.md
+++ b/bindings/python/src/docs/PackageResync.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **int** | The epoch of the package version (if any). | [optional]
**extension** | **str** | | [optional]
**filename** | **str** | | [optional]
+**filepath** | **str** | Full path to the file, including filename e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
**format** | **str** | | [optional]
**format_url** | **str** | | [optional]
diff --git a/bindings/python/src/docs/PackageTag.md b/bindings/python/src/docs/PackageTag.md
index 24d508dd..e01323e3 100644
--- a/bindings/python/src/docs/PackageTag.md
+++ b/bindings/python/src/docs/PackageTag.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **int** | The epoch of the package version (if any). | [optional]
**extension** | **str** | | [optional]
**filename** | **str** | | [optional]
+**filepath** | **str** | Full path to the file, including filename e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**list[PackageFile]**](PackageFile.md) | | [optional]
**format** | **str** | | [optional]
**format_url** | **str** | | [optional]
diff --git a/bindings/python/src/docs/PackagesApi.md b/bindings/python/src/docs/PackagesApi.md
index 4db668bf..7bf947db 100644
--- a/bindings/python/src/docs/PackagesApi.md
+++ b/bindings/python/src/docs/PackagesApi.md
@@ -27,6 +27,7 @@ Method | HTTP request | Description
[**packages_upload_dart**](PackagesApi.md#packages_upload_dart) | **POST** /packages/{owner}/{repo}/upload/dart/ | Create a new Dart package
[**packages_upload_deb**](PackagesApi.md#packages_upload_deb) | **POST** /packages/{owner}/{repo}/upload/deb/ | Create a new Debian package
[**packages_upload_docker**](PackagesApi.md#packages_upload_docker) | **POST** /packages/{owner}/{repo}/upload/docker/ | Create a new Docker package
+[**packages_upload_generic**](PackagesApi.md#packages_upload_generic) | **POST** /packages/{owner}/{repo}/upload/generic/ | Create a new Generic package
[**packages_upload_go**](PackagesApi.md#packages_upload_go) | **POST** /packages/{owner}/{repo}/upload/go/ | Create a new Go package
[**packages_upload_helm**](PackagesApi.md#packages_upload_helm) | **POST** /packages/{owner}/{repo}/upload/helm/ | Create a new Helm package
[**packages_upload_hex**](PackagesApi.md#packages_upload_hex) | **POST** /packages/{owner}/{repo}/upload/hex/ | Create a new Hex package
@@ -53,6 +54,7 @@ Method | HTTP request | Description
[**packages_validate_upload_dart**](PackagesApi.md#packages_validate_upload_dart) | **POST** /packages/{owner}/{repo}/validate-upload/dart/ | Validate parameters for create Dart package
[**packages_validate_upload_deb**](PackagesApi.md#packages_validate_upload_deb) | **POST** /packages/{owner}/{repo}/validate-upload/deb/ | Validate parameters for create Debian package
[**packages_validate_upload_docker**](PackagesApi.md#packages_validate_upload_docker) | **POST** /packages/{owner}/{repo}/validate-upload/docker/ | Validate parameters for create Docker package
+[**packages_validate_upload_generic**](PackagesApi.md#packages_validate_upload_generic) | **POST** /packages/{owner}/{repo}/validate-upload/generic/ | Validate parameters for create Generic package
[**packages_validate_upload_go**](PackagesApi.md#packages_validate_upload_go) | **POST** /packages/{owner}/{repo}/validate-upload/go/ | Validate parameters for create Go package
[**packages_validate_upload_helm**](PackagesApi.md#packages_validate_upload_helm) | **POST** /packages/{owner}/{repo}/validate-upload/helm/ | Validate parameters for create Helm package
[**packages_validate_upload_hex**](PackagesApi.md#packages_validate_upload_hex) | **POST** /packages/{owner}/{repo}/validate-upload/hex/ | Validate parameters for create Hex package
@@ -1522,6 +1524,68 @@ Name | Type | Description | Notes
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+# **packages_upload_generic**
+> GenericPackageUpload packages_upload_generic(owner, repo, data=data)
+
+Create a new Generic package
+
+Create a new Generic package
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.PackagesApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+repo = 'repo_example' # str |
+data = cloudsmith_api.GenericPackageUploadRequest() # GenericPackageUploadRequest | (optional)
+
+try:
+ # Create a new Generic package
+ api_response = api_instance.packages_upload_generic(owner, repo, data=data)
+ pprint(api_response)
+except ApiException as e:
+ print("Exception when calling PackagesApi->packages_upload_generic: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **repo** | **str**| |
+ **data** | [**GenericPackageUploadRequest**](GenericPackageUploadRequest.md)| | [optional]
+
+### Return type
+
+[**GenericPackageUpload**](GenericPackageUpload.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
# **packages_upload_go**
> GoPackageUpload packages_upload_go(owner, repo, data=data)
@@ -3124,6 +3188,67 @@ void (empty response body)
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+# **packages_validate_upload_generic**
+> packages_validate_upload_generic(owner, repo, data=data)
+
+Validate parameters for create Generic package
+
+Validate parameters for create Generic package
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.PackagesApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+repo = 'repo_example' # str |
+data = cloudsmith_api.GenericPackageUploadRequest() # GenericPackageUploadRequest | (optional)
+
+try:
+ # Validate parameters for create Generic package
+ api_instance.packages_validate_upload_generic(owner, repo, data=data)
+except ApiException as e:
+ print("Exception when calling PackagesApi->packages_validate_upload_generic: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **repo** | **str**| |
+ **data** | [**GenericPackageUploadRequest**](GenericPackageUploadRequest.md)| | [optional]
+
+### Return type
+
+void (empty response body)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
# **packages_validate_upload_go**
> packages_validate_upload_go(owner, repo, data=data)
diff --git a/bindings/python/src/docs/ReposApi.md b/bindings/python/src/docs/ReposApi.md
index f9191585..2285799c 100644
--- a/bindings/python/src/docs/ReposApi.md
+++ b/bindings/python/src/docs/ReposApi.md
@@ -73,6 +73,12 @@ Method | HTTP request | Description
[**repos_upstream_docker_partial_update**](ReposApi.md#repos_upstream_docker_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Partially update a Docker upstream config for this repository.
[**repos_upstream_docker_read**](ReposApi.md#repos_upstream_docker_read) | **GET** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Retrieve a Docker upstream config for this repository.
[**repos_upstream_docker_update**](ReposApi.md#repos_upstream_docker_update) | **PUT** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Update a Docker upstream config for this repository.
+[**repos_upstream_generic_create**](ReposApi.md#repos_upstream_generic_create) | **POST** /repos/{owner}/{identifier}/upstream/generic/ | Create a Generic upstream config for this repository.
+[**repos_upstream_generic_delete**](ReposApi.md#repos_upstream_generic_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Delete a Generic upstream config for this repository.
+[**repos_upstream_generic_list**](ReposApi.md#repos_upstream_generic_list) | **GET** /repos/{owner}/{identifier}/upstream/generic/ | List Generic upstream configs for this repository.
+[**repos_upstream_generic_partial_update**](ReposApi.md#repos_upstream_generic_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Partially update a Generic upstream config for this repository.
+[**repos_upstream_generic_read**](ReposApi.md#repos_upstream_generic_read) | **GET** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Retrieve a Generic upstream config for this repository.
+[**repos_upstream_generic_update**](ReposApi.md#repos_upstream_generic_update) | **PUT** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Update a Generic upstream config for this repository.
[**repos_upstream_go_create**](ReposApi.md#repos_upstream_go_create) | **POST** /repos/{owner}/{identifier}/upstream/go/ | Create a Go upstream config for this repository.
[**repos_upstream_go_delete**](ReposApi.md#repos_upstream_go_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/go/{slug_perm}/ | Delete a Go upstream config for this repository.
[**repos_upstream_go_list**](ReposApi.md#repos_upstream_go_list) | **GET** /repos/{owner}/{identifier}/upstream/go/ | List Go upstream configs for this repository.
@@ -4429,6 +4435,383 @@ Name | Type | Description | Notes
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+# **repos_upstream_generic_create**
+> GenericUpstream repos_upstream_generic_create(owner, identifier, data=data)
+
+Create a Generic upstream config for this repository.
+
+Create a Generic upstream config for this repository.
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+identifier = 'identifier_example' # str |
+data = cloudsmith_api.GenericUpstreamRequest() # GenericUpstreamRequest | (optional)
+
+try:
+ # Create a Generic upstream config for this repository.
+ api_response = api_instance.repos_upstream_generic_create(owner, identifier, data=data)
+ pprint(api_response)
+except ApiException as e:
+ print("Exception when calling ReposApi->repos_upstream_generic_create: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **identifier** | **str**| |
+ **data** | [**GenericUpstreamRequest**](GenericUpstreamRequest.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
+# **repos_upstream_generic_delete**
+> repos_upstream_generic_delete(owner, identifier, slug_perm)
+
+Delete a Generic upstream config for this repository.
+
+Delete a Generic upstream config for this repository.
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+identifier = 'identifier_example' # str |
+slug_perm = 'slug_perm_example' # str |
+
+try:
+ # Delete a Generic upstream config for this repository.
+ api_instance.repos_upstream_generic_delete(owner, identifier, slug_perm)
+except ApiException as e:
+ print("Exception when calling ReposApi->repos_upstream_generic_delete: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **identifier** | **str**| |
+ **slug_perm** | **str**| |
+
+### Return type
+
+void (empty response body)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
+# **repos_upstream_generic_list**
+> list[GenericUpstream] repos_upstream_generic_list(owner, identifier, page=page, page_size=page_size)
+
+List Generic upstream configs for this repository.
+
+List Generic upstream configs for this repository.
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+identifier = 'identifier_example' # str |
+page = 56 # int | A page number within the paginated result set. (optional)
+page_size = 56 # int | Number of results to return per page. (optional)
+
+try:
+ # List Generic upstream configs for this repository.
+ api_response = api_instance.repos_upstream_generic_list(owner, identifier, page=page, page_size=page_size)
+ pprint(api_response)
+except ApiException as e:
+ print("Exception when calling ReposApi->repos_upstream_generic_list: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **identifier** | **str**| |
+ **page** | **int**| A page number within the paginated result set. | [optional]
+ **page_size** | **int**| Number of results to return per page. | [optional]
+
+### Return type
+
+[**list[GenericUpstream]**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
+# **repos_upstream_generic_partial_update**
+> GenericUpstream repos_upstream_generic_partial_update(owner, identifier, slug_perm, data=data)
+
+Partially update a Generic upstream config for this repository.
+
+Partially update a Generic upstream config for this repository.
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+identifier = 'identifier_example' # str |
+slug_perm = 'slug_perm_example' # str |
+data = cloudsmith_api.GenericUpstreamRequestPatch() # GenericUpstreamRequestPatch | (optional)
+
+try:
+ # Partially update a Generic upstream config for this repository.
+ api_response = api_instance.repos_upstream_generic_partial_update(owner, identifier, slug_perm, data=data)
+ pprint(api_response)
+except ApiException as e:
+ print("Exception when calling ReposApi->repos_upstream_generic_partial_update: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **identifier** | **str**| |
+ **slug_perm** | **str**| |
+ **data** | [**GenericUpstreamRequestPatch**](GenericUpstreamRequestPatch.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
+# **repos_upstream_generic_read**
+> GenericUpstream repos_upstream_generic_read(owner, identifier, slug_perm)
+
+Retrieve a Generic upstream config for this repository.
+
+Retrieve a Generic upstream config for this repository.
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+identifier = 'identifier_example' # str |
+slug_perm = 'slug_perm_example' # str |
+
+try:
+ # Retrieve a Generic upstream config for this repository.
+ api_response = api_instance.repos_upstream_generic_read(owner, identifier, slug_perm)
+ pprint(api_response)
+except ApiException as e:
+ print("Exception when calling ReposApi->repos_upstream_generic_read: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **identifier** | **str**| |
+ **slug_perm** | **str**| |
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
+# **repos_upstream_generic_update**
+> GenericUpstream repos_upstream_generic_update(owner, identifier, slug_perm, data=data)
+
+Update a Generic upstream config for this repository.
+
+Update a Generic upstream config for this repository.
+
+### Example
+```python
+from __future__ import print_function
+import time
+import cloudsmith_api
+from cloudsmith_api.rest import ApiException
+from pprint import pprint
+
+# Configure API key authorization: apikey
+configuration = cloudsmith_api.Configuration()
+configuration.api_key['X-Api-Key'] = 'YOUR_API_KEY'
+# Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
+# configuration.api_key_prefix['X-Api-Key'] = 'Bearer'
+# Configure HTTP basic authorization: basic
+configuration = cloudsmith_api.Configuration()
+configuration.username = 'YOUR_USERNAME'
+configuration.password = 'YOUR_PASSWORD'
+
+# create an instance of the API class
+api_instance = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(configuration))
+owner = 'owner_example' # str |
+identifier = 'identifier_example' # str |
+slug_perm = 'slug_perm_example' # str |
+data = cloudsmith_api.GenericUpstreamRequest() # GenericUpstreamRequest | (optional)
+
+try:
+ # Update a Generic upstream config for this repository.
+ api_response = api_instance.repos_upstream_generic_update(owner, identifier, slug_perm, data=data)
+ pprint(api_response)
+except ApiException as e:
+ print("Exception when calling ReposApi->repos_upstream_generic_update: %s\n" % e)
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **str**| |
+ **identifier** | **str**| |
+ **slug_perm** | **str**| |
+ **data** | [**GenericUpstreamRequest**](GenericUpstreamRequest.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
# **repos_upstream_go_create**
> GoUpstream repos_upstream_go_create(owner, identifier, data=data)
diff --git a/bindings/python/src/test/test_generic_package_upload.py b/bindings/python/src/test/test_generic_package_upload.py
new file mode 100644
index 00000000..f60370a2
--- /dev/null
+++ b/bindings/python/src/test/test_generic_package_upload.py
@@ -0,0 +1,40 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+from __future__ import absolute_import
+
+import unittest
+
+import cloudsmith_api
+from cloudsmith_api.models.generic_package_upload import GenericPackageUpload # noqa: E501
+from cloudsmith_api.rest import ApiException
+
+
+class TestGenericPackageUpload(unittest.TestCase):
+ """GenericPackageUpload unit test stubs"""
+
+ def setUp(self):
+ pass
+
+ def tearDown(self):
+ pass
+
+ def testGenericPackageUpload(self):
+ """Test GenericPackageUpload"""
+ # FIXME: construct object with mandatory attributes with example values
+ # model = cloudsmith_api.models.generic_package_upload.GenericPackageUpload() # noqa: E501
+ pass
+
+
+if __name__ == '__main__':
+ unittest.main()
diff --git a/bindings/python/src/test/test_generic_package_upload_request.py b/bindings/python/src/test/test_generic_package_upload_request.py
new file mode 100644
index 00000000..e051e72d
--- /dev/null
+++ b/bindings/python/src/test/test_generic_package_upload_request.py
@@ -0,0 +1,40 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+from __future__ import absolute_import
+
+import unittest
+
+import cloudsmith_api
+from cloudsmith_api.models.generic_package_upload_request import GenericPackageUploadRequest # noqa: E501
+from cloudsmith_api.rest import ApiException
+
+
+class TestGenericPackageUploadRequest(unittest.TestCase):
+ """GenericPackageUploadRequest unit test stubs"""
+
+ def setUp(self):
+ pass
+
+ def tearDown(self):
+ pass
+
+ def testGenericPackageUploadRequest(self):
+ """Test GenericPackageUploadRequest"""
+ # FIXME: construct object with mandatory attributes with example values
+ # model = cloudsmith_api.models.generic_package_upload_request.GenericPackageUploadRequest() # noqa: E501
+ pass
+
+
+if __name__ == '__main__':
+ unittest.main()
diff --git a/bindings/python/src/test/test_generic_upstream.py b/bindings/python/src/test/test_generic_upstream.py
new file mode 100644
index 00000000..8fcc5851
--- /dev/null
+++ b/bindings/python/src/test/test_generic_upstream.py
@@ -0,0 +1,40 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+from __future__ import absolute_import
+
+import unittest
+
+import cloudsmith_api
+from cloudsmith_api.models.generic_upstream import GenericUpstream # noqa: E501
+from cloudsmith_api.rest import ApiException
+
+
+class TestGenericUpstream(unittest.TestCase):
+ """GenericUpstream unit test stubs"""
+
+ def setUp(self):
+ pass
+
+ def tearDown(self):
+ pass
+
+ def testGenericUpstream(self):
+ """Test GenericUpstream"""
+ # FIXME: construct object with mandatory attributes with example values
+ # model = cloudsmith_api.models.generic_upstream.GenericUpstream() # noqa: E501
+ pass
+
+
+if __name__ == '__main__':
+ unittest.main()
diff --git a/bindings/python/src/test/test_generic_upstream_request.py b/bindings/python/src/test/test_generic_upstream_request.py
new file mode 100644
index 00000000..483716ba
--- /dev/null
+++ b/bindings/python/src/test/test_generic_upstream_request.py
@@ -0,0 +1,40 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+from __future__ import absolute_import
+
+import unittest
+
+import cloudsmith_api
+from cloudsmith_api.models.generic_upstream_request import GenericUpstreamRequest # noqa: E501
+from cloudsmith_api.rest import ApiException
+
+
+class TestGenericUpstreamRequest(unittest.TestCase):
+ """GenericUpstreamRequest unit test stubs"""
+
+ def setUp(self):
+ pass
+
+ def tearDown(self):
+ pass
+
+ def testGenericUpstreamRequest(self):
+ """Test GenericUpstreamRequest"""
+ # FIXME: construct object with mandatory attributes with example values
+ # model = cloudsmith_api.models.generic_upstream_request.GenericUpstreamRequest() # noqa: E501
+ pass
+
+
+if __name__ == '__main__':
+ unittest.main()
diff --git a/bindings/python/src/test/test_generic_upstream_request_patch.py b/bindings/python/src/test/test_generic_upstream_request_patch.py
new file mode 100644
index 00000000..8ad80f74
--- /dev/null
+++ b/bindings/python/src/test/test_generic_upstream_request_patch.py
@@ -0,0 +1,40 @@
+# coding: utf-8
+
+"""
+ Cloudsmith API (v1)
+
+ The API to the Cloudsmith Service # noqa: E501
+
+ OpenAPI spec version: v1
+ Contact: support@cloudsmith.io
+ Generated by: https://github.com/swagger-api/swagger-codegen.git
+"""
+
+
+from __future__ import absolute_import
+
+import unittest
+
+import cloudsmith_api
+from cloudsmith_api.models.generic_upstream_request_patch import GenericUpstreamRequestPatch # noqa: E501
+from cloudsmith_api.rest import ApiException
+
+
+class TestGenericUpstreamRequestPatch(unittest.TestCase):
+ """GenericUpstreamRequestPatch unit test stubs"""
+
+ def setUp(self):
+ pass
+
+ def tearDown(self):
+ pass
+
+ def testGenericUpstreamRequestPatch(self):
+ """Test GenericUpstreamRequestPatch"""
+ # FIXME: construct object with mandatory attributes with example values
+ # model = cloudsmith_api.models.generic_upstream_request_patch.GenericUpstreamRequestPatch() # noqa: E501
+ pass
+
+
+if __name__ == '__main__':
+ unittest.main()
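The request-patch model differs from the full request only in that every property is optional, so a test could exercise a sparse update. A short sketch, again assuming the Python model mirrors GenericUpstreamRequestPatch.md from this diff:

```python
# Sparse patch payload: every attribute on the *_RequestPatch model is optional.
from cloudsmith_api.models.generic_upstream_request_patch import GenericUpstreamRequestPatch

patch = GenericUpstreamRequestPatch(
    is_active=False,  # e.g. temporarily disable the upstream
    priority=5,       # resolution order among upstream sources
)
print(patch.to_dict())
```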
diff --git a/bindings/python/src/test/test_packages_api.py b/bindings/python/src/test/test_packages_api.py
index 26baedb5..2d0aa768 100644
--- a/bindings/python/src/test/test_packages_api.py
+++ b/bindings/python/src/test/test_packages_api.py
@@ -190,6 +190,13 @@ def test_packages_upload_docker(self):
"""
pass
+ def test_packages_upload_generic(self):
+ """Test case for packages_upload_generic
+
+ Create a new Generic package # noqa: E501
+ """
+ pass
+
def test_packages_upload_go(self):
"""Test case for packages_upload_go
@@ -372,6 +379,13 @@ def test_packages_validate_upload_docker(self):
"""
pass
+ def test_packages_validate_upload_generic(self):
+ """Test case for packages_validate_upload_generic
+
+ Validate parameters for create Generic package # noqa: E501
+ """
+ pass
+
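The stubs added above only record that the new endpoints exist. For reference, a hedged sketch of calling them through the Python bindings, assuming the usual swagger-codegen client layout (`Configuration`, `ApiClient`, a `data=` keyword argument) and the upload/generic and validate-upload/generic endpoints added by this change; credentials and values are placeholders:

```python
# Sketch only: placeholder credentials/values, and the method signatures are assumed
# to follow the same swagger-codegen pattern as the other packages_upload_* methods.
import cloudsmith_api
from cloudsmith_api.models.generic_package_upload_request import GenericPackageUploadRequest
from cloudsmith_api.rest import ApiException

config = cloudsmith_api.Configuration()
config.api_key["X-Api-Key"] = "YOUR API KEY"
api = cloudsmith_api.PackagesApi(cloudsmith_api.ApiClient(config))

data = GenericPackageUploadRequest(
    filepath="bin/utils/tool.tar.gz",
    package_file="<identifier of a previously uploaded file>",
)

try:
    # Validate the parameters first, then create the package.
    api.packages_validate_upload_generic("OWNER", "REPO", data=data)
    package = api.packages_upload_generic("OWNER", "REPO", data=data)
    print(package.slug_perm)
except ApiException as e:
    print("API error: %s" % e)
```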
def test_packages_validate_upload_go(self):
"""Test case for packages_validate_upload_go
diff --git a/bindings/python/src/test/test_repos_api.py b/bindings/python/src/test/test_repos_api.py
index 7d71fe1c..c5c82de2 100644
--- a/bindings/python/src/test/test_repos_api.py
+++ b/bindings/python/src/test/test_repos_api.py
@@ -512,6 +512,48 @@ def test_repos_upstream_docker_update(self):
"""
pass
+ def test_repos_upstream_generic_create(self):
+ """Test case for repos_upstream_generic_create
+
+ Create a Generic upstream config for this repository. # noqa: E501
+ """
+ pass
+
+ def test_repos_upstream_generic_delete(self):
+ """Test case for repos_upstream_generic_delete
+
+ Delete a Generic upstream config for this repository. # noqa: E501
+ """
+ pass
+
+ def test_repos_upstream_generic_list(self):
+ """Test case for repos_upstream_generic_list
+
+ List Generic upstream configs for this repository. # noqa: E501
+ """
+ pass
+
+ def test_repos_upstream_generic_partial_update(self):
+ """Test case for repos_upstream_generic_partial_update
+
+ Partially update a Generic upstream config for this repository. # noqa: E501
+ """
+ pass
+
+ def test_repos_upstream_generic_read(self):
+ """Test case for repos_upstream_generic_read
+
+ Retrieve a Generic upstream config for this repository. # noqa: E501
+ """
+ pass
+
+ def test_repos_upstream_generic_update(self):
+ """Test case for repos_upstream_generic_update
+
+ Update a Generic upstream config for this repository. # noqa: E501
+ """
+ pass
+
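Likewise for the six new upstream stubs: a sketch of a create/list/delete round trip, using the required `name` and `upstream_url` fields plus the `upstream_prefix` described in GenericUpstreamRequest.md in this diff. The owner/identifier arguments and all values are placeholders, and the signatures are assumed to match the existing repos_upstream_* methods:

```python
# Sketch only: placeholder values; signatures assumed to mirror the other
# repos_upstream_* methods generated by swagger-codegen.
import cloudsmith_api
from cloudsmith_api.models.generic_upstream_request import GenericUpstreamRequest

config = cloudsmith_api.Configuration()
config.api_key["X-Api-Key"] = "YOUR API KEY"
repos = cloudsmith_api.ReposApi(cloudsmith_api.ApiClient(config))

request = GenericUpstreamRequest(
    name="Example file server",                           # required
    upstream_url="https://files.example.com/artifacts/",  # required
    upstream_prefix="example-files",                      # keeps this source distinct
)

created = repos.repos_upstream_generic_create("OWNER", "REPO-IDENTIFIER", data=request)
for upstream in repos.repos_upstream_generic_list("OWNER", "REPO-IDENTIFIER"):
    print(upstream.slug_perm, upstream.name)
repos.repos_upstream_generic_delete("OWNER", "REPO-IDENTIFIER", created.slug_perm)
```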
def test_repos_upstream_go_create(self):
"""Test case for repos_upstream_go_create
diff --git a/bindings/ruby/src/README.md b/bindings/ruby/src/README.md
index 1a88bd2a..03319d6d 100644
--- a/bindings/ruby/src/README.md
+++ b/bindings/ruby/src/README.md
@@ -217,6 +217,7 @@ Class | Method | HTTP request | Description
*CloudsmithApi::PackagesApi* | [**packages_upload_dart**](docs/PackagesApi.md#packages_upload_dart) | **POST** /packages/{owner}/{repo}/upload/dart/ | Create a new Dart package
*CloudsmithApi::PackagesApi* | [**packages_upload_deb**](docs/PackagesApi.md#packages_upload_deb) | **POST** /packages/{owner}/{repo}/upload/deb/ | Create a new Debian package
*CloudsmithApi::PackagesApi* | [**packages_upload_docker**](docs/PackagesApi.md#packages_upload_docker) | **POST** /packages/{owner}/{repo}/upload/docker/ | Create a new Docker package
+*CloudsmithApi::PackagesApi* | [**packages_upload_generic**](docs/PackagesApi.md#packages_upload_generic) | **POST** /packages/{owner}/{repo}/upload/generic/ | Create a new Generic package
*CloudsmithApi::PackagesApi* | [**packages_upload_go**](docs/PackagesApi.md#packages_upload_go) | **POST** /packages/{owner}/{repo}/upload/go/ | Create a new Go package
*CloudsmithApi::PackagesApi* | [**packages_upload_helm**](docs/PackagesApi.md#packages_upload_helm) | **POST** /packages/{owner}/{repo}/upload/helm/ | Create a new Helm package
*CloudsmithApi::PackagesApi* | [**packages_upload_hex**](docs/PackagesApi.md#packages_upload_hex) | **POST** /packages/{owner}/{repo}/upload/hex/ | Create a new Hex package
@@ -243,6 +244,7 @@ Class | Method | HTTP request | Description
*CloudsmithApi::PackagesApi* | [**packages_validate_upload_dart**](docs/PackagesApi.md#packages_validate_upload_dart) | **POST** /packages/{owner}/{repo}/validate-upload/dart/ | Validate parameters for create Dart package
*CloudsmithApi::PackagesApi* | [**packages_validate_upload_deb**](docs/PackagesApi.md#packages_validate_upload_deb) | **POST** /packages/{owner}/{repo}/validate-upload/deb/ | Validate parameters for create Debian package
*CloudsmithApi::PackagesApi* | [**packages_validate_upload_docker**](docs/PackagesApi.md#packages_validate_upload_docker) | **POST** /packages/{owner}/{repo}/validate-upload/docker/ | Validate parameters for create Docker package
+*CloudsmithApi::PackagesApi* | [**packages_validate_upload_generic**](docs/PackagesApi.md#packages_validate_upload_generic) | **POST** /packages/{owner}/{repo}/validate-upload/generic/ | Validate parameters for create Generic package
*CloudsmithApi::PackagesApi* | [**packages_validate_upload_go**](docs/PackagesApi.md#packages_validate_upload_go) | **POST** /packages/{owner}/{repo}/validate-upload/go/ | Validate parameters for create Go package
*CloudsmithApi::PackagesApi* | [**packages_validate_upload_helm**](docs/PackagesApi.md#packages_validate_upload_helm) | **POST** /packages/{owner}/{repo}/validate-upload/helm/ | Validate parameters for create Helm package
*CloudsmithApi::PackagesApi* | [**packages_validate_upload_hex**](docs/PackagesApi.md#packages_validate_upload_hex) | **POST** /packages/{owner}/{repo}/validate-upload/hex/ | Validate parameters for create Hex package
@@ -333,6 +335,12 @@ Class | Method | HTTP request | Description
*CloudsmithApi::ReposApi* | [**repos_upstream_docker_partial_update**](docs/ReposApi.md#repos_upstream_docker_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Partially update a Docker upstream config for this repository.
*CloudsmithApi::ReposApi* | [**repos_upstream_docker_read**](docs/ReposApi.md#repos_upstream_docker_read) | **GET** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Retrieve a Docker upstream config for this repository.
*CloudsmithApi::ReposApi* | [**repos_upstream_docker_update**](docs/ReposApi.md#repos_upstream_docker_update) | **PUT** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Update a Docker upstream config for this repository.
+*CloudsmithApi::ReposApi* | [**repos_upstream_generic_create**](docs/ReposApi.md#repos_upstream_generic_create) | **POST** /repos/{owner}/{identifier}/upstream/generic/ | Create a Generic upstream config for this repository.
+*CloudsmithApi::ReposApi* | [**repos_upstream_generic_delete**](docs/ReposApi.md#repos_upstream_generic_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Delete a Generic upstream config for this repository.
+*CloudsmithApi::ReposApi* | [**repos_upstream_generic_list**](docs/ReposApi.md#repos_upstream_generic_list) | **GET** /repos/{owner}/{identifier}/upstream/generic/ | List Generic upstream configs for this repository.
+*CloudsmithApi::ReposApi* | [**repos_upstream_generic_partial_update**](docs/ReposApi.md#repos_upstream_generic_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Partially update a Generic upstream config for this repository.
+*CloudsmithApi::ReposApi* | [**repos_upstream_generic_read**](docs/ReposApi.md#repos_upstream_generic_read) | **GET** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Retrieve a Generic upstream config for this repository.
+*CloudsmithApi::ReposApi* | [**repos_upstream_generic_update**](docs/ReposApi.md#repos_upstream_generic_update) | **PUT** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Update a Generic upstream config for this repository.
*CloudsmithApi::ReposApi* | [**repos_upstream_go_create**](docs/ReposApi.md#repos_upstream_go_create) | **POST** /repos/{owner}/{identifier}/upstream/go/ | Create a Go upstream config for this repository.
*CloudsmithApi::ReposApi* | [**repos_upstream_go_delete**](docs/ReposApi.md#repos_upstream_go_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/go/{slug_perm}/ | Delete a Go upstream config for this repository.
*CloudsmithApi::ReposApi* | [**repos_upstream_go_list**](docs/ReposApi.md#repos_upstream_go_list) | **GET** /repos/{owner}/{identifier}/upstream/go/ | List Go upstream configs for this repository.
@@ -485,6 +493,11 @@ Class | Method | HTTP request | Description
- [CloudsmithApi::Format](docs/Format.md)
- [CloudsmithApi::FormatSupport](docs/FormatSupport.md)
- [CloudsmithApi::FormatSupportUpstream](docs/FormatSupportUpstream.md)
+ - [CloudsmithApi::GenericPackageUpload](docs/GenericPackageUpload.md)
+ - [CloudsmithApi::GenericPackageUploadRequest](docs/GenericPackageUploadRequest.md)
+ - [CloudsmithApi::GenericUpstream](docs/GenericUpstream.md)
+ - [CloudsmithApi::GenericUpstreamRequest](docs/GenericUpstreamRequest.md)
+ - [CloudsmithApi::GenericUpstreamRequestPatch](docs/GenericUpstreamRequestPatch.md)
- [CloudsmithApi::GeoIpLocation](docs/GeoIpLocation.md)
- [CloudsmithApi::GoPackageUpload](docs/GoPackageUpload.md)
- [CloudsmithApi::GoPackageUploadRequest](docs/GoPackageUploadRequest.md)
diff --git a/bindings/ruby/src/docs/FormatSupport.md b/bindings/ruby/src/docs/FormatSupport.md
index 1d76d096..5572653d 100644
--- a/bindings/ruby/src/docs/FormatSupport.md
+++ b/bindings/ruby/src/docs/FormatSupport.md
@@ -6,6 +6,7 @@ Name | Type | Description | Notes
**dependencies** | **BOOLEAN** | If true the package format supports dependencies |
**distributions** | **BOOLEAN** | If true the package format supports distributions |
**file_lists** | **BOOLEAN** | If true the package format supports file lists |
+**filepaths** | **BOOLEAN** | If true the package format supports filepaths |
**metadata** | **BOOLEAN** | If true the package format supports metadata |
**upstreams** | [**FormatSupportUpstream**](FormatSupportUpstream.md) | |
**versioning** | **BOOLEAN** | If true the package format supports versioning |
diff --git a/bindings/ruby/src/docs/GenericPackageUpload.md b/bindings/ruby/src/docs/GenericPackageUpload.md
new file mode 100644
index 00000000..05e6fe32
--- /dev/null
+++ b/bindings/ruby/src/docs/GenericPackageUpload.md
@@ -0,0 +1,90 @@
+# CloudsmithApi::GenericPackageUpload
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**architectures** | [**Array<Architecture>**](Architecture.md) | | [optional]
+**cdn_url** | **String** | | [optional]
+**checksum_md5** | **String** | | [optional]
+**checksum_sha1** | **String** | | [optional]
+**checksum_sha256** | **String** | | [optional]
+**checksum_sha512** | **String** | | [optional]
+**dependencies_checksum_md5** | **String** | A checksum of all of the package's dependencies. | [optional]
+**dependencies_url** | **String** | | [optional]
+**description** | **String** | A textual description of this package. | [optional]
+**display_name** | **String** | | [optional]
+**distro** | [**Distribution**](Distribution.md) | | [optional]
+**distro_version** | [**DistributionVersion**](DistributionVersion.md) | | [optional]
+**downloads** | **Integer** | | [optional]
+**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
+**extension** | **String** | | [optional]
+**filename** | **String** | | [optional]
+**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
+**format** | **String** | | [optional]
+**format_url** | **String** | | [optional]
+**freeable_storage** | **Integer** | Amount of storage that will be freed if this package is deleted | [optional]
+**fully_qualified_name** | **String** | | [optional]
+**identifier_perm** | **String** | Unique and permanent identifier for the package. | [optional]
+**identifiers** | **Hash<String, String>** | Return a map of identifier field names and their values. | [optional]
+**indexed** | **BOOLEAN** | | [optional]
+**is_cancellable** | **BOOLEAN** | | [optional]
+**is_copyable** | **BOOLEAN** | | [optional]
+**is_deleteable** | **BOOLEAN** | | [optional]
+**is_downloadable** | **BOOLEAN** | | [optional]
+**is_moveable** | **BOOLEAN** | | [optional]
+**is_quarantinable** | **BOOLEAN** | | [optional]
+**is_quarantined** | **BOOLEAN** | | [optional]
+**is_resyncable** | **BOOLEAN** | | [optional]
+**is_security_scannable** | **BOOLEAN** | | [optional]
+**is_sync_awaiting** | **BOOLEAN** | | [optional]
+**is_sync_completed** | **BOOLEAN** | | [optional]
+**is_sync_failed** | **BOOLEAN** | | [optional]
+**is_sync_in_flight** | **BOOLEAN** | | [optional]
+**is_sync_in_progress** | **BOOLEAN** | | [optional]
+**license** | **String** | The license of this package. | [optional]
+**name** | **String** | The name of this package. | [optional]
+**namespace** | **String** | | [optional]
+**namespace_url** | **String** | | [optional]
+**num_files** | **Integer** | | [optional]
+**origin_repository** | **String** | | [optional]
+**origin_repository_url** | **String** | | [optional]
+**package_type** | **Integer** | The type of package contents. | [optional]
+**policy_violated** | **BOOLEAN** | Whether or not the package has violated any policy. | [optional]
+**raw_license** | **String** | The raw license string. | [optional]
+**release** | **String** | The release of the package version (if any). | [optional]
+**repository** | **String** | | [optional]
+**repository_url** | **String** | | [optional]
+**security_scan_completed_at** | **DateTime** | The datetime the security scanning was completed. | [optional]
+**security_scan_started_at** | **DateTime** | The datetime the security scanning was started. | [optional]
+**security_scan_status** | **String** | | [optional] [default to 'Awaiting Security Scan']
+**security_scan_status_updated_at** | **DateTime** | The datetime the security scanning status was updated. | [optional]
+**self_html_url** | **String** | | [optional]
+**self_url** | **String** | | [optional]
+**signature_url** | **String** | | [optional]
+**size** | **Integer** | The calculated size of the package. | [optional]
+**slug** | **String** | The public unique identifier for the package. | [optional]
+**slug_perm** | **String** | | [optional]
+**spdx_license** | **String** | The SPDX license identifier for this package. | [optional]
+**stage** | **Integer** | The synchronisation (in progress) stage of the package. | [optional]
+**stage_str** | **String** | | [optional]
+**stage_updated_at** | **DateTime** | The datetime the package stage was updated at. | [optional]
+**status** | **Integer** | The synchronisation status of the package. | [optional]
+**status_reason** | **String** | A textual description of the synchronisation status reason (if any). | [optional]
+**status_str** | **String** | | [optional]
+**status_updated_at** | **DateTime** | The datetime the package status was updated at. | [optional]
+**status_url** | **String** | | [optional]
+**subtype** | **String** | | [optional]
+**summary** | **String** | A one-liner synopsis of this package. | [optional]
+**sync_finished_at** | **DateTime** | The datetime the package sync was finished at. | [optional]
+**sync_progress** | **Integer** | Synchronisation progress (from 0-100) | [optional]
+**tags_automatic** | [**Tags**](Tags.md) | | [optional]
+**tags_immutable** | [**Tags**](Tags.md) | | [optional]
+**type_display** | **String** | | [optional]
+**uploaded_at** | **DateTime** | The date this package was uploaded. | [optional]
+**uploader** | **String** | | [optional]
+**uploader_url** | **String** | | [optional]
+**version** | **String** | The raw version for this package. | [optional]
+**version_orig** | **String** | | [optional]
+**vulnerability_scan_results_url** | **String** | | [optional]
+
+
diff --git a/bindings/ruby/src/docs/GenericPackageUploadRequest.md b/bindings/ruby/src/docs/GenericPackageUploadRequest.md
new file mode 100644
index 00000000..021d86df
--- /dev/null
+++ b/bindings/ruby/src/docs/GenericPackageUploadRequest.md
@@ -0,0 +1,13 @@
+# CloudsmithApi::GenericPackageUploadRequest
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**filepath** | **String** | The full filepath of the package including filename. |
+**name** | **String** | The name of this package. | [optional]
+**package_file** | **String** | The primary file for the package. |
+**republish** | **BOOLEAN** | If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate. | [optional]
+**tags** | **String** | A comma-separated values list of tags to add to the package. | [optional]
+**version** | **String** | The raw version for this package. | [optional]
+
+
diff --git a/bindings/ruby/src/docs/GenericUpstream.md b/bindings/ruby/src/docs/GenericUpstream.md
new file mode 100644
index 00000000..fa17a3b4
--- /dev/null
+++ b/bindings/ruby/src/docs/GenericUpstream.md
@@ -0,0 +1,33 @@
+# CloudsmithApi::GenericUpstream
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**auth_mode** | **String** | The authentication mode to use when accessing this upstream. | [optional] [default to 'None']
+**auth_secret** | **String** | Secret to provide with requests to upstream. | [optional]
+**auth_username** | **String** | Username to provide with requests to upstream. | [optional]
+**available** | **String** | | [optional]
+**can_reindex** | **String** | | [optional]
+**created_at** | **DateTime** | The datetime the upstream source was created. | [optional]
+**disable_reason** | **String** | | [optional] [default to 'N/A']
+**disable_reason_text** | **String** | Human-readable explanation of why this upstream is disabled | [optional]
+**extra_header_1** | **String** | The key for extra header #1 to send to upstream. | [optional]
+**extra_header_2** | **String** | The key for extra header #2 to send to upstream. | [optional]
+**extra_value_1** | **String** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extra_value_2** | **String** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**has_failed_signature_verification** | **String** | | [optional]
+**index_package_count** | **String** | The number of packages available in this upstream source | [optional]
+**index_status** | **String** | The current indexing status of this upstream source | [optional]
+**is_active** | **BOOLEAN** | Whether or not this upstream is active and ready for requests. | [optional]
+**last_indexed** | **String** | The last time this upstream source was indexed | [optional]
+**mode** | **String** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
+**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
+**pending_validation** | **BOOLEAN** | When true, this upstream source is pending validation. | [optional]
+**priority** | **Integer** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**slug_perm** | **String** | | [optional]
+**updated_at** | **DateTime** | | [optional]
+**upstream_prefix** | **String** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstream_url** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
+**verify_ssl** | **BOOLEAN** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+
diff --git a/bindings/ruby/src/docs/GenericUpstreamRequest.md b/bindings/ruby/src/docs/GenericUpstreamRequest.md
new file mode 100644
index 00000000..8b95bf40
--- /dev/null
+++ b/bindings/ruby/src/docs/GenericUpstreamRequest.md
@@ -0,0 +1,21 @@
+# CloudsmithApi::GenericUpstreamRequest
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**auth_mode** | **String** | The authentication mode to use when accessing this upstream. | [optional] [default to 'None']
+**auth_secret** | **String** | Secret to provide with requests to upstream. | [optional]
+**auth_username** | **String** | Username to provide with requests to upstream. | [optional]
+**extra_header_1** | **String** | The key for extra header #1 to send to upstream. | [optional]
+**extra_header_2** | **String** | The key for extra header #2 to send to upstream. | [optional]
+**extra_value_1** | **String** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extra_value_2** | **String** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**is_active** | **BOOLEAN** | Whether or not this upstream is active and ready for requests. | [optional]
+**mode** | **String** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
+**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
+**priority** | **Integer** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**upstream_prefix** | **String** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstream_url** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
+**verify_ssl** | **BOOLEAN** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+
diff --git a/bindings/ruby/src/docs/GenericUpstreamRequestPatch.md b/bindings/ruby/src/docs/GenericUpstreamRequestPatch.md
new file mode 100644
index 00000000..997b130a
--- /dev/null
+++ b/bindings/ruby/src/docs/GenericUpstreamRequestPatch.md
@@ -0,0 +1,21 @@
+# CloudsmithApi::GenericUpstreamRequestPatch
+
+## Properties
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**auth_mode** | **String** | The authentication mode to use when accessing this upstream. | [optional] [default to 'None']
+**auth_secret** | **String** | Secret to provide with requests to upstream. | [optional]
+**auth_username** | **String** | Username to provide with requests to upstream. | [optional]
+**extra_header_1** | **String** | The key for extra header #1 to send to upstream. | [optional]
+**extra_header_2** | **String** | The key for extra header #2 to send to upstream. | [optional]
+**extra_value_1** | **String** | The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**extra_value_2** | **String** | The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted. | [optional]
+**is_active** | **BOOLEAN** | Whether or not this upstream is active and ready for requests. | [optional]
+**mode** | **String** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
+**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. | [optional]
+**priority** | **Integer** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**upstream_prefix** | **String** | A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream. | [optional]
+**upstream_url** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. | [optional]
+**verify_ssl** | **BOOLEAN** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
+
+
diff --git a/bindings/ruby/src/docs/MavenUpstream.md b/bindings/ruby/src/docs/MavenUpstream.md
index dadb64ad..ec0cdb75 100644
--- a/bindings/ruby/src/docs/MavenUpstream.md
+++ b/bindings/ruby/src/docs/MavenUpstream.md
@@ -29,6 +29,7 @@ Name | Type | Description | Notes
**pending_validation** | **BOOLEAN** | When true, this upstream source is pending validation. | [optional]
**priority** | **Integer** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
**slug_perm** | **String** | | [optional]
+**trust_level** | **String** | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional] [default to 'Trusted']
**updated_at** | **DateTime** | | [optional]
**upstream_url** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
**verification_status** | **String** | The signature verification status for this upstream. | [optional] [default to 'Unknown']
diff --git a/bindings/ruby/src/docs/MavenUpstreamRequest.md b/bindings/ruby/src/docs/MavenUpstreamRequest.md
index 6603c844..ea83ed02 100644
--- a/bindings/ruby/src/docs/MavenUpstreamRequest.md
+++ b/bindings/ruby/src/docs/MavenUpstreamRequest.md
@@ -17,6 +17,7 @@ Name | Type | Description | Notes
**mode** | **String** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. |
**priority** | **Integer** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**trust_level** | **String** | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional] [default to 'Trusted']
**upstream_url** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. |
**verify_ssl** | **BOOLEAN** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
diff --git a/bindings/ruby/src/docs/MavenUpstreamRequestPatch.md b/bindings/ruby/src/docs/MavenUpstreamRequestPatch.md
index a02876ea..d2f9a438 100644
--- a/bindings/ruby/src/docs/MavenUpstreamRequestPatch.md
+++ b/bindings/ruby/src/docs/MavenUpstreamRequestPatch.md
@@ -17,6 +17,7 @@ Name | Type | Description | Notes
**mode** | **String** | The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode. | [optional] [default to 'Proxy Only']
**name** | **String** | A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream. | [optional]
**priority** | **Integer** | Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date. | [optional]
+**trust_level** | **String** | Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors. | [optional] [default to 'Trusted']
**upstream_url** | **String** | The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository. | [optional]
**verify_ssl** | **BOOLEAN** | If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams. | [optional]
diff --git a/bindings/ruby/src/docs/OrganizationTeam.md b/bindings/ruby/src/docs/OrganizationTeam.md
index 56c949db..3ea8ed2d 100644
--- a/bindings/ruby/src/docs/OrganizationTeam.md
+++ b/bindings/ruby/src/docs/OrganizationTeam.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **String** | | [optional]
+**description** | **String** | A detailed description of the team. | [optional]
**name** | **String** | A descriptive name for the team. |
**slug** | **String** | | [optional]
**slug_perm** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/OrganizationTeamRequest.md b/bindings/ruby/src/docs/OrganizationTeamRequest.md
index 8f7e3f1c..3c89314b 100644
--- a/bindings/ruby/src/docs/OrganizationTeamRequest.md
+++ b/bindings/ruby/src/docs/OrganizationTeamRequest.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **String** | | [optional]
+**description** | **String** | A detailed description of the team. | [optional]
**name** | **String** | A descriptive name for the team. |
**slug** | **String** | | [optional]
**visibility** | **String** | | [optional] [default to 'Visible']
diff --git a/bindings/ruby/src/docs/OrganizationTeamRequestPatch.md b/bindings/ruby/src/docs/OrganizationTeamRequestPatch.md
index 2e3cb083..0ba5a29f 100644
--- a/bindings/ruby/src/docs/OrganizationTeamRequestPatch.md
+++ b/bindings/ruby/src/docs/OrganizationTeamRequestPatch.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**description** | **String** | | [optional]
+**description** | **String** | A detailed description of the team. | [optional]
**name** | **String** | A descriptive name for the team. | [optional]
**slug** | **String** | | [optional]
**visibility** | **String** | | [optional] [default to 'Visible']
diff --git a/bindings/ruby/src/docs/Package.md b/bindings/ruby/src/docs/Package.md
index 1cbc4954..fdd7fb6c 100644
--- a/bindings/ruby/src/docs/Package.md
+++ b/bindings/ruby/src/docs/Package.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**format_url** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/PackageCopy.md b/bindings/ruby/src/docs/PackageCopy.md
index 6eaec515..d3cf2576 100644
--- a/bindings/ruby/src/docs/PackageCopy.md
+++ b/bindings/ruby/src/docs/PackageCopy.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**format_url** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/PackageCopyRequest.md b/bindings/ruby/src/docs/PackageCopyRequest.md
index dae0530c..a63fe6cb 100644
--- a/bindings/ruby/src/docs/PackageCopyRequest.md
+++ b/bindings/ruby/src/docs/PackageCopyRequest.md
@@ -3,7 +3,7 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**destination** | **String** | |
+**destination** | **String** | The name of the destination repository without the namespace. |
**republish** | **BOOLEAN** | If true, the package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate. | [optional]
diff --git a/bindings/ruby/src/docs/PackageMove.md b/bindings/ruby/src/docs/PackageMove.md
index a264ff99..b710683f 100644
--- a/bindings/ruby/src/docs/PackageMove.md
+++ b/bindings/ruby/src/docs/PackageMove.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**format_url** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/PackageMoveRequest.md b/bindings/ruby/src/docs/PackageMoveRequest.md
index 9945a245..47e829f0 100644
--- a/bindings/ruby/src/docs/PackageMoveRequest.md
+++ b/bindings/ruby/src/docs/PackageMoveRequest.md
@@ -3,6 +3,6 @@
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
-**destination** | **String** | |
+**destination** | **String** | The name of the destination repository without the namespace. |
diff --git a/bindings/ruby/src/docs/PackageQuarantine.md b/bindings/ruby/src/docs/PackageQuarantine.md
index 655bb6a0..c9a2b2b7 100644
--- a/bindings/ruby/src/docs/PackageQuarantine.md
+++ b/bindings/ruby/src/docs/PackageQuarantine.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**format_url** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/PackageResync.md b/bindings/ruby/src/docs/PackageResync.md
index 05a6a784..b5b8be46 100644
--- a/bindings/ruby/src/docs/PackageResync.md
+++ b/bindings/ruby/src/docs/PackageResync.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**format_url** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/PackageTag.md b/bindings/ruby/src/docs/PackageTag.md
index ebd32116..cf99ea58 100644
--- a/bindings/ruby/src/docs/PackageTag.md
+++ b/bindings/ruby/src/docs/PackageTag.md
@@ -19,6 +19,7 @@ Name | Type | Description | Notes
**epoch** | **Integer** | The epoch of the package version (if any). | [optional]
**extension** | **String** | | [optional]
**filename** | **String** | | [optional]
+**filepath** | **String** | Full path to the file, including the filename, e.g. bin/utils/tool.tar.gz | [optional]
**files** | [**Array<PackageFile>**](PackageFile.md) | | [optional]
**format** | **String** | | [optional]
**format_url** | **String** | | [optional]
diff --git a/bindings/ruby/src/docs/PackagesApi.md b/bindings/ruby/src/docs/PackagesApi.md
index 05f84f5b..959f0f80 100644
--- a/bindings/ruby/src/docs/PackagesApi.md
+++ b/bindings/ruby/src/docs/PackagesApi.md
@@ -27,6 +27,7 @@ Method | HTTP request | Description
[**packages_upload_dart**](PackagesApi.md#packages_upload_dart) | **POST** /packages/{owner}/{repo}/upload/dart/ | Create a new Dart package
[**packages_upload_deb**](PackagesApi.md#packages_upload_deb) | **POST** /packages/{owner}/{repo}/upload/deb/ | Create a new Debian package
[**packages_upload_docker**](PackagesApi.md#packages_upload_docker) | **POST** /packages/{owner}/{repo}/upload/docker/ | Create a new Docker package
+[**packages_upload_generic**](PackagesApi.md#packages_upload_generic) | **POST** /packages/{owner}/{repo}/upload/generic/ | Create a new Generic package
[**packages_upload_go**](PackagesApi.md#packages_upload_go) | **POST** /packages/{owner}/{repo}/upload/go/ | Create a new Go package
[**packages_upload_helm**](PackagesApi.md#packages_upload_helm) | **POST** /packages/{owner}/{repo}/upload/helm/ | Create a new Helm package
[**packages_upload_hex**](PackagesApi.md#packages_upload_hex) | **POST** /packages/{owner}/{repo}/upload/hex/ | Create a new Hex package
@@ -53,6 +54,7 @@ Method | HTTP request | Description
[**packages_validate_upload_dart**](PackagesApi.md#packages_validate_upload_dart) | **POST** /packages/{owner}/{repo}/validate-upload/dart/ | Validate parameters for create Dart package
[**packages_validate_upload_deb**](PackagesApi.md#packages_validate_upload_deb) | **POST** /packages/{owner}/{repo}/validate-upload/deb/ | Validate parameters for create Debian package
[**packages_validate_upload_docker**](PackagesApi.md#packages_validate_upload_docker) | **POST** /packages/{owner}/{repo}/validate-upload/docker/ | Validate parameters for create Docker package
+[**packages_validate_upload_generic**](PackagesApi.md#packages_validate_upload_generic) | **POST** /packages/{owner}/{repo}/validate-upload/generic/ | Validate parameters for create Generic package
[**packages_validate_upload_go**](PackagesApi.md#packages_validate_upload_go) | **POST** /packages/{owner}/{repo}/validate-upload/go/ | Validate parameters for create Go package
[**packages_validate_upload_helm**](PackagesApi.md#packages_validate_upload_helm) | **POST** /packages/{owner}/{repo}/validate-upload/helm/ | Validate parameters for create Helm package
[**packages_validate_upload_hex**](PackagesApi.md#packages_validate_upload_hex) | **POST** /packages/{owner}/{repo}/validate-upload/hex/ | Validate parameters for create Hex package
@@ -1590,6 +1592,71 @@ Name | Type | Description | Notes
+# **packages_upload_generic**
+> GenericPackageUpload packages_upload_generic(owner, repo, opts)
+
+Create a new Generic package
+
+Create a new Generic package
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::PackagesApi.new
+
+owner = 'owner_example' # String |
+
+repo = 'repo_example' # String |
+
+opts = {
+ data: CloudsmithApi::GenericPackageUploadRequest.new # GenericPackageUploadRequest |
+}
+
+begin
+ #Create a new Generic package
+ result = api_instance.packages_upload_generic(owner, repo, opts)
+ p result
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling PackagesApi->packages_upload_generic: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **repo** | **String**| |
+ **data** | [**GenericPackageUploadRequest**](GenericPackageUploadRequest.md)| | [optional]
+
+### Return type
+
+[**GenericPackageUpload**](GenericPackageUpload.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
# **packages_upload_go**
> GoPackageUpload packages_upload_go(owner, repo, opts)
@@ -3270,6 +3337,70 @@ nil (empty response body)
+# **packages_validate_upload_generic**
+> packages_validate_upload_generic(owner, repo, opts)
+
+Validate parameters for create Generic package
+
+Validate parameters for create Generic package
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::PackagesApi.new
+
+owner = 'owner_example' # String |
+
+repo = 'repo_example' # String |
+
+opts = {
+ data: CloudsmithApi::GenericPackageUploadRequest.new # GenericPackageUploadRequest |
+}
+
+begin
+ #Validate parameters for create Generic package
+ api_instance.packages_validate_upload_generic(owner, repo, opts)
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling PackagesApi->packages_validate_upload_generic: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **repo** | **String**| |
+ **data** | [**GenericPackageUploadRequest**](GenericPackageUploadRequest.md)| | [optional]
+
+### Return type
+
+nil (empty response body)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
# **packages_validate_upload_go**
> packages_validate_upload_go(owner, repo, opts)
diff --git a/bindings/ruby/src/docs/ReposApi.md b/bindings/ruby/src/docs/ReposApi.md
index 499e74c0..4b4f88c7 100644
--- a/bindings/ruby/src/docs/ReposApi.md
+++ b/bindings/ruby/src/docs/ReposApi.md
@@ -73,6 +73,12 @@ Method | HTTP request | Description
[**repos_upstream_docker_partial_update**](ReposApi.md#repos_upstream_docker_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Partially update a Docker upstream config for this repository.
[**repos_upstream_docker_read**](ReposApi.md#repos_upstream_docker_read) | **GET** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Retrieve a Docker upstream config for this repository.
[**repos_upstream_docker_update**](ReposApi.md#repos_upstream_docker_update) | **PUT** /repos/{owner}/{identifier}/upstream/docker/{slug_perm}/ | Update a Docker upstream config for this repository.
+[**repos_upstream_generic_create**](ReposApi.md#repos_upstream_generic_create) | **POST** /repos/{owner}/{identifier}/upstream/generic/ | Create a Generic upstream config for this repository.
+[**repos_upstream_generic_delete**](ReposApi.md#repos_upstream_generic_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Delete a Generic upstream config for this repository.
+[**repos_upstream_generic_list**](ReposApi.md#repos_upstream_generic_list) | **GET** /repos/{owner}/{identifier}/upstream/generic/ | List Generic upstream configs for this repository.
+[**repos_upstream_generic_partial_update**](ReposApi.md#repos_upstream_generic_partial_update) | **PATCH** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Partially update a Generic upstream config for this repository.
+[**repos_upstream_generic_read**](ReposApi.md#repos_upstream_generic_read) | **GET** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Retrieve a Generic upstream config for this repository.
+[**repos_upstream_generic_update**](ReposApi.md#repos_upstream_generic_update) | **PUT** /repos/{owner}/{identifier}/upstream/generic/{slug_perm}/ | Update a Generic upstream config for this repository.
[**repos_upstream_go_create**](ReposApi.md#repos_upstream_go_create) | **POST** /repos/{owner}/{identifier}/upstream/go/ | Create a Go upstream config for this repository.
[**repos_upstream_go_delete**](ReposApi.md#repos_upstream_go_delete) | **DELETE** /repos/{owner}/{identifier}/upstream/go/{slug_perm}/ | Delete a Go upstream config for this repository.
[**repos_upstream_go_list**](ReposApi.md#repos_upstream_go_list) | **GET** /repos/{owner}/{identifier}/upstream/go/ | List Go upstream configs for this repository.
@@ -4612,6 +4618,401 @@ Name | Type | Description | Notes
+# **repos_upstream_generic_create**
+> GenericUpstream repos_upstream_generic_create(owner, identifier, opts)
+
+Create a Generic upstream config for this repository.
+
+Create a Generic upstream config for this repository.
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::ReposApi.new
+
+owner = 'owner_example' # String |
+
+identifier = 'identifier_example' # String |
+
+opts = {
+ data: CloudsmithApi::GenericUpstreamRequest.new # GenericUpstreamRequest |
+}
+
+begin
+ #Create a Generic upstream config for this repository.
+ result = api_instance.repos_upstream_generic_create(owner, identifier, opts)
+ p result
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling ReposApi->repos_upstream_generic_create: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **data** | [**GenericUpstreamRequest**](GenericUpstreamRequest.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
+# **repos_upstream_generic_delete**
+> repos_upstream_generic_delete(owner, identifier, slug_perm)
+
+Delete a Generic upstream config for this repository.
+
+Delete a Generic upstream config for this repository.
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::ReposApi.new
+
+owner = 'owner_example' # String |
+
+identifier = 'identifier_example' # String |
+
+slug_perm = 'slug_perm_example' # String |
+
+
+begin
+ #Delete a Generic upstream config for this repository.
+ api_instance.repos_upstream_generic_delete(owner, identifier, slug_perm)
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling ReposApi->repos_upstream_generic_delete: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slug_perm** | **String**| |
+
+### Return type
+
+nil (empty response body)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
+# **repos_upstream_generic_list**
+> Array<GenericUpstream> repos_upstream_generic_list(owner, identifier, opts)
+
+List Generic upstream configs for this repository.
+
+List Generic upstream configs for this repository.
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::ReposApi.new
+
+owner = 'owner_example' # String |
+
+identifier = 'identifier_example' # String |
+
+opts = {
+ page: 56, # Integer | A page number within the paginated result set.
+ page_size: 56 # Integer | Number of results to return per page.
+}
+
+begin
+ #List Generic upstream configs for this repository.
+ result = api_instance.repos_upstream_generic_list(owner, identifier, opts)
+ p result
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling ReposApi->repos_upstream_generic_list: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **page** | **Integer**| A page number within the paginated result set. | [optional]
+ **page_size** | **Integer**| Number of results to return per page. | [optional]
+
+### Return type
+
+[**Array<GenericUpstream>**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
+# **repos_upstream_generic_partial_update**
+> GenericUpstream repos_upstream_generic_partial_update(owner, identifier, slug_perm, opts)
+
+Partially update a Generic upstream config for this repository.
+
+Partially update a Generic upstream config for this repository.
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::ReposApi.new
+
+owner = 'owner_example' # String |
+
+identifier = 'identifier_example' # String |
+
+slug_perm = 'slug_perm_example' # String |
+
+opts = {
+ data: CloudsmithApi::GenericUpstreamRequestPatch.new # GenericUpstreamRequestPatch |
+}
+
+begin
+ #Partially update a Generic upstream config for this repository.
+ result = api_instance.repos_upstream_generic_partial_update(owner, identifier, slug_perm, opts)
+ p result
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling ReposApi->repos_upstream_generic_partial_update: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slug_perm** | **String**| |
+ **data** | [**GenericUpstreamRequestPatch**](GenericUpstreamRequestPatch.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
+# **repos_upstream_generic_read**
+> GenericUpstream repos_upstream_generic_read(owner, identifier, slug_perm)
+
+Retrieve a Generic upstream config for this repository.
+
+Retrieve a Generic upstream config for this repository.
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::ReposApi.new
+
+owner = 'owner_example' # String |
+
+identifier = 'identifier_example' # String |
+
+slug_perm = 'slug_perm_example' # String |
+
+
+begin
+ #Retrieve a Generic upstream config for this repository.
+ result = api_instance.repos_upstream_generic_read(owner, identifier, slug_perm)
+ p result
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling ReposApi->repos_upstream_generic_read: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slug_perm** | **String**| |
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
+# **repos_upstream_generic_update**
+> GenericUpstream repos_upstream_generic_update(owner, identifier, slug_perm, opts)
+
+Update a Generic upstream config for this repository.
+
+Update a Generic upstream config for this repository.
+
+### Example
+```ruby
+# load the gem
+require 'cloudsmith-api'
+# setup authorization
+CloudsmithApi.configure do |config|
+ # Configure API key authorization: apikey
+ config.api_key['X-Api-Key'] = 'YOUR API KEY'
+ # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
+ #config.api_key_prefix['X-Api-Key'] = 'Bearer'
+
+ # Configure HTTP basic authorization: basic
+ config.username = 'YOUR USERNAME'
+ config.password = 'YOUR PASSWORD'
+end
+
+api_instance = CloudsmithApi::ReposApi.new
+
+owner = 'owner_example' # String |
+
+identifier = 'identifier_example' # String |
+
+slug_perm = 'slug_perm_example' # String |
+
+opts = {
+ data: CloudsmithApi::GenericUpstreamRequest.new # GenericUpstreamRequest |
+}
+
+begin
+ #Update a Generic upstream config for this repository.
+ result = api_instance.repos_upstream_generic_update(owner, identifier, slug_perm, opts)
+ p result
+rescue CloudsmithApi::ApiError => e
+ puts "Exception when calling ReposApi->repos_upstream_generic_update: #{e}"
+end
+```
+
+### Parameters
+
+Name | Type | Description | Notes
+------------- | ------------- | ------------- | -------------
+ **owner** | **String**| |
+ **identifier** | **String**| |
+ **slug_perm** | **String**| |
+ **data** | [**GenericUpstreamRequest**](GenericUpstreamRequest.md)| | [optional]
+
+### Return type
+
+[**GenericUpstream**](GenericUpstream.md)
+
+### Authorization
+
+[apikey](../README.md#apikey), [basic](../README.md#basic)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+
+
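The `repos_upstream_generic_partial_update` and `repos_upstream_generic_update` calls documented above differ mainly in HTTP verb and request model: the former issues a `PATCH` with a `GenericUpstreamRequestPatch`, the latter a `PUT` with a full `GenericUpstreamRequest`. The sketch below shows how a caller might choose between them, following the usual REST convention that `PATCH` changes only the supplied fields while `PUT` replaces the whole config. The request models' field names are not shown in this diff, so the sketch constructs empty requests as placeholders.

```ruby
require 'cloudsmith-api'

CloudsmithApi.configure do |config|
  config.api_key['X-Api-Key'] = 'YOUR API KEY'
end

api = CloudsmithApi::ReposApi.new

# Inspect the current config first (GET).
current = api.repos_upstream_generic_read('owner', 'identifier', 'slug_perm')
p current

# PATCH: send only the fields to change via GenericUpstreamRequestPatch.
api.repos_upstream_generic_partial_update(
  'owner', 'identifier', 'slug_perm',
  data: CloudsmithApi::GenericUpstreamRequestPatch.new # fields omitted; not shown in this diff
)

# PUT: send a complete replacement config via GenericUpstreamRequest.
api.repos_upstream_generic_update(
  'owner', 'identifier', 'slug_perm',
  data: CloudsmithApi::GenericUpstreamRequest.new # fields omitted; not shown in this diff
)
```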
# **repos_upstream_go_create**
> GoUpstream repos_upstream_go_create(owner, identifier, opts)
diff --git a/bindings/ruby/src/lib/cloudsmith-api.rb b/bindings/ruby/src/lib/cloudsmith-api.rb
index 2d5e5a35..9e8dbc89 100644
--- a/bindings/ruby/src/lib/cloudsmith-api.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api.rb
@@ -78,6 +78,11 @@
require 'cloudsmith-api/models/format'
require 'cloudsmith-api/models/format_support'
require 'cloudsmith-api/models/format_support_upstream'
+require 'cloudsmith-api/models/generic_package_upload'
+require 'cloudsmith-api/models/generic_package_upload_request'
+require 'cloudsmith-api/models/generic_upstream'
+require 'cloudsmith-api/models/generic_upstream_request'
+require 'cloudsmith-api/models/generic_upstream_request_patch'
require 'cloudsmith-api/models/geo_ip_location'
require 'cloudsmith-api/models/go_package_upload'
require 'cloudsmith-api/models/go_package_upload_request'
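For reference, once `require 'cloudsmith-api'` has pulled in the requires added above, the new Generic model classes resolve as constants; a throwaway sanity check (not part of the generated code) might be:

```ruby
require 'cloudsmith-api'

# The Generic models registered above should now be resolvable constants.
[
  CloudsmithApi::GenericPackageUpload,
  CloudsmithApi::GenericPackageUploadRequest,
  CloudsmithApi::GenericUpstream,
  CloudsmithApi::GenericUpstreamRequest,
  CloudsmithApi::GenericUpstreamRequestPatch
].each { |klass| puts klass.name }
```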
diff --git a/bindings/ruby/src/lib/cloudsmith-api/api/packages_api.rb b/bindings/ruby/src/lib/cloudsmith-api/api/packages_api.rb
index 627f74cd..91fc4dc9 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/api/packages_api.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/api/packages_api.rb
@@ -1524,6 +1524,68 @@ def packages_upload_docker_with_http_info(owner, repo, opts = {})
end
return data, status_code, headers
end
+ # Create a new Generic package
+ # Create a new Generic package
+ # @param owner
+ # @param repo
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericPackageUploadRequest] :data
+ # @return [GenericPackageUpload]
+ def packages_upload_generic(owner, repo, opts = {})
+ data, _status_code, _headers = packages_upload_generic_with_http_info(owner, repo, opts)
+ data
+ end
+
+ # Create a new Generic package
+ # Create a new Generic package
+ # @param owner
+ # @param repo
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericPackageUploadRequest] :data
+ # @return [Array<(GenericPackageUpload, Fixnum, Hash)>] GenericPackageUpload data, response status code and response headers
+ def packages_upload_generic_with_http_info(owner, repo, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: PackagesApi.packages_upload_generic ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling PackagesApi.packages_upload_generic"
+ end
+ # verify the required parameter 'repo' is set
+ if @api_client.config.client_side_validation && repo.nil?
+ fail ArgumentError, "Missing the required parameter 'repo' when calling PackagesApi.packages_upload_generic"
+ end
+ # resource path
+ local_var_path = '/packages/{owner}/{repo}/upload/generic/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'repo' + '}', repo.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = @api_client.object_to_http_body(opts[:'data'])
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:POST, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names,
+ :return_type => 'GenericPackageUpload')
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: PackagesApi#packages_upload_generic\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
# Create a new Go package
# Create a new Go package
# @param owner
@@ -3126,6 +3188,67 @@ def packages_validate_upload_docker_with_http_info(owner, repo, opts = {})
end
return data, status_code, headers
end
+ # Validate parameters for create Generic package
+ # Validate parameters for create Generic package
+ # @param owner
+ # @param repo
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericPackageUploadRequest] :data
+ # @return [nil]
+ def packages_validate_upload_generic(owner, repo, opts = {})
+ packages_validate_upload_generic_with_http_info(owner, repo, opts)
+ nil
+ end
+
+ # Validate parameters for create Generic package
+ # Validate parameters for create Generic package
+ # @param owner
+ # @param repo
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericPackageUploadRequest] :data
+ # @return [Array<(nil, Fixnum, Hash)>] nil, response status code and response headers
+ def packages_validate_upload_generic_with_http_info(owner, repo, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: PackagesApi.packages_validate_upload_generic ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling PackagesApi.packages_validate_upload_generic"
+ end
+ # verify the required parameter 'repo' is set
+ if @api_client.config.client_side_validation && repo.nil?
+ fail ArgumentError, "Missing the required parameter 'repo' when calling PackagesApi.packages_validate_upload_generic"
+ end
+ # resource path
+ local_var_path = '/packages/{owner}/{repo}/validate-upload/generic/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'repo' + '}', repo.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = @api_client.object_to_http_body(opts[:'data'])
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:POST, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names)
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: PackagesApi#packages_validate_upload_generic\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
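Taken together, the two methods above give a validate-then-upload flow for Generic packages. The sketch below assumes `package_file` carries the identifier of a file uploaded in a separate step (that step is outside this diff), so treat that value as a placeholder; the other field names come from `GenericPackageUploadRequest` later in this patch.

```ruby
require 'cloudsmith-api'

CloudsmithApi.configure do |config|
  config.api_key['X-Api-Key'] = 'YOUR API KEY'
end

packages = CloudsmithApi::PackagesApi.new

request = CloudsmithApi::GenericPackageUploadRequest.new(
  name: 'my-artifact',
  version: '1.0.0',
  package_file: 'FILE_IDENTIFIER' # placeholder: identifier of an already-uploaded file
)

begin
  # Dry-run the parameters first, then perform the real upload.
  packages.packages_validate_upload_generic('owner', 'repo', data: request)
  package = packages.packages_upload_generic('owner', 'repo', data: request)
  p package.slug_perm
rescue CloudsmithApi::ApiError => e
  puts "Upload failed: #{e}"
end
```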
# Validate parameters for create Go package
# Validate parameters for create Go package
# @param owner
diff --git a/bindings/ruby/src/lib/cloudsmith-api/api/repos_api.rb b/bindings/ruby/src/lib/cloudsmith-api/api/repos_api.rb
index 35bf42f7..66bb525d 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/api/repos_api.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/api/repos_api.rb
@@ -4426,6 +4426,401 @@ def repos_upstream_docker_update_with_http_info(owner, identifier, slug_perm, op
end
return data, status_code, headers
end
+ # Create a Generic upstream config for this repository.
+ # Create a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequest] :data
+ # @return [GenericUpstream]
+ def repos_upstream_generic_create(owner, identifier, opts = {})
+ data, _status_code, _headers = repos_upstream_generic_create_with_http_info(owner, identifier, opts)
+ data
+ end
+
+ # Create a Generic upstream config for this repository.
+ # Create a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequest] :data
+ # @return [Array<(GenericUpstream, Fixnum, Hash)>] GenericUpstream data, response status code and response headers
+ def repos_upstream_generic_create_with_http_info(owner, identifier, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: ReposApi.repos_upstream_generic_create ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling ReposApi.repos_upstream_generic_create"
+ end
+ # verify the required parameter 'identifier' is set
+ if @api_client.config.client_side_validation && identifier.nil?
+ fail ArgumentError, "Missing the required parameter 'identifier' when calling ReposApi.repos_upstream_generic_create"
+ end
+ # resource path
+ local_var_path = '/repos/{owner}/{identifier}/upstream/generic/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'identifier' + '}', identifier.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = @api_client.object_to_http_body(opts[:'data'])
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:POST, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names,
+ :return_type => 'GenericUpstream')
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: ReposApi#repos_upstream_generic_create\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
+ # Delete a Generic upstream config for this repository.
+ # Delete a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @return [nil]
+ def repos_upstream_generic_delete(owner, identifier, slug_perm, opts = {})
+ repos_upstream_generic_delete_with_http_info(owner, identifier, slug_perm, opts)
+ nil
+ end
+
+ # Delete a Generic upstream config for this repository.
+ # Delete a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @return [Array<(nil, Fixnum, Hash)>] nil, response status code and response headers
+ def repos_upstream_generic_delete_with_http_info(owner, identifier, slug_perm, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: ReposApi.repos_upstream_generic_delete ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling ReposApi.repos_upstream_generic_delete"
+ end
+ # verify the required parameter 'identifier' is set
+ if @api_client.config.client_side_validation && identifier.nil?
+ fail ArgumentError, "Missing the required parameter 'identifier' when calling ReposApi.repos_upstream_generic_delete"
+ end
+ # verify the required parameter 'slug_perm' is set
+ if @api_client.config.client_side_validation && slug_perm.nil?
+ fail ArgumentError, "Missing the required parameter 'slug_perm' when calling ReposApi.repos_upstream_generic_delete"
+ end
+ # resource path
+ local_var_path = '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'identifier' + '}', identifier.to_s).sub('{' + 'slug_perm' + '}', slug_perm.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = nil
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:DELETE, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names)
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: ReposApi#repos_upstream_generic_delete\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
+ # List Generic upstream configs for this repository.
+ # List Generic upstream configs for this repository.
+ # @param owner
+ # @param identifier
+ # @param [Hash] opts the optional parameters
+ # @option opts [Integer] :page A page number within the paginated result set.
+ # @option opts [Integer] :page_size Number of results to return per page.
+ # @return [Array]
+ def repos_upstream_generic_list(owner, identifier, opts = {})
+ data, _status_code, _headers = repos_upstream_generic_list_with_http_info(owner, identifier, opts)
+ data
+ end
+
+ # List Generic upstream configs for this repository.
+ # List Generic upstream configs for this repository.
+ # @param owner
+ # @param identifier
+ # @param [Hash] opts the optional parameters
+ # @option opts [Integer] :page A page number within the paginated result set.
+ # @option opts [Integer] :page_size Number of results to return per page.
+ # @return [Array<(Array, Fixnum, Hash)>] Array data, response status code and response headers
+ def repos_upstream_generic_list_with_http_info(owner, identifier, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: ReposApi.repos_upstream_generic_list ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling ReposApi.repos_upstream_generic_list"
+ end
+ # verify the required parameter 'identifier' is set
+ if @api_client.config.client_side_validation && identifier.nil?
+ fail ArgumentError, "Missing the required parameter 'identifier' when calling ReposApi.repos_upstream_generic_list"
+ end
+ # resource path
+ local_var_path = '/repos/{owner}/{identifier}/upstream/generic/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'identifier' + '}', identifier.to_s)
+
+ # query parameters
+ query_params = {}
+ query_params[:'page'] = opts[:'page'] if !opts[:'page'].nil?
+ query_params[:'page_size'] = opts[:'page_size'] if !opts[:'page_size'].nil?
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = nil
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:GET, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names,
+ :return_type => 'Array')
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: ReposApi#repos_upstream_generic_list\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
+ # Partially update a Generic upstream config for this repository.
+ # Partially update a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequestPatch] :data
+ # @return [GenericUpstream]
+ def repos_upstream_generic_partial_update(owner, identifier, slug_perm, opts = {})
+ data, _status_code, _headers = repos_upstream_generic_partial_update_with_http_info(owner, identifier, slug_perm, opts)
+ data
+ end
+
+ # Partially update a Generic upstream config for this repository.
+ # Partially update a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequestPatch] :data
+ # @return [Array<(GenericUpstream, Fixnum, Hash)>] GenericUpstream data, response status code and response headers
+ def repos_upstream_generic_partial_update_with_http_info(owner, identifier, slug_perm, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: ReposApi.repos_upstream_generic_partial_update ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling ReposApi.repos_upstream_generic_partial_update"
+ end
+ # verify the required parameter 'identifier' is set
+ if @api_client.config.client_side_validation && identifier.nil?
+ fail ArgumentError, "Missing the required parameter 'identifier' when calling ReposApi.repos_upstream_generic_partial_update"
+ end
+ # verify the required parameter 'slug_perm' is set
+ if @api_client.config.client_side_validation && slug_perm.nil?
+ fail ArgumentError, "Missing the required parameter 'slug_perm' when calling ReposApi.repos_upstream_generic_partial_update"
+ end
+ # resource path
+ local_var_path = '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'identifier' + '}', identifier.to_s).sub('{' + 'slug_perm' + '}', slug_perm.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = @api_client.object_to_http_body(opts[:'data'])
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:PATCH, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names,
+ :return_type => 'GenericUpstream')
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: ReposApi#repos_upstream_generic_partial_update\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
+ # Retrieve a Generic upstream config for this repository.
+ # Retrieve a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @return [GenericUpstream]
+ def repos_upstream_generic_read(owner, identifier, slug_perm, opts = {})
+ data, _status_code, _headers = repos_upstream_generic_read_with_http_info(owner, identifier, slug_perm, opts)
+ data
+ end
+
+ # Retrieve a Generic upstream config for this repository.
+ # Retrieve a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @return [Array<(GenericUpstream, Fixnum, Hash)>] GenericUpstream data, response status code and response headers
+ def repos_upstream_generic_read_with_http_info(owner, identifier, slug_perm, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: ReposApi.repos_upstream_generic_read ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling ReposApi.repos_upstream_generic_read"
+ end
+ # verify the required parameter 'identifier' is set
+ if @api_client.config.client_side_validation && identifier.nil?
+ fail ArgumentError, "Missing the required parameter 'identifier' when calling ReposApi.repos_upstream_generic_read"
+ end
+ # verify the required parameter 'slug_perm' is set
+ if @api_client.config.client_side_validation && slug_perm.nil?
+ fail ArgumentError, "Missing the required parameter 'slug_perm' when calling ReposApi.repos_upstream_generic_read"
+ end
+ # resource path
+ local_var_path = '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'identifier' + '}', identifier.to_s).sub('{' + 'slug_perm' + '}', slug_perm.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = nil
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:GET, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names,
+ :return_type => 'GenericUpstream')
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: ReposApi#repos_upstream_generic_read\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
+ # Update a Generic upstream config for this repository.
+ # Update a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequest] :data
+ # @return [GenericUpstream]
+ def repos_upstream_generic_update(owner, identifier, slug_perm, opts = {})
+ data, _status_code, _headers = repos_upstream_generic_update_with_http_info(owner, identifier, slug_perm, opts)
+ data
+ end
+
+ # Update a Generic upstream config for this repository.
+ # Update a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequest] :data
+ # @return [Array<(GenericUpstream, Fixnum, Hash)>] GenericUpstream data, response status code and response headers
+ def repos_upstream_generic_update_with_http_info(owner, identifier, slug_perm, opts = {})
+ if @api_client.config.debugging
+ @api_client.config.logger.debug 'Calling API: ReposApi.repos_upstream_generic_update ...'
+ end
+ # verify the required parameter 'owner' is set
+ if @api_client.config.client_side_validation && owner.nil?
+ fail ArgumentError, "Missing the required parameter 'owner' when calling ReposApi.repos_upstream_generic_update"
+ end
+ # verify the required parameter 'identifier' is set
+ if @api_client.config.client_side_validation && identifier.nil?
+ fail ArgumentError, "Missing the required parameter 'identifier' when calling ReposApi.repos_upstream_generic_update"
+ end
+ # verify the required parameter 'slug_perm' is set
+ if @api_client.config.client_side_validation && slug_perm.nil?
+ fail ArgumentError, "Missing the required parameter 'slug_perm' when calling ReposApi.repos_upstream_generic_update"
+ end
+ # resource path
+ local_var_path = '/repos/{owner}/{identifier}/upstream/generic/{slug_perm}/'.sub('{' + 'owner' + '}', owner.to_s).sub('{' + 'identifier' + '}', identifier.to_s).sub('{' + 'slug_perm' + '}', slug_perm.to_s)
+
+ # query parameters
+ query_params = {}
+
+ # header parameters
+ header_params = {}
+ # HTTP header 'Accept' (if needed)
+ header_params['Accept'] = @api_client.select_header_accept(['application/json'])
+ # HTTP header 'Content-Type'
+ header_params['Content-Type'] = @api_client.select_header_content_type(['application/json'])
+
+ # form parameters
+ form_params = {}
+
+ # http body (model)
+ post_body = @api_client.object_to_http_body(opts[:'data'])
+ auth_names = ['apikey', 'basic']
+ data, status_code, headers = @api_client.call_api(:PUT, local_var_path,
+ :header_params => header_params,
+ :query_params => query_params,
+ :form_params => form_params,
+ :body => post_body,
+ :auth_names => auth_names,
+ :return_type => 'GenericUpstream')
+ if @api_client.config.debugging
+ @api_client.config.logger.debug "API called: ReposApi#repos_upstream_generic_update\nData: #{data.inspect}\nStatus code: #{status_code}\nHeaders: #{headers}"
+ end
+ return data, status_code, headers
+ end
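The six methods above complete the CRUD surface for Generic upstreams. A short sketch of the list/read/delete side, which has no docs example earlier in this diff, might look like the following (owner, identifier, slug_perm, and pagination values are arbitrary placeholders):

```ruby
require 'cloudsmith-api'

CloudsmithApi.configure do |config|
  config.api_key['X-Api-Key'] = 'YOUR API KEY'
end

repos = CloudsmithApi::ReposApi.new

# Page through the Generic upstream configs for a repository.
upstreams = repos.repos_upstream_generic_list('owner', 'identifier', page: 1, page_size: 25)
p upstreams

# Fetch and then remove a single config by its slug_perm.
config = repos.repos_upstream_generic_read('owner', 'identifier', 'slug_perm_example')
p config
repos.repos_upstream_generic_delete('owner', 'identifier', 'slug_perm_example')
```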
# Create a Go upstream config for this repository.
# Create a Go upstream config for this repository.
# @param owner
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/format_support.rb b/bindings/ruby/src/lib/cloudsmith-api/models/format_support.rb
index 8fe2f2aa..5f4e19e0 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/format_support.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/format_support.rb
@@ -24,6 +24,9 @@ class FormatSupport
# If true the package format supports file lists
attr_accessor :file_lists
+ # If true the package format supports filepaths
+ attr_accessor :filepaths
+
# If true the package format supports metadata
attr_accessor :metadata
@@ -38,6 +41,7 @@ def self.attribute_map
:'dependencies' => :'dependencies',
:'distributions' => :'distributions',
:'file_lists' => :'file_lists',
+ :'filepaths' => :'filepaths',
:'metadata' => :'metadata',
:'upstreams' => :'upstreams',
:'versioning' => :'versioning'
@@ -50,6 +54,7 @@ def self.swagger_types
:'dependencies' => :'BOOLEAN',
:'distributions' => :'BOOLEAN',
:'file_lists' => :'BOOLEAN',
+ :'filepaths' => :'BOOLEAN',
:'metadata' => :'BOOLEAN',
:'upstreams' => :'FormatSupportUpstream',
:'versioning' => :'BOOLEAN'
@@ -76,6 +81,10 @@ def initialize(attributes = {})
self.file_lists = attributes[:'file_lists']
end
+ if attributes.has_key?(:'filepaths')
+ self.filepaths = attributes[:'filepaths']
+ end
+
if attributes.has_key?(:'metadata')
self.metadata = attributes[:'metadata']
end
@@ -105,6 +114,10 @@ def list_invalid_properties
invalid_properties.push('invalid value for "file_lists", file_lists cannot be nil.')
end
+ if @filepaths.nil?
+ invalid_properties.push('invalid value for "filepaths", filepaths cannot be nil.')
+ end
+
if @metadata.nil?
invalid_properties.push('invalid value for "metadata", metadata cannot be nil.')
end
@@ -126,6 +139,7 @@ def valid?
return false if @dependencies.nil?
return false if @distributions.nil?
return false if @file_lists.nil?
+ return false if @filepaths.nil?
return false if @metadata.nil?
return false if @upstreams.nil?
return false if @versioning.nil?
@@ -140,6 +154,7 @@ def ==(o)
dependencies == o.dependencies &&
distributions == o.distributions &&
file_lists == o.file_lists &&
+ filepaths == o.filepaths &&
metadata == o.metadata &&
upstreams == o.upstreams &&
versioning == o.versioning
@@ -154,7 +169,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [dependencies, distributions, file_lists, metadata, upstreams, versioning].hash
+ [dependencies, distributions, file_lists, filepaths, metadata, upstreams, versioning].hash
end
# Builds the object from hash
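The `filepaths` flag added to `FormatSupport` above behaves like the other capability booleans: it must be non-nil for the model to validate. A small sketch, constructing the model locally rather than fetching it from the API:

```ruby
require 'cloudsmith-api'

support = CloudsmithApi::FormatSupport.new(
  dependencies: false,
  distributions: false,
  file_lists: true,
  filepaths: true,   # new flag: true if the package format supports filepaths
  metadata: true,
  upstreams: CloudsmithApi::FormatSupportUpstream.new,
  versioning: true
)

puts support.valid?                 # false only if a required flag (including filepaths) is nil
p support.list_invalid_properties   # empty when all required flags are set
```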
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/generic_package_upload.rb b/bindings/ruby/src/lib/cloudsmith-api/models/generic_package_upload.rb
new file mode 100644
index 00000000..39abec9d
--- /dev/null
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/generic_package_upload.rb
@@ -0,0 +1,992 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'date'
+
+module CloudsmithApi
+class GenericPackageUpload
+ attr_accessor :architectures
+
+ attr_accessor :cdn_url
+
+ attr_accessor :checksum_md5
+
+ attr_accessor :checksum_sha1
+
+ attr_accessor :checksum_sha256
+
+ attr_accessor :checksum_sha512
+
+ # A checksum of all of the package's dependencies.
+ attr_accessor :dependencies_checksum_md5
+
+ attr_accessor :dependencies_url
+
+ # A textual description of this package.
+ attr_accessor :description
+
+ attr_accessor :display_name
+
+ attr_accessor :distro
+
+ attr_accessor :distro_version
+
+ attr_accessor :downloads
+
+ # The epoch of the package version (if any).
+ attr_accessor :epoch
+
+ attr_accessor :extension
+
+ attr_accessor :filename
+
+ attr_accessor :files
+
+ attr_accessor :format
+
+ attr_accessor :format_url
+
+ # Amount of storage that will be freed if this package is deleted
+ attr_accessor :freeable_storage
+
+ attr_accessor :fully_qualified_name
+
+ # Unique and permanent identifier for the package.
+ attr_accessor :identifier_perm
+
+ # Return a map of identifier field names and their values.
+ attr_accessor :identifiers
+
+ attr_accessor :indexed
+
+ attr_accessor :is_cancellable
+
+ attr_accessor :is_copyable
+
+ attr_accessor :is_deleteable
+
+ attr_accessor :is_downloadable
+
+ attr_accessor :is_moveable
+
+ attr_accessor :is_quarantinable
+
+ attr_accessor :is_quarantined
+
+ attr_accessor :is_resyncable
+
+ attr_accessor :is_security_scannable
+
+ attr_accessor :is_sync_awaiting
+
+ attr_accessor :is_sync_completed
+
+ attr_accessor :is_sync_failed
+
+ attr_accessor :is_sync_in_flight
+
+ attr_accessor :is_sync_in_progress
+
+ # The license of this package.
+ attr_accessor :license
+
+ # The name of this package.
+ attr_accessor :name
+
+ attr_accessor :namespace
+
+ attr_accessor :namespace_url
+
+ attr_accessor :num_files
+
+ attr_accessor :origin_repository
+
+ attr_accessor :origin_repository_url
+
+ # The type of package contents.
+ attr_accessor :package_type
+
+ # Whether or not the package has violated any policy.
+ attr_accessor :policy_violated
+
+ # The raw license string.
+ attr_accessor :raw_license
+
+ # The release of the package version (if any).
+ attr_accessor :release
+
+ attr_accessor :repository
+
+ attr_accessor :repository_url
+
+ # The datetime the security scanning was completed.
+ attr_accessor :security_scan_completed_at
+
+ # The datetime the security scanning was started.
+ attr_accessor :security_scan_started_at
+
+ attr_accessor :security_scan_status
+
+ # The datetime the security scanning status was updated.
+ attr_accessor :security_scan_status_updated_at
+
+ attr_accessor :self_html_url
+
+ attr_accessor :self_url
+
+ attr_accessor :signature_url
+
+ # The calculated size of the package.
+ attr_accessor :size
+
+ # The public unique identifier for the package.
+ attr_accessor :slug
+
+ attr_accessor :slug_perm
+
+ # The SPDX license identifier for this package.
+ attr_accessor :spdx_license
+
+ # The synchronisation (in progress) stage of the package.
+ attr_accessor :stage
+
+ attr_accessor :stage_str
+
+ # The datetime the package stage was updated at.
+ attr_accessor :stage_updated_at
+
+ # The synchronisation status of the package.
+ attr_accessor :status
+
+ # A textual description for the synchronous status reason (if any).
+ # (if any).
+ attr_accessor :status_reason
+
+ attr_accessor :status_str
+
+ # The datetime the package status was updated at.
+ attr_accessor :status_updated_at
+
+ attr_accessor :status_url
+
+ attr_accessor :subtype
+
+ # A one-liner synopsis of this package.
+ attr_accessor :summary
+
+ # The datetime the package sync was finished at.
+ attr_accessor :sync_finished_at
+
+ # Synchronisation progress (from 0-100)
+ attr_accessor :sync_progress
+
+ attr_accessor :tags_automatic
+
+ attr_accessor :tags_immutable
+
+ attr_accessor :type_display
+
+ # The date this package was uploaded.
+ attr_accessor :uploaded_at
+
+ attr_accessor :uploader
+
+ attr_accessor :uploader_url
+
+ # The raw version for this package.
+ attr_accessor :version
+
+ attr_accessor :version_orig
+
+ attr_accessor :vulnerability_scan_results_url
+
+ class EnumAttributeValidator
+ attr_reader :datatype
+ attr_reader :allowable_values
+
+ def initialize(datatype, allowable_values)
+ @allowable_values = allowable_values.map do |value|
+ case datatype.to_s
+ when /Integer/i
+ value.to_i
+ when /Float/i
+ value.to_f
+ else
+ value
+ end
+ end
+ end
+
+ def valid?(value)
+ !value || allowable_values.include?(value)
+ end
+ end
+
+ # Attribute mapping from ruby-style variable name to JSON key.
+ def self.attribute_map
+ {
+ :'architectures' => :'architectures',
+ :'cdn_url' => :'cdn_url',
+ :'checksum_md5' => :'checksum_md5',
+ :'checksum_sha1' => :'checksum_sha1',
+ :'checksum_sha256' => :'checksum_sha256',
+ :'checksum_sha512' => :'checksum_sha512',
+ :'dependencies_checksum_md5' => :'dependencies_checksum_md5',
+ :'dependencies_url' => :'dependencies_url',
+ :'description' => :'description',
+ :'display_name' => :'display_name',
+ :'distro' => :'distro',
+ :'distro_version' => :'distro_version',
+ :'downloads' => :'downloads',
+ :'epoch' => :'epoch',
+ :'extension' => :'extension',
+ :'filename' => :'filename',
+ :'files' => :'files',
+ :'format' => :'format',
+ :'format_url' => :'format_url',
+ :'freeable_storage' => :'freeable_storage',
+ :'fully_qualified_name' => :'fully_qualified_name',
+ :'identifier_perm' => :'identifier_perm',
+ :'identifiers' => :'identifiers',
+ :'indexed' => :'indexed',
+ :'is_cancellable' => :'is_cancellable',
+ :'is_copyable' => :'is_copyable',
+ :'is_deleteable' => :'is_deleteable',
+ :'is_downloadable' => :'is_downloadable',
+ :'is_moveable' => :'is_moveable',
+ :'is_quarantinable' => :'is_quarantinable',
+ :'is_quarantined' => :'is_quarantined',
+ :'is_resyncable' => :'is_resyncable',
+ :'is_security_scannable' => :'is_security_scannable',
+ :'is_sync_awaiting' => :'is_sync_awaiting',
+ :'is_sync_completed' => :'is_sync_completed',
+ :'is_sync_failed' => :'is_sync_failed',
+ :'is_sync_in_flight' => :'is_sync_in_flight',
+ :'is_sync_in_progress' => :'is_sync_in_progress',
+ :'license' => :'license',
+ :'name' => :'name',
+ :'namespace' => :'namespace',
+ :'namespace_url' => :'namespace_url',
+ :'num_files' => :'num_files',
+ :'origin_repository' => :'origin_repository',
+ :'origin_repository_url' => :'origin_repository_url',
+ :'package_type' => :'package_type',
+ :'policy_violated' => :'policy_violated',
+ :'raw_license' => :'raw_license',
+ :'release' => :'release',
+ :'repository' => :'repository',
+ :'repository_url' => :'repository_url',
+ :'security_scan_completed_at' => :'security_scan_completed_at',
+ :'security_scan_started_at' => :'security_scan_started_at',
+ :'security_scan_status' => :'security_scan_status',
+ :'security_scan_status_updated_at' => :'security_scan_status_updated_at',
+ :'self_html_url' => :'self_html_url',
+ :'self_url' => :'self_url',
+ :'signature_url' => :'signature_url',
+ :'size' => :'size',
+ :'slug' => :'slug',
+ :'slug_perm' => :'slug_perm',
+ :'spdx_license' => :'spdx_license',
+ :'stage' => :'stage',
+ :'stage_str' => :'stage_str',
+ :'stage_updated_at' => :'stage_updated_at',
+ :'status' => :'status',
+ :'status_reason' => :'status_reason',
+ :'status_str' => :'status_str',
+ :'status_updated_at' => :'status_updated_at',
+ :'status_url' => :'status_url',
+ :'subtype' => :'subtype',
+ :'summary' => :'summary',
+ :'sync_finished_at' => :'sync_finished_at',
+ :'sync_progress' => :'sync_progress',
+ :'tags_automatic' => :'tags_automatic',
+ :'tags_immutable' => :'tags_immutable',
+ :'type_display' => :'type_display',
+ :'uploaded_at' => :'uploaded_at',
+ :'uploader' => :'uploader',
+ :'uploader_url' => :'uploader_url',
+ :'version' => :'version',
+ :'version_orig' => :'version_orig',
+ :'vulnerability_scan_results_url' => :'vulnerability_scan_results_url'
+ }
+ end
+
+ # Attribute type mapping.
+ def self.swagger_types
+ {
+ :'architectures' => :'Array',
+ :'cdn_url' => :'String',
+ :'checksum_md5' => :'String',
+ :'checksum_sha1' => :'String',
+ :'checksum_sha256' => :'String',
+ :'checksum_sha512' => :'String',
+ :'dependencies_checksum_md5' => :'String',
+ :'dependencies_url' => :'String',
+ :'description' => :'String',
+ :'display_name' => :'String',
+ :'distro' => :'Distribution',
+ :'distro_version' => :'DistributionVersion',
+ :'downloads' => :'Integer',
+ :'epoch' => :'Integer',
+ :'extension' => :'String',
+ :'filename' => :'String',
+ :'files' => :'Array',
+ :'format' => :'String',
+ :'format_url' => :'String',
+ :'freeable_storage' => :'Integer',
+ :'fully_qualified_name' => :'String',
+ :'identifier_perm' => :'String',
+ :'identifiers' => :'Hash',
+ :'indexed' => :'BOOLEAN',
+ :'is_cancellable' => :'BOOLEAN',
+ :'is_copyable' => :'BOOLEAN',
+ :'is_deleteable' => :'BOOLEAN',
+ :'is_downloadable' => :'BOOLEAN',
+ :'is_moveable' => :'BOOLEAN',
+ :'is_quarantinable' => :'BOOLEAN',
+ :'is_quarantined' => :'BOOLEAN',
+ :'is_resyncable' => :'BOOLEAN',
+ :'is_security_scannable' => :'BOOLEAN',
+ :'is_sync_awaiting' => :'BOOLEAN',
+ :'is_sync_completed' => :'BOOLEAN',
+ :'is_sync_failed' => :'BOOLEAN',
+ :'is_sync_in_flight' => :'BOOLEAN',
+ :'is_sync_in_progress' => :'BOOLEAN',
+ :'license' => :'String',
+ :'name' => :'String',
+ :'namespace' => :'String',
+ :'namespace_url' => :'String',
+ :'num_files' => :'Integer',
+ :'origin_repository' => :'String',
+ :'origin_repository_url' => :'String',
+ :'package_type' => :'Integer',
+ :'policy_violated' => :'BOOLEAN',
+ :'raw_license' => :'String',
+ :'release' => :'String',
+ :'repository' => :'String',
+ :'repository_url' => :'String',
+ :'security_scan_completed_at' => :'DateTime',
+ :'security_scan_started_at' => :'DateTime',
+ :'security_scan_status' => :'String',
+ :'security_scan_status_updated_at' => :'DateTime',
+ :'self_html_url' => :'String',
+ :'self_url' => :'String',
+ :'signature_url' => :'String',
+ :'size' => :'Integer',
+ :'slug' => :'String',
+ :'slug_perm' => :'String',
+ :'spdx_license' => :'String',
+ :'stage' => :'Integer',
+ :'stage_str' => :'String',
+ :'stage_updated_at' => :'DateTime',
+ :'status' => :'Integer',
+ :'status_reason' => :'String',
+ :'status_str' => :'String',
+ :'status_updated_at' => :'DateTime',
+ :'status_url' => :'String',
+ :'subtype' => :'String',
+ :'summary' => :'String',
+ :'sync_finished_at' => :'DateTime',
+ :'sync_progress' => :'Integer',
+ :'tags_automatic' => :'Tags',
+ :'tags_immutable' => :'Tags',
+ :'type_display' => :'String',
+ :'uploaded_at' => :'DateTime',
+ :'uploader' => :'String',
+ :'uploader_url' => :'String',
+ :'version' => :'String',
+ :'version_orig' => :'String',
+ :'vulnerability_scan_results_url' => :'String'
+ }
+ end
+
+ # Initializes the object
+ # @param [Hash] attributes Model attributes in the form of hash
+ def initialize(attributes = {})
+ return unless attributes.is_a?(Hash)
+
+ # convert string to symbol for hash key
+ attributes = attributes.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
+
+ if attributes.has_key?(:'architectures')
+ if (value = attributes[:'architectures']).is_a?(Array)
+ self.architectures = value
+ end
+ end
+
+ if attributes.has_key?(:'cdn_url')
+ self.cdn_url = attributes[:'cdn_url']
+ end
+
+ if attributes.has_key?(:'checksum_md5')
+ self.checksum_md5 = attributes[:'checksum_md5']
+ end
+
+ if attributes.has_key?(:'checksum_sha1')
+ self.checksum_sha1 = attributes[:'checksum_sha1']
+ end
+
+ if attributes.has_key?(:'checksum_sha256')
+ self.checksum_sha256 = attributes[:'checksum_sha256']
+ end
+
+ if attributes.has_key?(:'checksum_sha512')
+ self.checksum_sha512 = attributes[:'checksum_sha512']
+ end
+
+ if attributes.has_key?(:'dependencies_checksum_md5')
+ self.dependencies_checksum_md5 = attributes[:'dependencies_checksum_md5']
+ end
+
+ if attributes.has_key?(:'dependencies_url')
+ self.dependencies_url = attributes[:'dependencies_url']
+ end
+
+ if attributes.has_key?(:'description')
+ self.description = attributes[:'description']
+ end
+
+ if attributes.has_key?(:'display_name')
+ self.display_name = attributes[:'display_name']
+ end
+
+ if attributes.has_key?(:'distro')
+ self.distro = attributes[:'distro']
+ end
+
+ if attributes.has_key?(:'distro_version')
+ self.distro_version = attributes[:'distro_version']
+ end
+
+ if attributes.has_key?(:'downloads')
+ self.downloads = attributes[:'downloads']
+ end
+
+ if attributes.has_key?(:'epoch')
+ self.epoch = attributes[:'epoch']
+ end
+
+ if attributes.has_key?(:'extension')
+ self.extension = attributes[:'extension']
+ end
+
+ if attributes.has_key?(:'filename')
+ self.filename = attributes[:'filename']
+ end
+
+ if attributes.has_key?(:'files')
+ if (value = attributes[:'files']).is_a?(Array)
+ self.files = value
+ end
+ end
+
+ if attributes.has_key?(:'format')
+ self.format = attributes[:'format']
+ end
+
+ if attributes.has_key?(:'format_url')
+ self.format_url = attributes[:'format_url']
+ end
+
+ if attributes.has_key?(:'freeable_storage')
+ self.freeable_storage = attributes[:'freeable_storage']
+ end
+
+ if attributes.has_key?(:'fully_qualified_name')
+ self.fully_qualified_name = attributes[:'fully_qualified_name']
+ end
+
+ if attributes.has_key?(:'identifier_perm')
+ self.identifier_perm = attributes[:'identifier_perm']
+ end
+
+ if attributes.has_key?(:'identifiers')
+ if (value = attributes[:'identifiers']).is_a?(Hash)
+ self.identifiers = value
+ end
+ end
+
+ if attributes.has_key?(:'indexed')
+ self.indexed = attributes[:'indexed']
+ end
+
+ if attributes.has_key?(:'is_cancellable')
+ self.is_cancellable = attributes[:'is_cancellable']
+ end
+
+ if attributes.has_key?(:'is_copyable')
+ self.is_copyable = attributes[:'is_copyable']
+ end
+
+ if attributes.has_key?(:'is_deleteable')
+ self.is_deleteable = attributes[:'is_deleteable']
+ end
+
+ if attributes.has_key?(:'is_downloadable')
+ self.is_downloadable = attributes[:'is_downloadable']
+ end
+
+ if attributes.has_key?(:'is_moveable')
+ self.is_moveable = attributes[:'is_moveable']
+ end
+
+ if attributes.has_key?(:'is_quarantinable')
+ self.is_quarantinable = attributes[:'is_quarantinable']
+ end
+
+ if attributes.has_key?(:'is_quarantined')
+ self.is_quarantined = attributes[:'is_quarantined']
+ end
+
+ if attributes.has_key?(:'is_resyncable')
+ self.is_resyncable = attributes[:'is_resyncable']
+ end
+
+ if attributes.has_key?(:'is_security_scannable')
+ self.is_security_scannable = attributes[:'is_security_scannable']
+ end
+
+ if attributes.has_key?(:'is_sync_awaiting')
+ self.is_sync_awaiting = attributes[:'is_sync_awaiting']
+ end
+
+ if attributes.has_key?(:'is_sync_completed')
+ self.is_sync_completed = attributes[:'is_sync_completed']
+ end
+
+ if attributes.has_key?(:'is_sync_failed')
+ self.is_sync_failed = attributes[:'is_sync_failed']
+ end
+
+ if attributes.has_key?(:'is_sync_in_flight')
+ self.is_sync_in_flight = attributes[:'is_sync_in_flight']
+ end
+
+ if attributes.has_key?(:'is_sync_in_progress')
+ self.is_sync_in_progress = attributes[:'is_sync_in_progress']
+ end
+
+ if attributes.has_key?(:'license')
+ self.license = attributes[:'license']
+ end
+
+ if attributes.has_key?(:'name')
+ self.name = attributes[:'name']
+ end
+
+ if attributes.has_key?(:'namespace')
+ self.namespace = attributes[:'namespace']
+ end
+
+ if attributes.has_key?(:'namespace_url')
+ self.namespace_url = attributes[:'namespace_url']
+ end
+
+ if attributes.has_key?(:'num_files')
+ self.num_files = attributes[:'num_files']
+ end
+
+ if attributes.has_key?(:'origin_repository')
+ self.origin_repository = attributes[:'origin_repository']
+ end
+
+ if attributes.has_key?(:'origin_repository_url')
+ self.origin_repository_url = attributes[:'origin_repository_url']
+ end
+
+ if attributes.has_key?(:'package_type')
+ self.package_type = attributes[:'package_type']
+ end
+
+ if attributes.has_key?(:'policy_violated')
+ self.policy_violated = attributes[:'policy_violated']
+ end
+
+ if attributes.has_key?(:'raw_license')
+ self.raw_license = attributes[:'raw_license']
+ end
+
+ if attributes.has_key?(:'release')
+ self.release = attributes[:'release']
+ end
+
+ if attributes.has_key?(:'repository')
+ self.repository = attributes[:'repository']
+ end
+
+ if attributes.has_key?(:'repository_url')
+ self.repository_url = attributes[:'repository_url']
+ end
+
+ if attributes.has_key?(:'security_scan_completed_at')
+ self.security_scan_completed_at = attributes[:'security_scan_completed_at']
+ end
+
+ if attributes.has_key?(:'security_scan_started_at')
+ self.security_scan_started_at = attributes[:'security_scan_started_at']
+ end
+
+ if attributes.has_key?(:'security_scan_status')
+ self.security_scan_status = attributes[:'security_scan_status']
+ else
+ self.security_scan_status = 'Awaiting Security Scan'
+ end
+
+ if attributes.has_key?(:'security_scan_status_updated_at')
+ self.security_scan_status_updated_at = attributes[:'security_scan_status_updated_at']
+ end
+
+ if attributes.has_key?(:'self_html_url')
+ self.self_html_url = attributes[:'self_html_url']
+ end
+
+ if attributes.has_key?(:'self_url')
+ self.self_url = attributes[:'self_url']
+ end
+
+ if attributes.has_key?(:'signature_url')
+ self.signature_url = attributes[:'signature_url']
+ end
+
+ if attributes.has_key?(:'size')
+ self.size = attributes[:'size']
+ end
+
+ if attributes.has_key?(:'slug')
+ self.slug = attributes[:'slug']
+ end
+
+ if attributes.has_key?(:'slug_perm')
+ self.slug_perm = attributes[:'slug_perm']
+ end
+
+ if attributes.has_key?(:'spdx_license')
+ self.spdx_license = attributes[:'spdx_license']
+ end
+
+ if attributes.has_key?(:'stage')
+ self.stage = attributes[:'stage']
+ end
+
+ if attributes.has_key?(:'stage_str')
+ self.stage_str = attributes[:'stage_str']
+ end
+
+ if attributes.has_key?(:'stage_updated_at')
+ self.stage_updated_at = attributes[:'stage_updated_at']
+ end
+
+ if attributes.has_key?(:'status')
+ self.status = attributes[:'status']
+ end
+
+ if attributes.has_key?(:'status_reason')
+ self.status_reason = attributes[:'status_reason']
+ end
+
+ if attributes.has_key?(:'status_str')
+ self.status_str = attributes[:'status_str']
+ end
+
+ if attributes.has_key?(:'status_updated_at')
+ self.status_updated_at = attributes[:'status_updated_at']
+ end
+
+ if attributes.has_key?(:'status_url')
+ self.status_url = attributes[:'status_url']
+ end
+
+ if attributes.has_key?(:'subtype')
+ self.subtype = attributes[:'subtype']
+ end
+
+ if attributes.has_key?(:'summary')
+ self.summary = attributes[:'summary']
+ end
+
+ if attributes.has_key?(:'sync_finished_at')
+ self.sync_finished_at = attributes[:'sync_finished_at']
+ end
+
+ if attributes.has_key?(:'sync_progress')
+ self.sync_progress = attributes[:'sync_progress']
+ end
+
+ if attributes.has_key?(:'tags_automatic')
+ self.tags_automatic = attributes[:'tags_automatic']
+ end
+
+ if attributes.has_key?(:'tags_immutable')
+ self.tags_immutable = attributes[:'tags_immutable']
+ end
+
+ if attributes.has_key?(:'type_display')
+ self.type_display = attributes[:'type_display']
+ end
+
+ if attributes.has_key?(:'uploaded_at')
+ self.uploaded_at = attributes[:'uploaded_at']
+ end
+
+ if attributes.has_key?(:'uploader')
+ self.uploader = attributes[:'uploader']
+ end
+
+ if attributes.has_key?(:'uploader_url')
+ self.uploader_url = attributes[:'uploader_url']
+ end
+
+ if attributes.has_key?(:'version')
+ self.version = attributes[:'version']
+ end
+
+ if attributes.has_key?(:'version_orig')
+ self.version_orig = attributes[:'version_orig']
+ end
+
+ if attributes.has_key?(:'vulnerability_scan_results_url')
+ self.vulnerability_scan_results_url = attributes[:'vulnerability_scan_results_url']
+ end
+ end
+
+ # Show invalid properties with the reasons. Usually used together with valid?
+ # @return Array for valid properties with the reasons
+ def list_invalid_properties
+ invalid_properties = Array.new
+ invalid_properties
+ end
+
+ # Check to see if the all the properties in the model are valid
+ # @return true if the model is valid
+ def valid?
+ security_scan_status_validator = EnumAttributeValidator.new('String', ['Awaiting Security Scan', 'Security Scanning in Progress', 'Scan Detected Vulnerabilities', 'Scan Detected No Vulnerabilities', 'Security Scanning Disabled', 'Security Scanning Failed', 'Security Scanning Skipped', 'Security Scanning Not Supported'])
+ return false unless security_scan_status_validator.valid?(@security_scan_status)
+ true
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] security_scan_status Object to be assigned
+ def security_scan_status=(security_scan_status)
+ validator = EnumAttributeValidator.new('String', ['Awaiting Security Scan', 'Security Scanning in Progress', 'Scan Detected Vulnerabilities', 'Scan Detected No Vulnerabilities', 'Security Scanning Disabled', 'Security Scanning Failed', 'Security Scanning Skipped', 'Security Scanning Not Supported'])
+ unless validator.valid?(security_scan_status)
+ fail ArgumentError, "invalid value for \"security_scan_status\", must be one of #{validator.allowable_values}."
+ end
+ @security_scan_status = security_scan_status
+ end
+
+ # Checks equality by comparing each attribute.
+ # @param [Object] Object to be compared
+ def ==(o)
+ return true if self.equal?(o)
+ self.class == o.class &&
+ architectures == o.architectures &&
+ cdn_url == o.cdn_url &&
+ checksum_md5 == o.checksum_md5 &&
+ checksum_sha1 == o.checksum_sha1 &&
+ checksum_sha256 == o.checksum_sha256 &&
+ checksum_sha512 == o.checksum_sha512 &&
+ dependencies_checksum_md5 == o.dependencies_checksum_md5 &&
+ dependencies_url == o.dependencies_url &&
+ description == o.description &&
+ display_name == o.display_name &&
+ distro == o.distro &&
+ distro_version == o.distro_version &&
+ downloads == o.downloads &&
+ epoch == o.epoch &&
+ extension == o.extension &&
+ filename == o.filename &&
+ files == o.files &&
+ format == o.format &&
+ format_url == o.format_url &&
+ freeable_storage == o.freeable_storage &&
+ fully_qualified_name == o.fully_qualified_name &&
+ identifier_perm == o.identifier_perm &&
+ identifiers == o.identifiers &&
+ indexed == o.indexed &&
+ is_cancellable == o.is_cancellable &&
+ is_copyable == o.is_copyable &&
+ is_deleteable == o.is_deleteable &&
+ is_downloadable == o.is_downloadable &&
+ is_moveable == o.is_moveable &&
+ is_quarantinable == o.is_quarantinable &&
+ is_quarantined == o.is_quarantined &&
+ is_resyncable == o.is_resyncable &&
+ is_security_scannable == o.is_security_scannable &&
+ is_sync_awaiting == o.is_sync_awaiting &&
+ is_sync_completed == o.is_sync_completed &&
+ is_sync_failed == o.is_sync_failed &&
+ is_sync_in_flight == o.is_sync_in_flight &&
+ is_sync_in_progress == o.is_sync_in_progress &&
+ license == o.license &&
+ name == o.name &&
+ namespace == o.namespace &&
+ namespace_url == o.namespace_url &&
+ num_files == o.num_files &&
+ origin_repository == o.origin_repository &&
+ origin_repository_url == o.origin_repository_url &&
+ package_type == o.package_type &&
+ policy_violated == o.policy_violated &&
+ raw_license == o.raw_license &&
+ release == o.release &&
+ repository == o.repository &&
+ repository_url == o.repository_url &&
+ security_scan_completed_at == o.security_scan_completed_at &&
+ security_scan_started_at == o.security_scan_started_at &&
+ security_scan_status == o.security_scan_status &&
+ security_scan_status_updated_at == o.security_scan_status_updated_at &&
+ self_html_url == o.self_html_url &&
+ self_url == o.self_url &&
+ signature_url == o.signature_url &&
+ size == o.size &&
+ slug == o.slug &&
+ slug_perm == o.slug_perm &&
+ spdx_license == o.spdx_license &&
+ stage == o.stage &&
+ stage_str == o.stage_str &&
+ stage_updated_at == o.stage_updated_at &&
+ status == o.status &&
+ status_reason == o.status_reason &&
+ status_str == o.status_str &&
+ status_updated_at == o.status_updated_at &&
+ status_url == o.status_url &&
+ subtype == o.subtype &&
+ summary == o.summary &&
+ sync_finished_at == o.sync_finished_at &&
+ sync_progress == o.sync_progress &&
+ tags_automatic == o.tags_automatic &&
+ tags_immutable == o.tags_immutable &&
+ type_display == o.type_display &&
+ uploaded_at == o.uploaded_at &&
+ uploader == o.uploader &&
+ uploader_url == o.uploader_url &&
+ version == o.version &&
+ version_orig == o.version_orig &&
+ vulnerability_scan_results_url == o.vulnerability_scan_results_url
+ end
+
+ # @see the `==` method
+ # @param [Object] Object to be compared
+ def eql?(o)
+ self == o
+ end
+
+ # Calculates hash code according to all attributes.
+ # @return [Fixnum] Hash code
+ def hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ end
+
+ # Builds the object from hash
+ # @param [Hash] attributes Model attributes in the form of hash
+ # @return [Object] Returns the model itself
+ def build_from_hash(attributes)
+ return nil unless attributes.is_a?(Hash)
+ self.class.swagger_types.each_pair do |key, type|
+ if type =~ /\AArray<(.*)>/i
+ # check to ensure the input is an array given that the attribute
+ # is documented as an array but the input is not
+ if attributes[self.class.attribute_map[key]].is_a?(Array)
+ self.send("#{key}=", attributes[self.class.attribute_map[key]].map { |v| _deserialize($1, v) })
+ end
+ elsif !attributes[self.class.attribute_map[key]].nil?
+ self.send("#{key}=", _deserialize(type, attributes[self.class.attribute_map[key]]))
+ end # or else data not found in attributes(hash), not an issue as the data can be optional
+ end
+
+ self
+ end
+
+ # Deserializes the data based on type
+ # @param string type Data type
+ # @param string value Value to be deserialized
+ # @return [Object] Deserialized data
+ def _deserialize(type, value)
+ case type.to_sym
+ when :DateTime
+ DateTime.parse(value)
+ when :Date
+ Date.parse(value)
+ when :String
+ value.to_s
+ when :Integer
+ value.to_i
+ when :Float
+ value.to_f
+ when :BOOLEAN
+ if value.to_s =~ /\A(true|t|yes|y|1)\z/i
+ true
+ else
+ false
+ end
+ when :Object
+ # generic object (usually a Hash), return directly
+ value
+ when /\AArray<(?<inner_type>.+)>\z/
+ inner_type = Regexp.last_match[:inner_type]
+ value.map { |v| _deserialize(inner_type, v) }
+ when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
+ k_type = Regexp.last_match[:k_type]
+ v_type = Regexp.last_match[:v_type]
+ {}.tap do |hash|
+ value.each do |k, v|
+ hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
+ end
+ end
+ else # model
+ temp_model = CloudsmithApi.const_get(type).new
+ temp_model.build_from_hash(value)
+ end
+ end
+
+ # Returns the string representation of the object
+ # @return [String] String presentation of the object
+ def to_s
+ to_hash.to_s
+ end
+
+ # to_body is an alias to to_hash (backward compatibility)
+ # @return [Hash] Returns the object in the form of hash
+ def to_body
+ to_hash
+ end
+
+ # Returns the object in the form of hash
+ # @return [Hash] Returns the object in the form of hash
+ def to_hash
+ hash = {}
+ self.class.attribute_map.each_pair do |attr, param|
+ value = self.send(attr)
+ next if value.nil?
+ hash[param] = _to_hash(value)
+ end
+ hash
+ end
+
+ # Outputs non-array value in the form of hash
+ # For object, use to_hash. Otherwise, just return the value
+ # @param [Object] value Any valid value
+ # @return [Hash] Returns the value in the form of hash
+ def _to_hash(value)
+ if value.is_a?(Array)
+ value.compact.map { |v| _to_hash(v) }
+ elsif value.is_a?(Hash)
+ {}.tap do |hash|
+ value.each { |k, v| hash[k] = _to_hash(v) }
+ end
+ elsif value.respond_to? :to_hash
+ value.to_hash
+ else
+ value
+ end
+ end
+
+end
+end
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/generic_package_upload_request.rb b/bindings/ruby/src/lib/cloudsmith-api/models/generic_package_upload_request.rb
new file mode 100644
index 00000000..0c2929f4
--- /dev/null
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/generic_package_upload_request.rb
@@ -0,0 +1,245 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'date'
+
+module CloudsmithApi
+class GenericPackageUploadRequest
+ # The full filepath of the package including filename.
+ attr_accessor :filepath
+
+ # The name of this package.
+ attr_accessor :name
+
+ # The primary file for the package.
+ attr_accessor :package_file
+
+ # If true, the uploaded package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate.
+ attr_accessor :republish
+
+ # A comma-separated values list of tags to add to the package.
+ attr_accessor :tags
+
+ # The raw version for this package.
+ attr_accessor :version
+
+ # Attribute mapping from ruby-style variable name to JSON key.
+ def self.attribute_map
+ {
+ :'filepath' => :'filepath',
+ :'name' => :'name',
+ :'package_file' => :'package_file',
+ :'republish' => :'republish',
+ :'tags' => :'tags',
+ :'version' => :'version'
+ }
+ end
+
+ # Attribute type mapping.
+ def self.swagger_types
+ {
+ :'filepath' => :'String',
+ :'name' => :'String',
+ :'package_file' => :'String',
+ :'republish' => :'BOOLEAN',
+ :'tags' => :'String',
+ :'version' => :'String'
+ }
+ end
+
+ # Initializes the object
+ # @param [Hash] attributes Model attributes in the form of hash
+ def initialize(attributes = {})
+ return unless attributes.is_a?(Hash)
+
+ # convert string to symbol for hash key
+ attributes = attributes.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
+
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
+ if attributes.has_key?(:'name')
+ self.name = attributes[:'name']
+ end
+
+ if attributes.has_key?(:'package_file')
+ self.package_file = attributes[:'package_file']
+ end
+
+ if attributes.has_key?(:'republish')
+ self.republish = attributes[:'republish']
+ end
+
+ if attributes.has_key?(:'tags')
+ self.tags = attributes[:'tags']
+ end
+
+ if attributes.has_key?(:'version')
+ self.version = attributes[:'version']
+ end
+ end
+
+ # Show invalid properties with the reasons. Usually used together with valid?
+ # @return Array of invalid properties with the reasons
+ def list_invalid_properties
+ invalid_properties = Array.new
+ if @filepath.nil?
+ invalid_properties.push('invalid value for "filepath", filepath cannot be nil.')
+ end
+
+ if @package_file.nil?
+ invalid_properties.push('invalid value for "package_file", package_file cannot be nil.')
+ end
+
+ invalid_properties
+ end
+
+ # Check to see if all the properties in the model are valid
+ # @return true if the model is valid
+ def valid?
+ return false if @filepath.nil?
+ return false if @package_file.nil?
+ true
+ end
+
+ # Checks equality by comparing each attribute.
+ # @param [Object] Object to be compared
+ def ==(o)
+ return true if self.equal?(o)
+ self.class == o.class &&
+ filepath == o.filepath &&
+ name == o.name &&
+ package_file == o.package_file &&
+ republish == o.republish &&
+ tags == o.tags &&
+ version == o.version
+ end
+
+ # @see the `==` method
+ # @param [Object] Object to be compared
+ def eql?(o)
+ self == o
+ end
+
+ # Calculates hash code according to all attributes.
+ # @return [Fixnum] Hash code
+ def hash
+ [filepath, name, package_file, republish, tags, version].hash
+ end
+
+ # Builds the object from hash
+ # @param [Hash] attributes Model attributes in the form of hash
+ # @return [Object] Returns the model itself
+ def build_from_hash(attributes)
+ return nil unless attributes.is_a?(Hash)
+ self.class.swagger_types.each_pair do |key, type|
+ if type =~ /\AArray<(.*)>/i
+ # check to ensure the input is an array given that the attribute
+ # is documented as an array but the input is not
+ if attributes[self.class.attribute_map[key]].is_a?(Array)
+ self.send("#{key}=", attributes[self.class.attribute_map[key]].map { |v| _deserialize($1, v) })
+ end
+ elsif !attributes[self.class.attribute_map[key]].nil?
+ self.send("#{key}=", _deserialize(type, attributes[self.class.attribute_map[key]]))
+ end # or else data not found in attributes(hash), not an issue as the data can be optional
+ end
+
+ self
+ end
+
+ # Deserializes the data based on type
+ # @param string type Data type
+ # @param string value Value to be deserialized
+ # @return [Object] Deserialized data
+ def _deserialize(type, value)
+ case type.to_sym
+ when :DateTime
+ DateTime.parse(value)
+ when :Date
+ Date.parse(value)
+ when :String
+ value.to_s
+ when :Integer
+ value.to_i
+ when :Float
+ value.to_f
+ when :BOOLEAN
+ if value.to_s =~ /\A(true|t|yes|y|1)\z/i
+ true
+ else
+ false
+ end
+ when :Object
+ # generic object (usually a Hash), return directly
+ value
+ when /\AArray<(?<inner_type>.+)>\z/
+ inner_type = Regexp.last_match[:inner_type]
+ value.map { |v| _deserialize(inner_type, v) }
+ when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
+ k_type = Regexp.last_match[:k_type]
+ v_type = Regexp.last_match[:v_type]
+ {}.tap do |hash|
+ value.each do |k, v|
+ hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
+ end
+ end
+ else # model
+ temp_model = CloudsmithApi.const_get(type).new
+ temp_model.build_from_hash(value)
+ end
+ end
+
+ # Returns the string representation of the object
+ # @return [String] String presentation of the object
+ def to_s
+ to_hash.to_s
+ end
+
+ # to_body is an alias to to_hash (backward compatibility)
+ # @return [Hash] Returns the object in the form of hash
+ def to_body
+ to_hash
+ end
+
+ # Returns the object in the form of hash
+ # @return [Hash] Returns the object in the form of hash
+ def to_hash
+ hash = {}
+ self.class.attribute_map.each_pair do |attr, param|
+ value = self.send(attr)
+ next if value.nil?
+ hash[param] = _to_hash(value)
+ end
+ hash
+ end
+
+ # Outputs non-array value in the form of hash
+ # For object, use to_hash. Otherwise, just return the value
+ # @param [Object] value Any valid value
+ # @return [Hash] Returns the value in the form of hash
+ def _to_hash(value)
+ if value.is_a?(Array)
+ value.compact.map { |v| _to_hash(v) }
+ elsif value.is_a?(Hash)
+ {}.tap do |hash|
+ value.each { |k, v| hash[k] = _to_hash(v) }
+ end
+ elsif value.respond_to? :to_hash
+ value.to_hash
+ else
+ value
+ end
+ end
+
+end
+end
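
For readers wiring this model up by hand, here is a minimal, hedged sketch of how the new `GenericPackageUploadRequest` is typically populated before being sent to the Generic package upload endpoint. The file path, package name, version, and the top-level `require` are placeholder assumptions, and `package_file` is assumed to be the identifier produced by a prior file-upload step.

```ruby
require 'cloudsmith-api'  # assumed entry point for the gem built from bindings/ruby/src

request = CloudsmithApi::GenericPackageUploadRequest.new(
  filepath: 'bin/utils/tool.tar.gz',  # required: full filepath including the filename
  package_file: 'tool.tar.gz',        # required: the primary file for the package
  name: 'tool',                       # optional package name
  version: '1.0.0',                   # optional raw version string
  tags: 'latest,stable',              # comma-separated list of tags
  republish: false                    # flag duplicates instead of overwriting them
)

request.valid?                   # => true once filepath and package_file are set
request.list_invalid_properties  # => [] when the required attributes are present
request.to_body                  # => hash keyed by the JSON attribute names
```
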
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream.rb b/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream.rb
new file mode 100644
index 00000000..a1dbeea5
--- /dev/null
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream.rb
@@ -0,0 +1,503 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'date'
+
+module CloudsmithApi
+class GenericUpstream
+ # The authentication mode to use when accessing this upstream.
+ attr_accessor :auth_mode
+
+ # Secret to provide with requests to upstream.
+ attr_accessor :auth_secret
+
+ # Username to provide with requests to upstream.
+ attr_accessor :auth_username
+
+ attr_accessor :available
+
+ attr_accessor :can_reindex
+
+ # The datetime the upstream source was created.
+ attr_accessor :created_at
+
+ attr_accessor :disable_reason
+
+ # Human-readable explanation of why this upstream is disabled
+ attr_accessor :disable_reason_text
+
+ # The key for extra header #1 to send to upstream.
+ attr_accessor :extra_header_1
+
+ # The key for extra header #2 to send to upstream.
+ attr_accessor :extra_header_2
+
+ # The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ attr_accessor :extra_value_1
+
+ # The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ attr_accessor :extra_value_2
+
+ attr_accessor :has_failed_signature_verification
+
+ # The number of packages available in this upstream source
+ attr_accessor :index_package_count
+
+ # The current indexing status of this upstream source
+ attr_accessor :index_status
+
+ # Whether or not this upstream is active and ready for requests.
+ attr_accessor :is_active
+
+ # The last time this upstream source was indexed
+ attr_accessor :last_indexed
+
+ # The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ attr_accessor :mode
+
+ # A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+ attr_accessor :name
+
+ # When true, this upstream source is pending validation.
+ attr_accessor :pending_validation
+
+ # Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+ attr_accessor :priority
+
+ attr_accessor :slug_perm
+
+ attr_accessor :updated_at
+
+ # A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+ attr_accessor :upstream_prefix
+
+ # The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+ attr_accessor :upstream_url
+
+ # If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+ attr_accessor :verify_ssl
+
+ class EnumAttributeValidator
+ attr_reader :datatype
+ attr_reader :allowable_values
+
+ def initialize(datatype, allowable_values)
+ @allowable_values = allowable_values.map do |value|
+ case datatype.to_s
+ when /Integer/i
+ value.to_i
+ when /Float/i
+ value.to_f
+ else
+ value
+ end
+ end
+ end
+
+ def valid?(value)
+ !value || allowable_values.include?(value)
+ end
+ end
+
+ # Attribute mapping from ruby-style variable name to JSON key.
+ def self.attribute_map
+ {
+ :'auth_mode' => :'auth_mode',
+ :'auth_secret' => :'auth_secret',
+ :'auth_username' => :'auth_username',
+ :'available' => :'available',
+ :'can_reindex' => :'can_reindex',
+ :'created_at' => :'created_at',
+ :'disable_reason' => :'disable_reason',
+ :'disable_reason_text' => :'disable_reason_text',
+ :'extra_header_1' => :'extra_header_1',
+ :'extra_header_2' => :'extra_header_2',
+ :'extra_value_1' => :'extra_value_1',
+ :'extra_value_2' => :'extra_value_2',
+ :'has_failed_signature_verification' => :'has_failed_signature_verification',
+ :'index_package_count' => :'index_package_count',
+ :'index_status' => :'index_status',
+ :'is_active' => :'is_active',
+ :'last_indexed' => :'last_indexed',
+ :'mode' => :'mode',
+ :'name' => :'name',
+ :'pending_validation' => :'pending_validation',
+ :'priority' => :'priority',
+ :'slug_perm' => :'slug_perm',
+ :'updated_at' => :'updated_at',
+ :'upstream_prefix' => :'upstream_prefix',
+ :'upstream_url' => :'upstream_url',
+ :'verify_ssl' => :'verify_ssl'
+ }
+ end
+
+ # Attribute type mapping.
+ def self.swagger_types
+ {
+ :'auth_mode' => :'String',
+ :'auth_secret' => :'String',
+ :'auth_username' => :'String',
+ :'available' => :'String',
+ :'can_reindex' => :'String',
+ :'created_at' => :'DateTime',
+ :'disable_reason' => :'String',
+ :'disable_reason_text' => :'String',
+ :'extra_header_1' => :'String',
+ :'extra_header_2' => :'String',
+ :'extra_value_1' => :'String',
+ :'extra_value_2' => :'String',
+ :'has_failed_signature_verification' => :'String',
+ :'index_package_count' => :'String',
+ :'index_status' => :'String',
+ :'is_active' => :'BOOLEAN',
+ :'last_indexed' => :'String',
+ :'mode' => :'String',
+ :'name' => :'String',
+ :'pending_validation' => :'BOOLEAN',
+ :'priority' => :'Integer',
+ :'slug_perm' => :'String',
+ :'updated_at' => :'DateTime',
+ :'upstream_prefix' => :'String',
+ :'upstream_url' => :'String',
+ :'verify_ssl' => :'BOOLEAN'
+ }
+ end
+
+ # Initializes the object
+ # @param [Hash] attributes Model attributes in the form of hash
+ def initialize(attributes = {})
+ return unless attributes.is_a?(Hash)
+
+ # convert string to symbol for hash key
+ attributes = attributes.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
+
+ if attributes.has_key?(:'auth_mode')
+ self.auth_mode = attributes[:'auth_mode']
+ else
+ self.auth_mode = 'None'
+ end
+
+ if attributes.has_key?(:'auth_secret')
+ self.auth_secret = attributes[:'auth_secret']
+ end
+
+ if attributes.has_key?(:'auth_username')
+ self.auth_username = attributes[:'auth_username']
+ end
+
+ if attributes.has_key?(:'available')
+ self.available = attributes[:'available']
+ end
+
+ if attributes.has_key?(:'can_reindex')
+ self.can_reindex = attributes[:'can_reindex']
+ end
+
+ if attributes.has_key?(:'created_at')
+ self.created_at = attributes[:'created_at']
+ end
+
+ if attributes.has_key?(:'disable_reason')
+ self.disable_reason = attributes[:'disable_reason']
+ else
+ self.disable_reason = 'N/A'
+ end
+
+ if attributes.has_key?(:'disable_reason_text')
+ self.disable_reason_text = attributes[:'disable_reason_text']
+ end
+
+ if attributes.has_key?(:'extra_header_1')
+ self.extra_header_1 = attributes[:'extra_header_1']
+ end
+
+ if attributes.has_key?(:'extra_header_2')
+ self.extra_header_2 = attributes[:'extra_header_2']
+ end
+
+ if attributes.has_key?(:'extra_value_1')
+ self.extra_value_1 = attributes[:'extra_value_1']
+ end
+
+ if attributes.has_key?(:'extra_value_2')
+ self.extra_value_2 = attributes[:'extra_value_2']
+ end
+
+ if attributes.has_key?(:'has_failed_signature_verification')
+ self.has_failed_signature_verification = attributes[:'has_failed_signature_verification']
+ end
+
+ if attributes.has_key?(:'index_package_count')
+ self.index_package_count = attributes[:'index_package_count']
+ end
+
+ if attributes.has_key?(:'index_status')
+ self.index_status = attributes[:'index_status']
+ end
+
+ if attributes.has_key?(:'is_active')
+ self.is_active = attributes[:'is_active']
+ end
+
+ if attributes.has_key?(:'last_indexed')
+ self.last_indexed = attributes[:'last_indexed']
+ end
+
+ if attributes.has_key?(:'mode')
+ self.mode = attributes[:'mode']
+ else
+ self.mode = 'Proxy Only'
+ end
+
+ if attributes.has_key?(:'name')
+ self.name = attributes[:'name']
+ end
+
+ if attributes.has_key?(:'pending_validation')
+ self.pending_validation = attributes[:'pending_validation']
+ end
+
+ if attributes.has_key?(:'priority')
+ self.priority = attributes[:'priority']
+ end
+
+ if attributes.has_key?(:'slug_perm')
+ self.slug_perm = attributes[:'slug_perm']
+ end
+
+ if attributes.has_key?(:'updated_at')
+ self.updated_at = attributes[:'updated_at']
+ end
+
+ if attributes.has_key?(:'upstream_prefix')
+ self.upstream_prefix = attributes[:'upstream_prefix']
+ end
+
+ if attributes.has_key?(:'upstream_url')
+ self.upstream_url = attributes[:'upstream_url']
+ end
+
+ if attributes.has_key?(:'verify_ssl')
+ self.verify_ssl = attributes[:'verify_ssl']
+ end
+ end
+
+ # Show invalid properties with the reasons. Usually used together with valid?
+ # @return Array of invalid properties with the reasons
+ def list_invalid_properties
+ invalid_properties = Array.new
+ if @name.nil?
+ invalid_properties.push('invalid value for "name", name cannot be nil.')
+ end
+
+ if @upstream_url.nil?
+ invalid_properties.push('invalid value for "upstream_url", upstream_url cannot be nil.')
+ end
+
+ invalid_properties
+ end
+
+ # Check to see if all the properties in the model are valid
+ # @return true if the model is valid
+ def valid?
+ auth_mode_validator = EnumAttributeValidator.new('String', ['None', 'Username and Password', 'Token'])
+ return false unless auth_mode_validator.valid?(@auth_mode)
+ disable_reason_validator = EnumAttributeValidator.new('String', ['N/A', 'Upstream points to its own repository', 'Missing upstream source', 'Upstream was disabled by request of user'])
+ return false unless disable_reason_validator.valid?(@disable_reason)
+ mode_validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy'])
+ return false unless mode_validator.valid?(@mode)
+ return false if @name.nil?
+ return false if @upstream_url.nil?
+ true
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] auth_mode Object to be assigned
+ def auth_mode=(auth_mode)
+ validator = EnumAttributeValidator.new('String', ['None', 'Username and Password', 'Token'])
+ unless validator.valid?(auth_mode)
+ fail ArgumentError, 'invalid value for "auth_mode", must be one of #{validator.allowable_values}.'
+ end
+ @auth_mode = auth_mode
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] disable_reason Object to be assigned
+ def disable_reason=(disable_reason)
+ validator = EnumAttributeValidator.new('String', ['N/A', 'Upstream points to its own repository', 'Missing upstream source', 'Upstream was disabled by request of user'])
+ unless validator.valid?(disable_reason)
+ fail ArgumentError, 'invalid value for "disable_reason", must be one of #{validator.allowable_values}.'
+ end
+ @disable_reason = disable_reason
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] mode Object to be assigned
+ def mode=(mode)
+ validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy'])
+ unless validator.valid?(mode)
+ fail ArgumentError, 'invalid value for "mode", must be one of #{validator.allowable_values}.'
+ end
+ @mode = mode
+ end
+
+ # Checks equality by comparing each attribute.
+ # @param [Object] Object to be compared
+ def ==(o)
+ return true if self.equal?(o)
+ self.class == o.class &&
+ auth_mode == o.auth_mode &&
+ auth_secret == o.auth_secret &&
+ auth_username == o.auth_username &&
+ available == o.available &&
+ can_reindex == o.can_reindex &&
+ created_at == o.created_at &&
+ disable_reason == o.disable_reason &&
+ disable_reason_text == o.disable_reason_text &&
+ extra_header_1 == o.extra_header_1 &&
+ extra_header_2 == o.extra_header_2 &&
+ extra_value_1 == o.extra_value_1 &&
+ extra_value_2 == o.extra_value_2 &&
+ has_failed_signature_verification == o.has_failed_signature_verification &&
+ index_package_count == o.index_package_count &&
+ index_status == o.index_status &&
+ is_active == o.is_active &&
+ last_indexed == o.last_indexed &&
+ mode == o.mode &&
+ name == o.name &&
+ pending_validation == o.pending_validation &&
+ priority == o.priority &&
+ slug_perm == o.slug_perm &&
+ updated_at == o.updated_at &&
+ upstream_prefix == o.upstream_prefix &&
+ upstream_url == o.upstream_url &&
+ verify_ssl == o.verify_ssl
+ end
+
+ # @see the `==` method
+ # @param [Object] Object to be compared
+ def eql?(o)
+ self == o
+ end
+
+ # Calculates hash code according to all attributes.
+ # @return [Fixnum] Hash code
+ def hash
+ [auth_mode, auth_secret, auth_username, available, can_reindex, created_at, disable_reason, disable_reason_text, extra_header_1, extra_header_2, extra_value_1, extra_value_2, has_failed_signature_verification, index_package_count, index_status, is_active, last_indexed, mode, name, pending_validation, priority, slug_perm, updated_at, upstream_prefix, upstream_url, verify_ssl].hash
+ end
+
+ # Builds the object from hash
+ # @param [Hash] attributes Model attributes in the form of hash
+ # @return [Object] Returns the model itself
+ def build_from_hash(attributes)
+ return nil unless attributes.is_a?(Hash)
+ self.class.swagger_types.each_pair do |key, type|
+ if type =~ /\AArray<(.*)>/i
+ # check to ensure the input is an array given that the attribute
+ # is documented as an array but the input is not
+ if attributes[self.class.attribute_map[key]].is_a?(Array)
+ self.send("#{key}=", attributes[self.class.attribute_map[key]].map { |v| _deserialize($1, v) })
+ end
+ elsif !attributes[self.class.attribute_map[key]].nil?
+ self.send("#{key}=", _deserialize(type, attributes[self.class.attribute_map[key]]))
+ end # or else data not found in attributes(hash), not an issue as the data can be optional
+ end
+
+ self
+ end
+
+ # Deserializes the data based on type
+ # @param string type Data type
+ # @param string value Value to be deserialized
+ # @return [Object] Deserialized data
+ def _deserialize(type, value)
+ case type.to_sym
+ when :DateTime
+ DateTime.parse(value)
+ when :Date
+ Date.parse(value)
+ when :String
+ value.to_s
+ when :Integer
+ value.to_i
+ when :Float
+ value.to_f
+ when :BOOLEAN
+ if value.to_s =~ /\A(true|t|yes|y|1)\z/i
+ true
+ else
+ false
+ end
+ when :Object
+ # generic object (usually a Hash), return directly
+ value
+ when /\AArray<(?<inner_type>.+)>\z/
+ inner_type = Regexp.last_match[:inner_type]
+ value.map { |v| _deserialize(inner_type, v) }
+ when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
+ k_type = Regexp.last_match[:k_type]
+ v_type = Regexp.last_match[:v_type]
+ {}.tap do |hash|
+ value.each do |k, v|
+ hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
+ end
+ end
+ else # model
+ temp_model = CloudsmithApi.const_get(type).new
+ temp_model.build_from_hash(value)
+ end
+ end
+
+ # Returns the string representation of the object
+ # @return [String] String presentation of the object
+ def to_s
+ to_hash.to_s
+ end
+
+ # to_body is an alias to to_hash (backward compatibility)
+ # @return [Hash] Returns the object in the form of hash
+ def to_body
+ to_hash
+ end
+
+ # Returns the object in the form of hash
+ # @return [Hash] Returns the object in the form of hash
+ def to_hash
+ hash = {}
+ self.class.attribute_map.each_pair do |attr, param|
+ value = self.send(attr)
+ next if value.nil?
+ hash[param] = _to_hash(value)
+ end
+ hash
+ end
+
+ # Outputs non-array value in the form of hash
+ # For object, use to_hash. Otherwise, just return the value
+ # @param [Object] value Any valid value
+ # @return [Hash] Returns the value in the form of hash
+ def _to_hash(value)
+ if value.is_a?(Array)
+ value.compact.map { |v| _to_hash(v) }
+ elsif value.is_a?(Hash)
+ {}.tap do |hash|
+ value.each { |k, v| hash[k] = _to_hash(v) }
+ end
+ elsif value.respond_to? :to_hash
+ value.to_hash
+ else
+ value
+ end
+ end
+
+end
+end
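
As a hedged illustration of the read model above: `build_from_hash` walks `swagger_types`, so it expects the symbol-keyed hash that the generated `ApiClient` produces, and the enum writers (`auth_mode=`, `disable_reason=`, `mode=`) reject anything outside their allowed sets. The field values below are placeholders.

```ruby
upstream = CloudsmithApi::GenericUpstream.new.build_from_hash(
  auth_mode: 'None',
  mode: 'Cache and Proxy',
  name: 'Example file server',
  upstream_prefix: 'example',                  # routes prefixed requests to this source
  upstream_url: 'https://files.example.com/',  # fully qualified root URL
  verify_ssl: true,
  priority: 1
)

upstream.valid?               # => true: name and upstream_url are set, enums are allowed
upstream.mode = 'Cache Only'  # raises ArgumentError: Generic upstreams only accept
                              # 'Proxy Only' or 'Cache and Proxy'
```
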
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream_request.rb b/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream_request.rb
new file mode 100644
index 00000000..47eba5d0
--- /dev/null
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream_request.rb
@@ -0,0 +1,375 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'date'
+
+module CloudsmithApi
+class GenericUpstreamRequest
+ # The authentication mode to use when accessing this upstream.
+ attr_accessor :auth_mode
+
+ # Secret to provide with requests to upstream.
+ attr_accessor :auth_secret
+
+ # Username to provide with requests to upstream.
+ attr_accessor :auth_username
+
+ # The key for extra header #1 to send to upstream.
+ attr_accessor :extra_header_1
+
+ # The key for extra header #2 to send to upstream.
+ attr_accessor :extra_header_2
+
+ # The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ attr_accessor :extra_value_1
+
+ # The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ attr_accessor :extra_value_2
+
+ # Whether or not this upstream is active and ready for requests.
+ attr_accessor :is_active
+
+ # The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ attr_accessor :mode
+
+ # A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+ attr_accessor :name
+
+ # Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+ attr_accessor :priority
+
+ # A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+ attr_accessor :upstream_prefix
+
+ # The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+ attr_accessor :upstream_url
+
+ # If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+ attr_accessor :verify_ssl
+
+ class EnumAttributeValidator
+ attr_reader :datatype
+ attr_reader :allowable_values
+
+ def initialize(datatype, allowable_values)
+ @allowable_values = allowable_values.map do |value|
+ case datatype.to_s
+ when /Integer/i
+ value.to_i
+ when /Float/i
+ value.to_f
+ else
+ value
+ end
+ end
+ end
+
+ def valid?(value)
+ !value || allowable_values.include?(value)
+ end
+ end
+
+ # Attribute mapping from ruby-style variable name to JSON key.
+ def self.attribute_map
+ {
+ :'auth_mode' => :'auth_mode',
+ :'auth_secret' => :'auth_secret',
+ :'auth_username' => :'auth_username',
+ :'extra_header_1' => :'extra_header_1',
+ :'extra_header_2' => :'extra_header_2',
+ :'extra_value_1' => :'extra_value_1',
+ :'extra_value_2' => :'extra_value_2',
+ :'is_active' => :'is_active',
+ :'mode' => :'mode',
+ :'name' => :'name',
+ :'priority' => :'priority',
+ :'upstream_prefix' => :'upstream_prefix',
+ :'upstream_url' => :'upstream_url',
+ :'verify_ssl' => :'verify_ssl'
+ }
+ end
+
+ # Attribute type mapping.
+ def self.swagger_types
+ {
+ :'auth_mode' => :'String',
+ :'auth_secret' => :'String',
+ :'auth_username' => :'String',
+ :'extra_header_1' => :'String',
+ :'extra_header_2' => :'String',
+ :'extra_value_1' => :'String',
+ :'extra_value_2' => :'String',
+ :'is_active' => :'BOOLEAN',
+ :'mode' => :'String',
+ :'name' => :'String',
+ :'priority' => :'Integer',
+ :'upstream_prefix' => :'String',
+ :'upstream_url' => :'String',
+ :'verify_ssl' => :'BOOLEAN'
+ }
+ end
+
+ # Initializes the object
+ # @param [Hash] attributes Model attributes in the form of hash
+ def initialize(attributes = {})
+ return unless attributes.is_a?(Hash)
+
+ # convert string to symbol for hash key
+ attributes = attributes.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
+
+ if attributes.has_key?(:'auth_mode')
+ self.auth_mode = attributes[:'auth_mode']
+ else
+ self.auth_mode = 'None'
+ end
+
+ if attributes.has_key?(:'auth_secret')
+ self.auth_secret = attributes[:'auth_secret']
+ end
+
+ if attributes.has_key?(:'auth_username')
+ self.auth_username = attributes[:'auth_username']
+ end
+
+ if attributes.has_key?(:'extra_header_1')
+ self.extra_header_1 = attributes[:'extra_header_1']
+ end
+
+ if attributes.has_key?(:'extra_header_2')
+ self.extra_header_2 = attributes[:'extra_header_2']
+ end
+
+ if attributes.has_key?(:'extra_value_1')
+ self.extra_value_1 = attributes[:'extra_value_1']
+ end
+
+ if attributes.has_key?(:'extra_value_2')
+ self.extra_value_2 = attributes[:'extra_value_2']
+ end
+
+ if attributes.has_key?(:'is_active')
+ self.is_active = attributes[:'is_active']
+ end
+
+ if attributes.has_key?(:'mode')
+ self.mode = attributes[:'mode']
+ else
+ self.mode = 'Proxy Only'
+ end
+
+ if attributes.has_key?(:'name')
+ self.name = attributes[:'name']
+ end
+
+ if attributes.has_key?(:'priority')
+ self.priority = attributes[:'priority']
+ end
+
+ if attributes.has_key?(:'upstream_prefix')
+ self.upstream_prefix = attributes[:'upstream_prefix']
+ end
+
+ if attributes.has_key?(:'upstream_url')
+ self.upstream_url = attributes[:'upstream_url']
+ end
+
+ if attributes.has_key?(:'verify_ssl')
+ self.verify_ssl = attributes[:'verify_ssl']
+ end
+ end
+
+ # Show invalid properties with the reasons. Usually used together with valid?
+ # @return Array of invalid properties with the reasons
+ def list_invalid_properties
+ invalid_properties = Array.new
+ if @name.nil?
+ invalid_properties.push('invalid value for "name", name cannot be nil.')
+ end
+
+ if @upstream_url.nil?
+ invalid_properties.push('invalid value for "upstream_url", upstream_url cannot be nil.')
+ end
+
+ invalid_properties
+ end
+
+ # Check to see if all the properties in the model are valid
+ # @return true if the model is valid
+ def valid?
+ auth_mode_validator = EnumAttributeValidator.new('String', ['None', 'Username and Password', 'Token'])
+ return false unless auth_mode_validator.valid?(@auth_mode)
+ mode_validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy'])
+ return false unless mode_validator.valid?(@mode)
+ return false if @name.nil?
+ return false if @upstream_url.nil?
+ true
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] auth_mode Object to be assigned
+ def auth_mode=(auth_mode)
+ validator = EnumAttributeValidator.new('String', ['None', 'Username and Password', 'Token'])
+ unless validator.valid?(auth_mode)
+ fail ArgumentError, 'invalid value for "auth_mode", must be one of #{validator.allowable_values}.'
+ end
+ @auth_mode = auth_mode
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] mode Object to be assigned
+ def mode=(mode)
+ validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy'])
+ unless validator.valid?(mode)
+ fail ArgumentError, 'invalid value for "mode", must be one of #{validator.allowable_values}.'
+ end
+ @mode = mode
+ end
+
+ # Checks equality by comparing each attribute.
+ # @param [Object] Object to be compared
+ def ==(o)
+ return true if self.equal?(o)
+ self.class == o.class &&
+ auth_mode == o.auth_mode &&
+ auth_secret == o.auth_secret &&
+ auth_username == o.auth_username &&
+ extra_header_1 == o.extra_header_1 &&
+ extra_header_2 == o.extra_header_2 &&
+ extra_value_1 == o.extra_value_1 &&
+ extra_value_2 == o.extra_value_2 &&
+ is_active == o.is_active &&
+ mode == o.mode &&
+ name == o.name &&
+ priority == o.priority &&
+ upstream_prefix == o.upstream_prefix &&
+ upstream_url == o.upstream_url &&
+ verify_ssl == o.verify_ssl
+ end
+
+ # @see the `==` method
+ # @param [Object] Object to be compared
+ def eql?(o)
+ self == o
+ end
+
+ # Calculates hash code according to all attributes.
+ # @return [Fixnum] Hash code
+ def hash
+ [auth_mode, auth_secret, auth_username, extra_header_1, extra_header_2, extra_value_1, extra_value_2, is_active, mode, name, priority, upstream_prefix, upstream_url, verify_ssl].hash
+ end
+
+ # Builds the object from hash
+ # @param [Hash] attributes Model attributes in the form of hash
+ # @return [Object] Returns the model itself
+ def build_from_hash(attributes)
+ return nil unless attributes.is_a?(Hash)
+ self.class.swagger_types.each_pair do |key, type|
+ if type =~ /\AArray<(.*)>/i
+ # check to ensure the input is an array given that the attribute
+ # is documented as an array but the input is not
+ if attributes[self.class.attribute_map[key]].is_a?(Array)
+ self.send("#{key}=", attributes[self.class.attribute_map[key]].map { |v| _deserialize($1, v) })
+ end
+ elsif !attributes[self.class.attribute_map[key]].nil?
+ self.send("#{key}=", _deserialize(type, attributes[self.class.attribute_map[key]]))
+ end # or else data not found in attributes(hash), not an issue as the data can be optional
+ end
+
+ self
+ end
+
+ # Deserializes the data based on type
+ # @param string type Data type
+ # @param string value Value to be deserialized
+ # @return [Object] Deserialized data
+ def _deserialize(type, value)
+ case type.to_sym
+ when :DateTime
+ DateTime.parse(value)
+ when :Date
+ Date.parse(value)
+ when :String
+ value.to_s
+ when :Integer
+ value.to_i
+ when :Float
+ value.to_f
+ when :BOOLEAN
+ if value.to_s =~ /\A(true|t|yes|y|1)\z/i
+ true
+ else
+ false
+ end
+ when :Object
+ # generic object (usually a Hash), return directly
+ value
+ when /\AArray<(?<inner_type>.+)>\z/
+ inner_type = Regexp.last_match[:inner_type]
+ value.map { |v| _deserialize(inner_type, v) }
+ when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
+ k_type = Regexp.last_match[:k_type]
+ v_type = Regexp.last_match[:v_type]
+ {}.tap do |hash|
+ value.each do |k, v|
+ hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
+ end
+ end
+ else # model
+ temp_model = CloudsmithApi.const_get(type).new
+ temp_model.build_from_hash(value)
+ end
+ end
+
+ # Returns the string representation of the object
+ # @return [String] String presentation of the object
+ def to_s
+ to_hash.to_s
+ end
+
+ # to_body is an alias to to_hash (backward compatibility)
+ # @return [Hash] Returns the object in the form of hash
+ def to_body
+ to_hash
+ end
+
+ # Returns the object in the form of hash
+ # @return [Hash] Returns the object in the form of hash
+ def to_hash
+ hash = {}
+ self.class.attribute_map.each_pair do |attr, param|
+ value = self.send(attr)
+ next if value.nil?
+ hash[param] = _to_hash(value)
+ end
+ hash
+ end
+
+ # Outputs non-array value in the form of hash
+ # For object, use to_hash. Otherwise, just return the value
+ # @param [Object] value Any valid value
+ # @return [Hash] Returns the value in the form of hash
+ def _to_hash(value)
+ if value.is_a?(Array)
+ value.compact.map { |v| _to_hash(v) }
+ elsif value.is_a?(Hash)
+ {}.tap do |hash|
+ value.each { |k, v| hash[k] = _to_hash(v) }
+ end
+ elsif value.respond_to? :to_hash
+ value.to_hash
+ else
+ value
+ end
+ end
+
+end
+end
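
A hedged sketch of a minimal create payload for a Generic upstream follows. Only `name` and `upstream_url` are required; the mirror name, URL, credentials, and environment variable used here are placeholders, and the resulting object is simply the body you would pass to the repository's Generic upstream create endpoint.

```ruby
request = CloudsmithApi::GenericUpstreamRequest.new(
  name: 'Internal artifact mirror',
  upstream_url: 'https://artifacts.internal.example/',  # required: fully qualified root URL
  upstream_prefix: 'internal',                           # distinguishes this source in the repo
  auth_mode: 'Username and Password',
  auth_username: 'svc-mirror',
  auth_secret: ENV['UPSTREAM_SECRET'],                   # placeholder; keep secrets out of code
  mode: 'Proxy Only',
  verify_ssl: true
)

request.valid?   # => true
request.to_body  # => symbol-keyed hash ready to send as the create request body
```
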
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream_request_patch.rb b/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream_request_patch.rb
new file mode 100644
index 00000000..708711dc
--- /dev/null
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/generic_upstream_request_patch.rb
@@ -0,0 +1,365 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'date'
+
+module CloudsmithApi
+class GenericUpstreamRequestPatch
+ # The authentication mode to use when accessing this upstream.
+ attr_accessor :auth_mode
+
+ # Secret to provide with requests to upstream.
+ attr_accessor :auth_secret
+
+ # Username to provide with requests to upstream.
+ attr_accessor :auth_username
+
+ # The key for extra header #1 to send to upstream.
+ attr_accessor :extra_header_1
+
+ # The key for extra header #2 to send to upstream.
+ attr_accessor :extra_header_2
+
+ # The value for extra header #1 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ attr_accessor :extra_value_1
+
+ # The value for extra header #2 to send to upstream. This is stored as plaintext, and is NOT encrypted.
+ attr_accessor :extra_value_2
+
+ # Whether or not this upstream is active and ready for requests.
+ attr_accessor :is_active
+
+ # The mode that this upstream should operate in. Upstream sources can be used to proxy resolved packages, as well as operate in a proxy/cache or cache only mode.
+ attr_accessor :mode
+
+ # A descriptive name for this upstream source. A shortened version of this name will be used for tagging cached packages retrieved from this upstream.
+ attr_accessor :name
+
+ # Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
+ attr_accessor :priority
+
+ # A unique prefix used to distinguish this upstream source within the repository. Generic upstreams can represent entirely different file servers, and we do not attempt to blend them. The prefix ensures each source remains separate, and requests including this prefix are routed to the correct upstream.
+ attr_accessor :upstream_prefix
+
+ # The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
+ attr_accessor :upstream_url
+
+ # If enabled, SSL certificates are verified when requests are made to this upstream. It's recommended to leave this enabled for all public sources to help mitigate Man-In-The-Middle (MITM) attacks. Please note this only applies to HTTPS upstreams.
+ attr_accessor :verify_ssl
+
+ class EnumAttributeValidator
+ attr_reader :datatype
+ attr_reader :allowable_values
+
+ def initialize(datatype, allowable_values)
+ @allowable_values = allowable_values.map do |value|
+ case datatype.to_s
+ when /Integer/i
+ value.to_i
+ when /Float/i
+ value.to_f
+ else
+ value
+ end
+ end
+ end
+
+ def valid?(value)
+ !value || allowable_values.include?(value)
+ end
+ end
+
+ # Attribute mapping from ruby-style variable name to JSON key.
+ def self.attribute_map
+ {
+ :'auth_mode' => :'auth_mode',
+ :'auth_secret' => :'auth_secret',
+ :'auth_username' => :'auth_username',
+ :'extra_header_1' => :'extra_header_1',
+ :'extra_header_2' => :'extra_header_2',
+ :'extra_value_1' => :'extra_value_1',
+ :'extra_value_2' => :'extra_value_2',
+ :'is_active' => :'is_active',
+ :'mode' => :'mode',
+ :'name' => :'name',
+ :'priority' => :'priority',
+ :'upstream_prefix' => :'upstream_prefix',
+ :'upstream_url' => :'upstream_url',
+ :'verify_ssl' => :'verify_ssl'
+ }
+ end
+
+ # Attribute type mapping.
+ def self.swagger_types
+ {
+ :'auth_mode' => :'String',
+ :'auth_secret' => :'String',
+ :'auth_username' => :'String',
+ :'extra_header_1' => :'String',
+ :'extra_header_2' => :'String',
+ :'extra_value_1' => :'String',
+ :'extra_value_2' => :'String',
+ :'is_active' => :'BOOLEAN',
+ :'mode' => :'String',
+ :'name' => :'String',
+ :'priority' => :'Integer',
+ :'upstream_prefix' => :'String',
+ :'upstream_url' => :'String',
+ :'verify_ssl' => :'BOOLEAN'
+ }
+ end
+
+ # Initializes the object
+ # @param [Hash] attributes Model attributes in the form of hash
+ def initialize(attributes = {})
+ return unless attributes.is_a?(Hash)
+
+ # convert string to symbol for hash key
+ attributes = attributes.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
+
+ if attributes.has_key?(:'auth_mode')
+ self.auth_mode = attributes[:'auth_mode']
+ else
+ self.auth_mode = 'None'
+ end
+
+ if attributes.has_key?(:'auth_secret')
+ self.auth_secret = attributes[:'auth_secret']
+ end
+
+ if attributes.has_key?(:'auth_username')
+ self.auth_username = attributes[:'auth_username']
+ end
+
+ if attributes.has_key?(:'extra_header_1')
+ self.extra_header_1 = attributes[:'extra_header_1']
+ end
+
+ if attributes.has_key?(:'extra_header_2')
+ self.extra_header_2 = attributes[:'extra_header_2']
+ end
+
+ if attributes.has_key?(:'extra_value_1')
+ self.extra_value_1 = attributes[:'extra_value_1']
+ end
+
+ if attributes.has_key?(:'extra_value_2')
+ self.extra_value_2 = attributes[:'extra_value_2']
+ end
+
+ if attributes.has_key?(:'is_active')
+ self.is_active = attributes[:'is_active']
+ end
+
+ if attributes.has_key?(:'mode')
+ self.mode = attributes[:'mode']
+ else
+ self.mode = 'Proxy Only'
+ end
+
+ if attributes.has_key?(:'name')
+ self.name = attributes[:'name']
+ end
+
+ if attributes.has_key?(:'priority')
+ self.priority = attributes[:'priority']
+ end
+
+ if attributes.has_key?(:'upstream_prefix')
+ self.upstream_prefix = attributes[:'upstream_prefix']
+ end
+
+ if attributes.has_key?(:'upstream_url')
+ self.upstream_url = attributes[:'upstream_url']
+ end
+
+ if attributes.has_key?(:'verify_ssl')
+ self.verify_ssl = attributes[:'verify_ssl']
+ end
+ end
+
+ # Show invalid properties with the reasons. Usually used together with valid?
+ # @return Array for valid properties with the reasons
+ def list_invalid_properties
+ invalid_properties = Array.new
+ invalid_properties
+ end
+
+ # Check to see if all the properties in the model are valid
+ # @return true if the model is valid
+ def valid?
+ auth_mode_validator = EnumAttributeValidator.new('String', ['None', 'Username and Password', 'Token'])
+ return false unless auth_mode_validator.valid?(@auth_mode)
+ mode_validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy'])
+ return false unless mode_validator.valid?(@mode)
+ true
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] auth_mode Object to be assigned
+ def auth_mode=(auth_mode)
+ validator = EnumAttributeValidator.new('String', ['None', 'Username and Password', 'Token'])
+ unless validator.valid?(auth_mode)
+ fail ArgumentError, 'invalid value for "auth_mode", must be one of #{validator.allowable_values}.'
+ end
+ @auth_mode = auth_mode
+ end
+
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] mode Object to be assigned
+ def mode=(mode)
+ validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy'])
+ unless validator.valid?(mode)
+ fail ArgumentError, 'invalid value for "mode", must be one of #{validator.allowable_values}.'
+ end
+ @mode = mode
+ end
+
+ # Checks equality by comparing each attribute.
+ # @param [Object] Object to be compared
+ def ==(o)
+ return true if self.equal?(o)
+ self.class == o.class &&
+ auth_mode == o.auth_mode &&
+ auth_secret == o.auth_secret &&
+ auth_username == o.auth_username &&
+ extra_header_1 == o.extra_header_1 &&
+ extra_header_2 == o.extra_header_2 &&
+ extra_value_1 == o.extra_value_1 &&
+ extra_value_2 == o.extra_value_2 &&
+ is_active == o.is_active &&
+ mode == o.mode &&
+ name == o.name &&
+ priority == o.priority &&
+ upstream_prefix == o.upstream_prefix &&
+ upstream_url == o.upstream_url &&
+ verify_ssl == o.verify_ssl
+ end
+
+ # @see the `==` method
+ # @param [Object] Object to be compared
+ def eql?(o)
+ self == o
+ end
+
+ # Calculates hash code according to all attributes.
+ # @return [Fixnum] Hash code
+ def hash
+ [auth_mode, auth_secret, auth_username, extra_header_1, extra_header_2, extra_value_1, extra_value_2, is_active, mode, name, priority, upstream_prefix, upstream_url, verify_ssl].hash
+ end
+
+ # Builds the object from hash
+ # @param [Hash] attributes Model attributes in the form of hash
+ # @return [Object] Returns the model itself
+ def build_from_hash(attributes)
+ return nil unless attributes.is_a?(Hash)
+ self.class.swagger_types.each_pair do |key, type|
+ if type =~ /\AArray<(.*)>/i
+ # check to ensure the input is an array given that the attribute
+ # is documented as an array but the input is not
+ if attributes[self.class.attribute_map[key]].is_a?(Array)
+ self.send("#{key}=", attributes[self.class.attribute_map[key]].map { |v| _deserialize($1, v) })
+ end
+ elsif !attributes[self.class.attribute_map[key]].nil?
+ self.send("#{key}=", _deserialize(type, attributes[self.class.attribute_map[key]]))
+ end # or else data not found in attributes(hash), not an issue as the data can be optional
+ end
+
+ self
+ end
+
+ # Deserializes the data based on type
+ # @param string type Data type
+ # @param string value Value to be deserialized
+ # @return [Object] Deserialized data
+ def _deserialize(type, value)
+ case type.to_sym
+ when :DateTime
+ DateTime.parse(value)
+ when :Date
+ Date.parse(value)
+ when :String
+ value.to_s
+ when :Integer
+ value.to_i
+ when :Float
+ value.to_f
+ when :BOOLEAN
+ if value.to_s =~ /\A(true|t|yes|y|1)\z/i
+ true
+ else
+ false
+ end
+ when :Object
+ # generic object (usually a Hash), return directly
+ value
+ when /\AArray<(?<inner_type>.+)>\z/
+ inner_type = Regexp.last_match[:inner_type]
+ value.map { |v| _deserialize(inner_type, v) }
+ when /\AHash<(?<k_type>.+?), (?<v_type>.+)>\z/
+ k_type = Regexp.last_match[:k_type]
+ v_type = Regexp.last_match[:v_type]
+ {}.tap do |hash|
+ value.each do |k, v|
+ hash[_deserialize(k_type, k)] = _deserialize(v_type, v)
+ end
+ end
+ else # model
+ temp_model = CloudsmithApi.const_get(type).new
+ temp_model.build_from_hash(value)
+ end
+ end
+
+ # Returns the string representation of the object
+ # @return [String] String presentation of the object
+ def to_s
+ to_hash.to_s
+ end
+
+ # to_body is an alias to to_hash (backward compatibility)
+ # @return [Hash] Returns the object in the form of hash
+ def to_body
+ to_hash
+ end
+
+ # Returns the object in the form of hash
+ # @return [Hash] Returns the object in the form of hash
+ def to_hash
+ hash = {}
+ self.class.attribute_map.each_pair do |attr, param|
+ value = self.send(attr)
+ next if value.nil?
+ hash[param] = _to_hash(value)
+ end
+ hash
+ end
+
+ # Outputs non-array value in the form of hash
+ # For object, use to_hash. Otherwise, just return the value
+ # @param [Object] value Any valid value
+ # @return [Hash] Returns the value in the form of hash
+ def _to_hash(value)
+ if value.is_a?(Array)
+ value.compact.map { |v| _to_hash(v) }
+ elsif value.is_a?(Hash)
+ {}.tap do |hash|
+ value.each { |k, v| hash[k] = _to_hash(v) }
+ end
+ elsif value.respond_to? :to_hash
+ value.to_hash
+ else
+ value
+ end
+ end
+
+end
+end
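
One behavioural note on the patch variant, shown with a hedged sketch: it has no required attributes, so a payload containing only the fields being changed is valid, but the generated initializer still applies the `auth_mode` and `mode` defaults, so those keys appear in the serialized hash as well. The patched values below are placeholders.

```ruby
patch = CloudsmithApi::GenericUpstreamRequestPatch.new(is_active: false, priority: 5)

patch.valid?                   # => true even though name and upstream_url are omitted
patch.list_invalid_properties  # => []
patch.to_hash
# => { :auth_mode => 'None', :is_active => false, :mode => 'Proxy Only', :priority => 5 }
```
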
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream.rb b/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream.rb
index d55ee6f4..3114bf60 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream.rb
@@ -86,6 +86,9 @@ class MavenUpstream
attr_accessor :slug_perm
+ # Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ attr_accessor :trust_level
+
attr_accessor :updated_at
# The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
@@ -148,6 +151,7 @@ def self.attribute_map
:'pending_validation' => :'pending_validation',
:'priority' => :'priority',
:'slug_perm' => :'slug_perm',
+ :'trust_level' => :'trust_level',
:'updated_at' => :'updated_at',
:'upstream_url' => :'upstream_url',
:'verification_status' => :'verification_status',
@@ -184,6 +188,7 @@ def self.swagger_types
:'pending_validation' => :'BOOLEAN',
:'priority' => :'Integer',
:'slug_perm' => :'String',
+ :'trust_level' => :'String',
:'updated_at' => :'DateTime',
:'upstream_url' => :'String',
:'verification_status' => :'String',
@@ -311,6 +316,12 @@ def initialize(attributes = {})
self.slug_perm = attributes[:'slug_perm']
end
+ if attributes.has_key?(:'trust_level')
+ self.trust_level = attributes[:'trust_level']
+ else
+ self.trust_level = 'Trusted'
+ end
+
if attributes.has_key?(:'updated_at')
self.updated_at = attributes[:'updated_at']
end
@@ -357,6 +368,8 @@ def valid?
mode_validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy', 'Cache Only'])
return false unless mode_validator.valid?(@mode)
return false if @name.nil?
+ trust_level_validator = EnumAttributeValidator.new('String', ['Trusted', 'Untrusted'])
+ return false unless trust_level_validator.valid?(@trust_level)
return false if @upstream_url.nil?
verification_status_validator = EnumAttributeValidator.new('String', ['Unknown', 'Invalid', 'Valid', 'Invalid (No Key)'])
return false unless verification_status_validator.valid?(@verification_status)
@@ -403,6 +416,16 @@ def mode=(mode)
@mode = mode
end
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] trust_level Object to be assigned
+ def trust_level=(trust_level)
+ validator = EnumAttributeValidator.new('String', ['Trusted', 'Untrusted'])
+ unless validator.valid?(trust_level)
+ fail ArgumentError, 'invalid value for "trust_level", must be one of #{validator.allowable_values}.'
+ end
+ @trust_level = trust_level
+ end
+
# Custom attribute writer method checking allowed values (enum).
# @param [Object] verification_status Object to be assigned
def verification_status=(verification_status)
@@ -444,6 +467,7 @@ def ==(o)
pending_validation == o.pending_validation &&
priority == o.priority &&
slug_perm == o.slug_perm &&
+ trust_level == o.trust_level &&
updated_at == o.updated_at &&
upstream_url == o.upstream_url &&
verification_status == o.verification_status &&
@@ -459,7 +483,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [auth_mode, auth_secret, auth_username, available, can_reindex, created_at, disable_reason, disable_reason_text, extra_header_1, extra_header_2, extra_value_1, extra_value_2, gpg_key_fingerprint_short, gpg_key_inline, gpg_key_url, gpg_verification, has_failed_signature_verification, index_package_count, index_status, is_active, last_indexed, mode, name, pending_validation, priority, slug_perm, updated_at, upstream_url, verification_status, verify_ssl].hash
+ [auth_mode, auth_secret, auth_username, available, can_reindex, created_at, disable_reason, disable_reason_text, extra_header_1, extra_header_2, extra_value_1, extra_value_2, gpg_key_fingerprint_short, gpg_key_inline, gpg_key_url, gpg_verification, has_failed_signature_verification, index_package_count, index_status, is_active, last_indexed, mode, name, pending_validation, priority, slug_perm, trust_level, updated_at, upstream_url, verification_status, verify_ssl].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request.rb b/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request.rb
index 22d6589d..e274ade1 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request.rb
@@ -56,6 +56,9 @@ class MavenUpstreamRequest
# Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
attr_accessor :priority
+ # Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ attr_accessor :trust_level
+
# The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
attr_accessor :upstream_url
@@ -101,6 +104,7 @@ def self.attribute_map
:'mode' => :'mode',
:'name' => :'name',
:'priority' => :'priority',
+ :'trust_level' => :'trust_level',
:'upstream_url' => :'upstream_url',
:'verify_ssl' => :'verify_ssl'
}
@@ -123,6 +127,7 @@ def self.swagger_types
:'mode' => :'String',
:'name' => :'String',
:'priority' => :'Integer',
+ :'trust_level' => :'String',
:'upstream_url' => :'String',
:'verify_ssl' => :'BOOLEAN'
}
@@ -198,6 +203,12 @@ def initialize(attributes = {})
self.priority = attributes[:'priority']
end
+ if attributes.has_key?(:'trust_level')
+ self.trust_level = attributes[:'trust_level']
+ else
+ self.trust_level = 'Trusted'
+ end
+
if attributes.has_key?(:'upstream_url')
self.upstream_url = attributes[:'upstream_url']
end
@@ -232,6 +243,8 @@ def valid?
mode_validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy', 'Cache Only'])
return false unless mode_validator.valid?(@mode)
return false if @name.nil?
+ trust_level_validator = EnumAttributeValidator.new('String', ['Trusted', 'Untrusted'])
+ return false unless trust_level_validator.valid?(@trust_level)
return false if @upstream_url.nil?
true
end
@@ -266,6 +279,16 @@ def mode=(mode)
@mode = mode
end
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] trust_level Object to be assigned
+ def trust_level=(trust_level)
+ validator = EnumAttributeValidator.new('String', ['Trusted', 'Untrusted'])
+ unless validator.valid?(trust_level)
+ fail ArgumentError, 'invalid value for "trust_level", must be one of #{validator.allowable_values}.'
+ end
+ @trust_level = trust_level
+ end
+
# Checks equality by comparing each attribute.
# @param [Object] Object to be compared
def ==(o)
@@ -285,6 +308,7 @@ def ==(o)
mode == o.mode &&
name == o.name &&
priority == o.priority &&
+ trust_level == o.trust_level &&
upstream_url == o.upstream_url &&
verify_ssl == o.verify_ssl
end
@@ -298,7 +322,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [auth_mode, auth_secret, auth_username, extra_header_1, extra_header_2, extra_value_1, extra_value_2, gpg_key_inline, gpg_key_url, gpg_verification, is_active, mode, name, priority, upstream_url, verify_ssl].hash
+ [auth_mode, auth_secret, auth_username, extra_header_1, extra_header_2, extra_value_1, extra_value_2, gpg_key_inline, gpg_key_url, gpg_verification, is_active, mode, name, priority, trust_level, upstream_url, verify_ssl].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request_patch.rb b/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request_patch.rb
index 7f7514e7..ec682b56 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request_patch.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/maven_upstream_request_patch.rb
@@ -56,6 +56,9 @@ class MavenUpstreamRequestPatch
# Upstream sources are selected for resolving requests by sequential order (1..n), followed by creation date.
attr_accessor :priority
+ # Trust level allows for control of the visibility of upstream artifacts to native package managers. Where supported by formats, the default level (untrusted) is recommended for all upstreams, and helps to safeguard against common dependency confusion attack vectors.
+ attr_accessor :trust_level
+
# The URL for this upstream source. This must be a fully qualified URL including any path elements required to reach the root of the repository.
attr_accessor :upstream_url
@@ -101,6 +104,7 @@ def self.attribute_map
:'mode' => :'mode',
:'name' => :'name',
:'priority' => :'priority',
+ :'trust_level' => :'trust_level',
:'upstream_url' => :'upstream_url',
:'verify_ssl' => :'verify_ssl'
}
@@ -123,6 +127,7 @@ def self.swagger_types
:'mode' => :'String',
:'name' => :'String',
:'priority' => :'Integer',
+ :'trust_level' => :'String',
:'upstream_url' => :'String',
:'verify_ssl' => :'BOOLEAN'
}
@@ -198,6 +203,12 @@ def initialize(attributes = {})
self.priority = attributes[:'priority']
end
+ if attributes.has_key?(:'trust_level')
+ self.trust_level = attributes[:'trust_level']
+ else
+ self.trust_level = 'Trusted'
+ end
+
if attributes.has_key?(:'upstream_url')
self.upstream_url = attributes[:'upstream_url']
end
@@ -223,6 +234,8 @@ def valid?
return false unless gpg_verification_validator.valid?(@gpg_verification)
mode_validator = EnumAttributeValidator.new('String', ['Proxy Only', 'Cache and Proxy', 'Cache Only'])
return false unless mode_validator.valid?(@mode)
+ trust_level_validator = EnumAttributeValidator.new('String', ['Trusted', 'Untrusted'])
+ return false unless trust_level_validator.valid?(@trust_level)
true
end
@@ -256,6 +269,16 @@ def mode=(mode)
@mode = mode
end
+ # Custom attribute writer method checking allowed values (enum).
+ # @param [Object] trust_level Object to be assigned
+ def trust_level=(trust_level)
+ validator = EnumAttributeValidator.new('String', ['Trusted', 'Untrusted'])
+ unless validator.valid?(trust_level)
+ fail ArgumentError, "invalid value for \"trust_level\", must be one of #{validator.allowable_values}."
+ end
+ @trust_level = trust_level
+ end
+
# Checks equality by comparing each attribute.
# @param [Object] Object to be compared
def ==(o)
@@ -275,6 +298,7 @@ def ==(o)
mode == o.mode &&
name == o.name &&
priority == o.priority &&
+ trust_level == o.trust_level &&
upstream_url == o.upstream_url &&
verify_ssl == o.verify_ssl
end
@@ -288,7 +312,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [auth_mode, auth_secret, auth_username, extra_header_1, extra_header_2, extra_value_1, extra_value_2, gpg_key_inline, gpg_key_url, gpg_verification, is_active, mode, name, priority, upstream_url, verify_ssl].hash
+ [auth_mode, auth_secret, auth_username, extra_header_1, extra_header_2, extra_value_1, extra_value_2, gpg_key_inline, gpg_key_url, gpg_verification, is_active, mode, name, priority, trust_level, upstream_url, verify_ssl].hash
end
# Builds the object from hash
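To change the trust level of an existing Maven upstream, this patch model is sent through the corresponding ReposApi call. A hedged sketch only: the repos_upstream_maven_partial_update method and the X-Api-Key configuration follow the usual swagger-codegen conventions but are not shown in this excerpt, and the owner, repository and slug_perm values are placeholders.

```ruby
require 'cloudsmith-api'

CloudsmithApi.configure do |config|
  config.api_key['X-Api-Key'] = ENV['CLOUDSMITH_API_KEY']  # assumed auth header
end

repos = CloudsmithApi::ReposApi.new
patch = CloudsmithApi::MavenUpstreamRequestPatch.new(trust_level: 'Untrusted')

# PATCH /repos/{owner}/{identifier}/upstream/maven/{slug_perm}/
upstream = repos.repos_upstream_maven_partial_update('acme', 'my-repo', 'AbCdEfGh', data: patch)
puts upstream.trust_level
```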
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/organization_team.rb b/bindings/ruby/src/lib/cloudsmith-api/models/organization_team.rb
index d539e706..e1742aa8 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/organization_team.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/organization_team.rb
@@ -14,6 +14,7 @@
module CloudsmithApi
class OrganizationTeam
+ # A detailed description of the team.
attr_accessor :description
# A descriptive name for the team.
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request.rb b/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request.rb
index 0b6c6079..1f62b848 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request.rb
@@ -14,6 +14,7 @@
module CloudsmithApi
class OrganizationTeamRequest
+ # A detailed description of the team.
attr_accessor :description
# A descriptive name for the team.
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request_patch.rb b/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request_patch.rb
index ff77e78e..769c58ac 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request_patch.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/organization_team_request_patch.rb
@@ -14,6 +14,7 @@
module CloudsmithApi
class OrganizationTeamRequestPatch
+ # A detailed description of the team.
attr_accessor :description
# A descriptive name for the team.
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package.rb
index 601bd155..ae71f878 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package.rb
@@ -49,6 +49,9 @@ class Package
attr_accessor :filename
+ # Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ attr_accessor :filepath
+
attr_accessor :files
attr_accessor :format
@@ -251,6 +254,7 @@ def self.attribute_map
:'epoch' => :'epoch',
:'extension' => :'extension',
:'filename' => :'filename',
+ :'filepath' => :'filepath',
:'files' => :'files',
:'format' => :'format',
:'format_url' => :'format_url',
@@ -341,6 +345,7 @@ def self.swagger_types
:'epoch' => :'Integer',
:'extension' => :'String',
:'filename' => :'String',
+ :'filepath' => :'String',
:'files' => :'Array',
:'format' => :'String',
:'format_url' => :'String',
@@ -486,6 +491,10 @@ def initialize(attributes = {})
self.filename = attributes[:'filename']
end
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
if attributes.has_key?(:'files')
if (value = attributes[:'files']).is_a?(Array)
self.files = value
@@ -811,6 +820,7 @@ def ==(o)
epoch == o.epoch &&
extension == o.extension &&
filename == o.filename &&
+ filepath == o.filepath &&
files == o.files &&
format == o.format &&
format_url == o.format_url &&
@@ -890,7 +900,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, filepath, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_copy.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_copy.rb
index 2267b792..975cbec0 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_copy.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_copy.rb
@@ -49,6 +49,9 @@ class PackageCopy
attr_accessor :filename
+ # Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ attr_accessor :filepath
+
attr_accessor :files
attr_accessor :format
@@ -252,6 +255,7 @@ def self.attribute_map
:'epoch' => :'epoch',
:'extension' => :'extension',
:'filename' => :'filename',
+ :'filepath' => :'filepath',
:'files' => :'files',
:'format' => :'format',
:'format_url' => :'format_url',
@@ -342,6 +346,7 @@ def self.swagger_types
:'epoch' => :'Integer',
:'extension' => :'String',
:'filename' => :'String',
+ :'filepath' => :'String',
:'files' => :'Array',
:'format' => :'String',
:'format_url' => :'String',
@@ -487,6 +492,10 @@ def initialize(attributes = {})
self.filename = attributes[:'filename']
end
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
if attributes.has_key?(:'files')
if (value = attributes[:'files']).is_a?(Array)
self.files = value
@@ -812,6 +821,7 @@ def ==(o)
epoch == o.epoch &&
extension == o.extension &&
filename == o.filename &&
+ filepath == o.filepath &&
files == o.files &&
format == o.format &&
format_url == o.format_url &&
@@ -891,7 +901,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, filepath, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_copy_request.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_copy_request.rb
index 71b7eb07..bfbd9757 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_copy_request.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_copy_request.rb
@@ -14,6 +14,7 @@
module CloudsmithApi
class PackageCopyRequest
+ # The name of the destination repository without the namespace.
attr_accessor :destination
# If true, the package will overwrite any others with the same attributes (e.g. same version); otherwise, it will be flagged as a duplicate.
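The destination documented above is just the repository slug, without the owner prefix. A rough usage sketch of the copy endpoint with placeholder identifiers; the packages_copy method is not part of this diff, and the replace attribute name is assumed from the overwrite flag documented in the hunk above.

```ruby
require 'cloudsmith-api'

packages = CloudsmithApi::PackagesApi.new  # assumes CloudsmithApi.configure has set the API key

copy_req = CloudsmithApi::PackageCopyRequest.new(
  destination: 'staging-repo',  # repository name only, no "owner/" prefix
  replace: true                 # overwrite a package with the same attributes instead of duplicating it
)

# 'AbCdEfGh' stands in for the source package's identifier (slug_perm)
copied = packages.packages_copy('acme', 'dev-repo', 'AbCdEfGh', data: copy_req)
puts copied.self_html_url
```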
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_move.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_move.rb
index 2eaecd79..6c17bd9e 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_move.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_move.rb
@@ -49,6 +49,9 @@ class PackageMove
attr_accessor :filename
+ # Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ attr_accessor :filepath
+
attr_accessor :files
attr_accessor :format
@@ -252,6 +255,7 @@ def self.attribute_map
:'epoch' => :'epoch',
:'extension' => :'extension',
:'filename' => :'filename',
+ :'filepath' => :'filepath',
:'files' => :'files',
:'format' => :'format',
:'format_url' => :'format_url',
@@ -342,6 +346,7 @@ def self.swagger_types
:'epoch' => :'Integer',
:'extension' => :'String',
:'filename' => :'String',
+ :'filepath' => :'String',
:'files' => :'Array',
:'format' => :'String',
:'format_url' => :'String',
@@ -487,6 +492,10 @@ def initialize(attributes = {})
self.filename = attributes[:'filename']
end
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
if attributes.has_key?(:'files')
if (value = attributes[:'files']).is_a?(Array)
self.files = value
@@ -812,6 +821,7 @@ def ==(o)
epoch == o.epoch &&
extension == o.extension &&
filename == o.filename &&
+ filepath == o.filepath &&
files == o.files &&
format == o.format &&
format_url == o.format_url &&
@@ -891,7 +901,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, filepath, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_move_request.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_move_request.rb
index a16ac07e..25c41edc 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_move_request.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_move_request.rb
@@ -14,6 +14,7 @@
module CloudsmithApi
class PackageMoveRequest
+ # The name of the destination repository without the namespace.
attr_accessor :destination
# Attribute mapping from ruby-style variable name to JSON key.
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_quarantine.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_quarantine.rb
index 86f839b5..732e0abb 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_quarantine.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_quarantine.rb
@@ -49,6 +49,9 @@ class PackageQuarantine
attr_accessor :filename
+ # Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ attr_accessor :filepath
+
attr_accessor :files
attr_accessor :format
@@ -248,6 +251,7 @@ def self.attribute_map
:'epoch' => :'epoch',
:'extension' => :'extension',
:'filename' => :'filename',
+ :'filepath' => :'filepath',
:'files' => :'files',
:'format' => :'format',
:'format_url' => :'format_url',
@@ -337,6 +341,7 @@ def self.swagger_types
:'epoch' => :'Integer',
:'extension' => :'String',
:'filename' => :'String',
+ :'filepath' => :'String',
:'files' => :'Array',
:'format' => :'String',
:'format_url' => :'String',
@@ -481,6 +486,10 @@ def initialize(attributes = {})
self.filename = attributes[:'filename']
end
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
if attributes.has_key?(:'files')
if (value = attributes[:'files']).is_a?(Array)
self.files = value
@@ -802,6 +811,7 @@ def ==(o)
epoch == o.epoch &&
extension == o.extension &&
filename == o.filename &&
+ filepath == o.filepath &&
files == o.files &&
format == o.format &&
format_url == o.format_url &&
@@ -880,7 +890,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, filepath, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_resync.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_resync.rb
index 262607c7..9ada252c 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_resync.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_resync.rb
@@ -49,6 +49,9 @@ class PackageResync
attr_accessor :filename
+ # Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ attr_accessor :filepath
+
attr_accessor :files
attr_accessor :format
@@ -251,6 +254,7 @@ def self.attribute_map
:'epoch' => :'epoch',
:'extension' => :'extension',
:'filename' => :'filename',
+ :'filepath' => :'filepath',
:'files' => :'files',
:'format' => :'format',
:'format_url' => :'format_url',
@@ -341,6 +345,7 @@ def self.swagger_types
:'epoch' => :'Integer',
:'extension' => :'String',
:'filename' => :'String',
+ :'filepath' => :'String',
:'files' => :'Array',
:'format' => :'String',
:'format_url' => :'String',
@@ -486,6 +491,10 @@ def initialize(attributes = {})
self.filename = attributes[:'filename']
end
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
if attributes.has_key?(:'files')
if (value = attributes[:'files']).is_a?(Array)
self.files = value
@@ -811,6 +820,7 @@ def ==(o)
epoch == o.epoch &&
extension == o.extension &&
filename == o.filename &&
+ filepath == o.filepath &&
files == o.files &&
format == o.format &&
format_url == o.format_url &&
@@ -890,7 +900,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, filepath, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
end
# Builds the object from hash
diff --git a/bindings/ruby/src/lib/cloudsmith-api/models/package_tag.rb b/bindings/ruby/src/lib/cloudsmith-api/models/package_tag.rb
index b8edf234..84ec4a48 100644
--- a/bindings/ruby/src/lib/cloudsmith-api/models/package_tag.rb
+++ b/bindings/ruby/src/lib/cloudsmith-api/models/package_tag.rb
@@ -49,6 +49,9 @@ class PackageTag
attr_accessor :filename
+ # Full path to the file, including filename e.g. bin/utils/tool.tar.gz
+ attr_accessor :filepath
+
attr_accessor :files
attr_accessor :format
@@ -252,6 +255,7 @@ def self.attribute_map
:'epoch' => :'epoch',
:'extension' => :'extension',
:'filename' => :'filename',
+ :'filepath' => :'filepath',
:'files' => :'files',
:'format' => :'format',
:'format_url' => :'format_url',
@@ -342,6 +346,7 @@ def self.swagger_types
:'epoch' => :'Integer',
:'extension' => :'String',
:'filename' => :'String',
+ :'filepath' => :'String',
:'files' => :'Array',
:'format' => :'String',
:'format_url' => :'String',
@@ -487,6 +492,10 @@ def initialize(attributes = {})
self.filename = attributes[:'filename']
end
+ if attributes.has_key?(:'filepath')
+ self.filepath = attributes[:'filepath']
+ end
+
if attributes.has_key?(:'files')
if (value = attributes[:'files']).is_a?(Array)
self.files = value
@@ -814,6 +823,7 @@ def ==(o)
epoch == o.epoch &&
extension == o.extension &&
filename == o.filename &&
+ filepath == o.filepath &&
files == o.files &&
format == o.format &&
format_url == o.format_url &&
@@ -893,7 +903,7 @@ def eql?(o)
# Calculates hash code according to all attributes.
# @return [Fixnum] Hash code
def hash
- [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_immutable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
+ [architectures, cdn_url, checksum_md5, checksum_sha1, checksum_sha256, checksum_sha512, dependencies_checksum_md5, dependencies_url, description, display_name, distro, distro_version, downloads, epoch, extension, filename, filepath, files, format, format_url, freeable_storage, fully_qualified_name, identifier_perm, identifiers, indexed, is_cancellable, is_copyable, is_deleteable, is_downloadable, is_immutable, is_moveable, is_quarantinable, is_quarantined, is_resyncable, is_security_scannable, is_sync_awaiting, is_sync_completed, is_sync_failed, is_sync_in_flight, is_sync_in_progress, license, name, namespace, namespace_url, num_files, origin_repository, origin_repository_url, package_type, policy_violated, raw_license, release, repository, repository_url, security_scan_completed_at, security_scan_started_at, security_scan_status, security_scan_status_updated_at, self_html_url, self_url, signature_url, size, slug, slug_perm, spdx_license, stage, stage_str, stage_updated_at, status, status_reason, status_str, status_updated_at, status_url, subtype, summary, sync_finished_at, sync_progress, tags_automatic, tags_immutable, type_display, uploaded_at, uploader, uploader_url, version, version_orig, vulnerability_scan_results_url].hash
end
# Builds the object from hash
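The new read-only filepath field is mirrored across all of the Package* models above. A minimal sketch of listing packages and printing it, assuming the standard generated packages_list call and placeholder owner/repo values:

```ruby
require 'cloudsmith-api'

packages = CloudsmithApi::PackagesApi.new

packages.packages_list('acme', 'my-repo', page: 1, page_size: 50).each do |pkg|
  # filepath carries the full path (e.g. "bin/utils/tool.tar.gz") where the format
  # supports it; fall back to the plain filename otherwise.
  puts "#{pkg.name} #{pkg.version} -> #{pkg.filepath || pkg.filename}"
end
```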
diff --git a/bindings/ruby/src/spec/api/packages_api_spec.rb b/bindings/ruby/src/spec/api/packages_api_spec.rb
index 596f196e..6c4c0f4c 100644
--- a/bindings/ruby/src/spec/api/packages_api_spec.rb
+++ b/bindings/ruby/src/spec/api/packages_api_spec.rb
@@ -367,6 +367,20 @@
end
end
+ # unit tests for packages_upload_generic
+ # Create a new Generic package
+ # Create a new Generic package
+ # @param owner
+ # @param repo
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericPackageUploadRequest] :data
+ # @return [GenericPackageUpload]
+ describe 'packages_upload_generic test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
# unit tests for packages_upload_go
# Create a new Go package
# Create a new Go package
@@ -731,6 +745,20 @@
end
end
+ # unit tests for packages_validate_upload_generic
+ # Validate parameters for create Generic package
+ # Validate parameters for create Generic package
+ # @param owner
+ # @param repo
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericPackageUploadRequest] :data
+ # @return [nil]
+ describe 'packages_validate_upload_generic test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
# unit tests for packages_validate_upload_go
# Validate parameters for create Go package
# Validate parameters for create Go package
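The spec stubs above track the new packages_upload_generic and packages_validate_upload_generic methods. A hedged end-to-end sketch follows; the package_file value is assumed to be the identifier of a file already uploaded through the separate files workflow, which this excerpt does not cover, and the owner/repo values are placeholders.

```ruby
require 'cloudsmith-api'

packages = CloudsmithApi::PackagesApi.new

upload_req = CloudsmithApi::GenericPackageUploadRequest.new(
  package_file: 'previously-uploaded-file-id',  # placeholder file identifier
  name: 'my-tool',
  version: '1.0.0',
  filepath: 'bin/utils/tool.tar.gz',            # lines up with the new Package#filepath field
  tags: 'cli, utils'
)

# Dry-run the parameters first; the validate call returns nothing on success.
packages.packages_validate_upload_generic('acme', 'my-repo', data: upload_req)

pkg = packages.packages_upload_generic('acme', 'my-repo', data: upload_req)
puts "#{pkg.slug_perm}: #{pkg.status_str}"
```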
diff --git a/bindings/ruby/src/spec/api/repos_api_spec.rb b/bindings/ruby/src/spec/api/repos_api_spec.rb
index 9fae7aac..39fd75d9 100644
--- a/bindings/ruby/src/spec/api/repos_api_spec.rb
+++ b/bindings/ruby/src/spec/api/repos_api_spec.rb
@@ -1008,6 +1008,93 @@
end
end
+ # unit tests for repos_upstream_generic_create
+ # Create a Generic upstream config for this repository.
+ # Create a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequest] :data
+ # @return [GenericUpstream]
+ describe 'repos_upstream_generic_create test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ # unit tests for repos_upstream_generic_delete
+ # Delete a Generic upstream config for this repository.
+ # Delete a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @return [nil]
+ describe 'repos_upstream_generic_delete test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ # unit tests for repos_upstream_generic_list
+ # List Generic upstream configs for this repository.
+ # List Generic upstream configs for this repository.
+ # @param owner
+ # @param identifier
+ # @param [Hash] opts the optional parameters
+ # @option opts [Integer] :page A page number within the paginated result set.
+ # @option opts [Integer] :page_size Number of results to return per page.
+ # @return [Array]
+ describe 'repos_upstream_generic_list test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ # unit tests for repos_upstream_generic_partial_update
+ # Partially update a Generic upstream config for this repository.
+ # Partially update a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequestPatch] :data
+ # @return [GenericUpstream]
+ describe 'repos_upstream_generic_partial_update test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ # unit tests for repos_upstream_generic_read
+ # Retrieve a Generic upstream config for this repository.
+ # Retrieve a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @return [GenericUpstream]
+ describe 'repos_upstream_generic_read test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ # unit tests for repos_upstream_generic_update
+ # Update a Generic upstream config for this repository.
+ # Update a Generic upstream config for this repository.
+ # @param owner
+ # @param identifier
+ # @param slug_perm
+ # @param [Hash] opts the optional parameters
+ # @option opts [GenericUpstreamRequest] :data
+ # @return [GenericUpstream]
+ describe 'repos_upstream_generic_update test' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
# unit tests for repos_upstream_go_create
# Create a Go upstream config for this repository.
# Create a Go upstream config for this repository.
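Likewise, the repos_upstream_generic_* stubs correspond to the new Generic upstream endpoints. A rough sketch of creating and then listing Generic upstream configs, with placeholder owner/repo/URL values:

```ruby
require 'cloudsmith-api'

repos = CloudsmithApi::ReposApi.new

upstream_req = CloudsmithApi::GenericUpstreamRequest.new(
  name: 'example-mirror',
  upstream_url: 'https://downloads.example.com/artifacts',
  mode: 'Cache and Proxy',
  verify_ssl: true
)

created = repos.repos_upstream_generic_create('acme', 'my-repo', data: upstream_req)
puts created.slug_perm

repos.repos_upstream_generic_list('acme', 'my-repo', page: 1, page_size: 25).each do |up|
  puts "#{up.name} -> #{up.upstream_url}"
end
```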
diff --git a/bindings/ruby/src/spec/models/format_support_spec.rb b/bindings/ruby/src/spec/models/format_support_spec.rb
index 30cd2302..e1bd1eaa 100644
--- a/bindings/ruby/src/spec/models/format_support_spec.rb
+++ b/bindings/ruby/src/spec/models/format_support_spec.rb
@@ -50,6 +50,12 @@
end
end
+ describe 'test attribute "filepaths"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
describe 'test attribute "metadata"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/generic_package_upload_request_spec.rb b/bindings/ruby/src/spec/models/generic_package_upload_request_spec.rb
new file mode 100644
index 00000000..774ab93d
--- /dev/null
+++ b/bindings/ruby/src/spec/models/generic_package_upload_request_spec.rb
@@ -0,0 +1,71 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'spec_helper'
+require 'json'
+require 'date'
+
+# Unit tests for CloudsmithApi::GenericPackageUploadRequest
+# Automatically generated by swagger-codegen (github.com/swagger-api/swagger-codegen)
+# Please update as you see appropriate
+describe 'GenericPackageUploadRequest' do
+ before do
+ # run before each test
+ @instance = CloudsmithApi::GenericPackageUploadRequest.new
+ end
+
+ after do
+ # run after each test
+ end
+
+ describe 'test an instance of GenericPackageUploadRequest' do
+ it 'should create an instance of GenericPackageUploadRequest' do
+ expect(@instance).to be_instance_of(CloudsmithApi::GenericPackageUploadRequest)
+ end
+ end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "package_file"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "republish"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "tags"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "version"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+end
diff --git a/bindings/ruby/src/spec/models/generic_package_upload_spec.rb b/bindings/ruby/src/spec/models/generic_package_upload_spec.rb
new file mode 100644
index 00000000..5e5821ac
--- /dev/null
+++ b/bindings/ruby/src/spec/models/generic_package_upload_spec.rb
@@ -0,0 +1,537 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'spec_helper'
+require 'json'
+require 'date'
+
+# Unit tests for CloudsmithApi::GenericPackageUpload
+# Automatically generated by swagger-codegen (github.com/swagger-api/swagger-codegen)
+# Please update as you see appropriate
+describe 'GenericPackageUpload' do
+ before do
+ # run before each test
+ @instance = CloudsmithApi::GenericPackageUpload.new
+ end
+
+ after do
+ # run after each test
+ end
+
+ describe 'test an instance of GenericPackageUpload' do
+ it 'should create an instance of GenericPackageUpload' do
+ expect(@instance).to be_instance_of(CloudsmithApi::GenericPackageUpload)
+ end
+ end
+ describe 'test attribute "architectures"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "cdn_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "checksum_md5"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "checksum_sha1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "checksum_sha256"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "checksum_sha512"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "dependencies_checksum_md5"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "dependencies_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "description"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "display_name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "distro"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "distro_version"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "downloads"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "epoch"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extension"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "filename"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "files"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "format"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "format_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "freeable_storage"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "fully_qualified_name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "identifier_perm"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "identifiers"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "indexed"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_cancellable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_copyable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_deleteable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_downloadable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_moveable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_quarantinable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_quarantined"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_resyncable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_security_scannable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_sync_awaiting"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_sync_completed"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_sync_failed"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_sync_in_flight"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_sync_in_progress"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "license"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "namespace"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "namespace_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "num_files"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "origin_repository"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "origin_repository_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "package_type"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "policy_violated"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "raw_license"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "release"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "repository"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "repository_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "security_scan_completed_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "security_scan_started_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "security_scan_status"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = Petstore::EnumTest::EnumAttributeValidator.new('String', ["Awaiting Security Scan", "Security Scanning in Progress", "Scan Detected Vulnerabilities", "Scan Detected No Vulnerabilities", "Security Scanning Disabled", "Security Scanning Failed", "Security Scanning Skipped", "Security Scanning Not Supported"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.security_scan_status = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "security_scan_status_updated_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "self_html_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "self_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "signature_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "size"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "slug"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "slug_perm"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "spdx_license"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "stage"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "stage_str"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "stage_updated_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "status"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "status_reason"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "status_str"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "status_updated_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "status_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "subtype"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "summary"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "sync_finished_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "sync_progress"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "tags_automatic"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "tags_immutable"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "type_display"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "uploaded_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "uploader"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "uploader_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "version"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "version_orig"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "vulnerability_scan_results_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+end
diff --git a/bindings/ruby/src/spec/models/generic_upstream_request_patch_spec.rb b/bindings/ruby/src/spec/models/generic_upstream_request_patch_spec.rb
new file mode 100644
index 00000000..ba7a12c5
--- /dev/null
+++ b/bindings/ruby/src/spec/models/generic_upstream_request_patch_spec.rb
@@ -0,0 +1,127 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'spec_helper'
+require 'json'
+require 'date'
+
+# Unit tests for CloudsmithApi::GenericUpstreamRequestPatch
+# Automatically generated by swagger-codegen (github.com/swagger-api/swagger-codegen)
+# Please update as you see appropriate
+describe 'GenericUpstreamRequestPatch' do
+ before do
+ # run before each test
+ @instance = CloudsmithApi::GenericUpstreamRequestPatch.new
+ end
+
+ after do
+ # run after each test
+ end
+
+ describe 'test an instance of GenericUpstreamRequestPatch' do
+ it 'should create an instance of GenericUpstreamRequestPatch' do
+ expect(@instance).to be_instance_of(CloudsmithApi::GenericUpstreamRequestPatch)
+ end
+ end
+ describe 'test attribute "auth_mode"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = Petstore::EnumTest::EnumAttributeValidator.new('String', ["None", "Username and Password", "Token"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.auth_mode = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "auth_secret"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "auth_username"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_header_1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_header_2"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_value_1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_value_2"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_active"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "mode"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = Petstore::EnumTest::EnumAttributeValidator.new('String', ["Proxy Only", "Cache and Proxy"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.mode = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "priority"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "upstream_prefix"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "upstream_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "verify_ssl"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+end
diff --git a/bindings/ruby/src/spec/models/generic_upstream_request_spec.rb b/bindings/ruby/src/spec/models/generic_upstream_request_spec.rb
new file mode 100644
index 00000000..25c839b1
--- /dev/null
+++ b/bindings/ruby/src/spec/models/generic_upstream_request_spec.rb
@@ -0,0 +1,127 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'spec_helper'
+require 'json'
+require 'date'
+
+# Unit tests for CloudsmithApi::GenericUpstreamRequest
+# Automatically generated by swagger-codegen (github.com/swagger-api/swagger-codegen)
+# Please update as you see appropriate
+describe 'GenericUpstreamRequest' do
+ before do
+ # run before each test
+ @instance = CloudsmithApi::GenericUpstreamRequest.new
+ end
+
+ after do
+ # run after each test
+ end
+
+ describe 'test an instance of GenericUpstreamRequest' do
+ it 'should create an instance of GenericUpstreamRequest' do
+ expect(@instance).to be_instance_of(CloudsmithApi::GenericUpstreamRequest)
+ end
+ end
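+
+ # A possible whole-model check, sketched here as a comment. It assumes the
+ # usual swagger-codegen model helpers (`valid?` / `list_invalid_properties`)
+ # are generated for this class and that "name" and "upstream_url" are its
+ # required fields; adjust to the actual generated model as needed:
+ #
+ #   it 'reports missing required fields on a blank instance' do
+ #     expect(@instance.valid?).to be false
+ #     expect(@instance.list_invalid_properties).not_to be_empty
+ #   end
+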
+ describe 'test attribute "auth_mode"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::GenericUpstreamRequest::EnumAttributeValidator.new('String', ["None", "Username and Password", "Token"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.auth_mode = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "auth_secret"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "auth_username"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_header_1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_header_2"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_value_1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_value_2"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_active"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "mode"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::GenericUpstreamRequest::EnumAttributeValidator.new('String', ["Proxy Only", "Cache and Proxy"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.mode = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "priority"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "upstream_prefix"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "upstream_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "verify_ssl"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+end
diff --git a/bindings/ruby/src/spec/models/generic_upstream_spec.rb b/bindings/ruby/src/spec/models/generic_upstream_spec.rb
new file mode 100644
index 00000000..dba64158
--- /dev/null
+++ b/bindings/ruby/src/spec/models/generic_upstream_spec.rb
@@ -0,0 +1,203 @@
+=begin
+#Cloudsmith API (v1)
+
+#The API to the Cloudsmith Service
+
+OpenAPI spec version: v1
+Contact: support@cloudsmith.io
+Generated by: https://github.com/swagger-api/swagger-codegen.git
+Swagger Codegen version: 2.4.50
+
+=end
+
+require 'spec_helper'
+require 'json'
+require 'date'
+
+# Unit tests for CloudsmithApi::GenericUpstream
+# Automatically generated by swagger-codegen (github.com/swagger-api/swagger-codegen)
+# Please update as you see appropriate
+describe 'GenericUpstream' do
+ before do
+ # run before each test
+ @instance = CloudsmithApi::GenericUpstream.new
+ end
+
+ after do
+ # run after each test
+ end
+
+ describe 'test an instance of GenericUpstream' do
+ it 'should create an instance of GenericUpstream' do
+ expect(@instance).to be_instance_of(CloudsmithApi::GenericUpstream)
+ end
+ end
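+
+ # Several of the fields below ("created_at", "disable_reason_text",
+ # "slug_perm", ...) are populated by the server, so a freshly constructed
+ # instance carries no value for them. A small, hedged smoke test (the nil
+ # default is an assumption about the generated initializer):
+ #
+ #   it 'leaves server-populated fields unset on a new instance' do
+ #     expect(@instance.slug_perm).to be_nil
+ #   end
+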
+ describe 'test attribute "auth_mode"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::GenericUpstream::EnumAttributeValidator.new('String', ["None", "Username and Password", "Token"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.auth_mode = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "auth_secret"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "auth_username"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "available"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "can_reindex"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "created_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "disable_reason"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::GenericUpstream::EnumAttributeValidator.new('String', ["N/A", "Upstream points to its own repository", "Missing upstream source", "Upstream was disabled by request of user"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.disable_reason = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "disable_reason_text"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_header_1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_header_2"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_value_1"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "extra_value_2"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "has_failed_signature_verification"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "index_package_count"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "index_status"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "is_active"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "last_indexed"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "mode"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::GenericUpstream::EnumAttributeValidator.new('String', ["Proxy Only", "Cache and Proxy"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.mode = value }.not_to raise_error
+ # end
+ end
+ end
+
+ describe 'test attribute "name"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "pending_validation"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "priority"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "slug_perm"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "updated_at"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "upstream_prefix"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "upstream_url"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+ describe 'test attribute "verify_ssl"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
+end
diff --git a/bindings/ruby/src/spec/models/maven_upstream_request_patch_spec.rb b/bindings/ruby/src/spec/models/maven_upstream_request_patch_spec.rb
index 86f0d2b4..b46e0545 100644
--- a/bindings/ruby/src/spec/models/maven_upstream_request_patch_spec.rb
+++ b/bindings/ruby/src/spec/models/maven_upstream_request_patch_spec.rb
@@ -128,6 +128,16 @@
end
end
+ describe 'test attribute "trust_level"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::MavenUpstreamRequestPatch::EnumAttributeValidator.new('String', ["Trusted", "Untrusted"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.trust_level = value }.not_to raise_error
+ # end
+ end
+ end
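+
+ # Hedged sketch, not part of the generated stub: if the model uses the
+ # usual swagger-codegen enum writer for "trust_level", out-of-range values
+ # are rejected with an ArgumentError.
+ #
+ #   it 'rejects values outside the allowed enum' do
+ #     expect { @instance.trust_level = 'Unknown' }.to raise_error(ArgumentError)
+ #   end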
+
describe 'test attribute "upstream_url"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/maven_upstream_request_spec.rb b/bindings/ruby/src/spec/models/maven_upstream_request_spec.rb
index 8bf4fd17..617b37ca 100644
--- a/bindings/ruby/src/spec/models/maven_upstream_request_spec.rb
+++ b/bindings/ruby/src/spec/models/maven_upstream_request_spec.rb
@@ -128,6 +128,16 @@
end
end
+ describe 'test attribute "trust_level"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::MavenUpstreamRequest::EnumAttributeValidator.new('String', ["Trusted", "Untrusted"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.trust_level = value }.not_to raise_error
+ # end
+ end
+ end
+
describe 'test attribute "upstream_url"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/maven_upstream_spec.rb b/bindings/ruby/src/spec/models/maven_upstream_spec.rb
index 7cab0afb..ef972baa 100644
--- a/bindings/ruby/src/spec/models/maven_upstream_spec.rb
+++ b/bindings/ruby/src/spec/models/maven_upstream_spec.rb
@@ -204,6 +204,16 @@
end
end
+ describe 'test attribute "trust_level"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ # validator = CloudsmithApi::MavenUpstream::EnumAttributeValidator.new('String', ["Trusted", "Untrusted"])
+ # validator.allowable_values.each do |value|
+ # expect { @instance.trust_level = value }.not_to raise_error
+ # end
+ end
+ end
+
describe 'test attribute "updated_at"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/package_copy_spec.rb b/bindings/ruby/src/spec/models/package_copy_spec.rb
index e4474940..746e7178 100644
--- a/bindings/ruby/src/spec/models/package_copy_spec.rb
+++ b/bindings/ruby/src/spec/models/package_copy_spec.rb
@@ -128,6 +128,12 @@
end
end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
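+
+ # Sketch for the newly added "filepath" attribute (value is illustrative,
+ # assuming a plain accessor is generated for it):
+ #
+ #   it 'round-trips an assigned filepath' do
+ #     @instance.filepath = 'example/path/file.txt'
+ #     expect(@instance.filepath).to eq('example/path/file.txt')
+ #   end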
+
describe 'test attribute "files"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/package_move_spec.rb b/bindings/ruby/src/spec/models/package_move_spec.rb
index 0d41f383..ae6a62c1 100644
--- a/bindings/ruby/src/spec/models/package_move_spec.rb
+++ b/bindings/ruby/src/spec/models/package_move_spec.rb
@@ -128,6 +128,12 @@
end
end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
describe 'test attribute "files"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/package_quarantine_spec.rb b/bindings/ruby/src/spec/models/package_quarantine_spec.rb
index f191f592..fb6efedb 100644
--- a/bindings/ruby/src/spec/models/package_quarantine_spec.rb
+++ b/bindings/ruby/src/spec/models/package_quarantine_spec.rb
@@ -128,6 +128,12 @@
end
end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
describe 'test attribute "files"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/package_resync_spec.rb b/bindings/ruby/src/spec/models/package_resync_spec.rb
index 52f5f099..87493413 100644
--- a/bindings/ruby/src/spec/models/package_resync_spec.rb
+++ b/bindings/ruby/src/spec/models/package_resync_spec.rb
@@ -128,6 +128,12 @@
end
end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
describe 'test attribute "files"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/package_spec.rb b/bindings/ruby/src/spec/models/package_spec.rb
index b5c36137..2333b8e4 100644
--- a/bindings/ruby/src/spec/models/package_spec.rb
+++ b/bindings/ruby/src/spec/models/package_spec.rb
@@ -128,6 +128,12 @@
end
end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
describe 'test attribute "files"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
diff --git a/bindings/ruby/src/spec/models/package_tag_spec.rb b/bindings/ruby/src/spec/models/package_tag_spec.rb
index 7efdb80e..ec41d005 100644
--- a/bindings/ruby/src/spec/models/package_tag_spec.rb
+++ b/bindings/ruby/src/spec/models/package_tag_spec.rb
@@ -128,6 +128,12 @@
end
end
+ describe 'test attribute "filepath"' do
+ it 'should work' do
+ # assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers
+ end
+ end
+
describe 'test attribute "files"' do
it 'should work' do
# assertion here. ref: https://www.relishapp.com/rspec/rspec-expectations/docs/built-in-matchers