mlmodel

package
v0.1.0-beta.11
Warning

This package is not in the latest version of its module.

Published: Jan 30, 2025 License: MIT Imports: 15 Imported by: 0

Documentation


Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type ClientFactory

type ClientFactory struct {
	// contains filtered or unexported fields
}

ClientFactory is a client factory used to create any client in this module. Don't use this type directly, use NewClientFactory instead.

func NewClientFactory

func NewClientFactory(credential azcore.TokenCredential, endpoint *string, options *azcore.ClientOptions) (*ClientFactory, error)

NewClientFactory creates a new instance of ClientFactory with the specified values. The parameter values will be propagated to any client created from this factory.

  • credential - used to authorize requests. Usually a credential from azidentity.
  • endpoint - pass nil to accept the default values.
  • options - pass nil to accept the default values.

func NewClientFactoryWithClient

func NewClientFactoryWithClient(client fabric.Client) *ClientFactory

NewClientFactoryWithClient creates a new instance of ClientFactory with a shareable Client. The Client will be propagated to any client created from this factory.

  • client - Client created in the containing module: github.com/microsoft/fabric-sdk-go/fabric

func (*ClientFactory) NewItemsClient

func (c *ClientFactory) NewItemsClient() *ItemsClient

NewItemsClient creates a new instance of ItemsClient.

type CreateMLModelRequest

type CreateMLModelRequest struct {
	// REQUIRED; The machine learning model display name. The display name must follow naming rules according to item type.
	DisplayName *string

	// The machine learning model description. Maximum length is 256 characters.
	Description *string
}

CreateMLModelRequest - Create machine learning model request payload.

func (CreateMLModelRequest) MarshalJSON

func (c CreateMLModelRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type CreateMLModelRequest.

func (*CreateMLModelRequest) UnmarshalJSON

func (c *CreateMLModelRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type CreateMLModelRequest.

type ItemType

type ItemType string

ItemType - The type of the item. Additional item types may be added over time.

const (
	// ItemTypeDashboard - PowerBI dashboard.
	ItemTypeDashboard ItemType = "Dashboard"
	// ItemTypeDataPipeline - A data pipeline.
	ItemTypeDataPipeline ItemType = "DataPipeline"
	// ItemTypeDatamart - PowerBI datamart.
	ItemTypeDatamart ItemType = "Datamart"
	// ItemTypeEnvironment - An environment.
	ItemTypeEnvironment ItemType = "Environment"
	// ItemTypeEventhouse - An eventhouse.
	ItemTypeEventhouse ItemType = "Eventhouse"
	// ItemTypeEventstream - An eventstream.
	ItemTypeEventstream ItemType = "Eventstream"
	// ItemTypeGraphQLAPI - An API for GraphQL item.
	ItemTypeGraphQLAPI ItemType = "GraphQLApi"
	// ItemTypeKQLDashboard - A KQL dashboard.
	ItemTypeKQLDashboard ItemType = "KQLDashboard"
	// ItemTypeKQLDatabase - A KQL database.
	ItemTypeKQLDatabase ItemType = "KQLDatabase"
	// ItemTypeKQLQueryset - A KQL queryset.
	ItemTypeKQLQueryset ItemType = "KQLQueryset"
	// ItemTypeLakehouse - A lakehouse.
	ItemTypeLakehouse ItemType = "Lakehouse"
	// ItemTypeMLExperiment - A machine learning experiment.
	ItemTypeMLExperiment ItemType = "MLExperiment"
	// ItemTypeMLModel - A machine learning model.
	ItemTypeMLModel ItemType = "MLModel"
	// ItemTypeMirroredDatabase - A mirrored database.
	ItemTypeMirroredDatabase ItemType = "MirroredDatabase"
	// ItemTypeMirroredWarehouse - A mirrored warehouse.
	ItemTypeMirroredWarehouse ItemType = "MirroredWarehouse"
	// ItemTypeNotebook - A notebook.
	ItemTypeNotebook ItemType = "Notebook"
	// ItemTypePaginatedReport - PowerBI paginated report.
	ItemTypePaginatedReport ItemType = "PaginatedReport"
	// ItemTypeReflex - A Reflex.
	ItemTypeReflex ItemType = "Reflex"
	// ItemTypeReport - PowerBI report.
	ItemTypeReport ItemType = "Report"
	// ItemTypeSQLEndpoint - An SQL endpoint.
	ItemTypeSQLEndpoint ItemType = "SQLEndpoint"
	// ItemTypeSemanticModel - PowerBI semantic model.
	ItemTypeSemanticModel ItemType = "SemanticModel"
	// ItemTypeSparkJobDefinition - A spark job definition.
	ItemTypeSparkJobDefinition ItemType = "SparkJobDefinition"
	// ItemTypeWarehouse - A warehouse.
	ItemTypeWarehouse ItemType = "Warehouse"
)

func PossibleItemTypeValues

func PossibleItemTypeValues() []ItemType

PossibleItemTypeValues returns the possible values for the ItemType const type.
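The function follows the common Azure SDK pattern of returning a fixed slice of the declared constants. A sketch of that pattern with a local stand-in type (not the SDK's `ItemType`):

```go
package main

import "fmt"

// itemType is a local stand-in mirroring the string-enum pattern of
// mlmodel.ItemType; illustration only.
type itemType string

const (
	itemTypeLakehouse itemType = "Lakehouse"
	itemTypeMLModel   itemType = "MLModel"
)

// possibleItemTypeValues mirrors the shape of mlmodel.PossibleItemTypeValues:
// it simply enumerates the known constants.
func possibleItemTypeValues() []itemType {
	return []itemType{itemTypeLakehouse, itemTypeMLModel}
}

func main() {
	for _, v := range possibleItemTypeValues() {
		fmt.Println(v)
	}
}
```

Because `ItemType` is an open string type ("additional item types may be added over time"), callers should treat values outside this list as valid but unknown rather than as errors.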

type ItemsClient

type ItemsClient struct {
	// contains filtered or unexported fields
}

ItemsClient contains the methods for the Items group. Don't use this type directly, use a constructor function instead.

func (*ItemsClient) BeginCreateMLModel

func (client *ItemsClient) BeginCreateMLModel(ctx context.Context, workspaceID string, createMLModelRequest CreateMLModelRequest, options *ItemsClientBeginCreateMLModelOptions) (*runtime.Poller[ItemsClientCreateMLModelResponse], error)

BeginCreateMLModel - This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation]. This API does not support creating a machine learning model with a definition.

PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES MLModel.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS

  • To create a machine learning model the workspace must be on a supported Fabric capacity. For more information see: Microsoft Fabric license types [/fabric/enterprise/licenses#microsoft-fabric-license-types].

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • createMLModelRequest - Create item request payload.
  • options - ItemsClientBeginCreateMLModelOptions contains the optional parameters for the ItemsClient.BeginCreateMLModel method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	poller, err := clientFactory.NewItemsClient().BeginCreateMLModel(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", mlmodel.CreateMLModelRequest{
		Description: to.Ptr("A machine learning model description."),
		DisplayName: to.Ptr("MLModel_1"),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	_, err = poller.PollUntilDone(ctx, nil)
	if err != nil {
		log.Fatalf("failed to pull the result: %v", err)
	}
}
Output:

func (*ItemsClient) CreateMLModel

func (client *ItemsClient) CreateMLModel(ctx context.Context, workspaceID string, createMLModelRequest CreateMLModelRequest, options *ItemsClientBeginCreateMLModelOptions) (ItemsClientCreateMLModelResponse, error)

CreateMLModel - returns ItemsClientCreateMLModelResponse in sync mode. This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation].

This API does not support creating a machine learning model with a definition.

PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES MLModel.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS

  • To create a machine learning model the workspace must be on a supported Fabric capacity. For more information see: Microsoft Fabric license types [/fabric/enterprise/licenses#microsoft-fabric-license-types].

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

Generated from API version v1

  • workspaceID - The workspace ID.
  • createMLModelRequest - Create item request payload.
  • options - ItemsClientBeginCreateMLModelOptions contains the optional parameters for the ItemsClient.BeginCreateMLModel method.
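The sync variant has no generated example on this page. The sketch below adapts the BeginCreateMLModel example above to the blocking call, reusing the same fake workspace ID; it is illustrative, not generated from an example definition.

```go
package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	// Unlike BeginCreateMLModel, CreateMLModel blocks until the
	// long-running operation completes and returns the final response.
	res, err := clientFactory.NewItemsClient().CreateMLModel(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", mlmodel.CreateMLModelRequest{
		Description: to.Ptr("A machine learning model description."),
		DisplayName: to.Ptr("MLModel_1"),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	_ = res
}
```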

func (*ItemsClient) DeleteMLModel

func (client *ItemsClient) DeleteMLModel(ctx context.Context, workspaceID string, mlModelID string, options *ItemsClientDeleteMLModelOptions) (ItemsClientDeleteMLModelResponse, error)

DeleteMLModel - PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES MLModel.ReadWrite.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • mlModelID - The machine learning model ID.
  • options - ItemsClientDeleteMLModelOptions contains the optional parameters for the ItemsClient.DeleteMLModel method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	_, err = clientFactory.NewItemsClient().DeleteMLModel(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
}
Output:

func (*ItemsClient) GetMLModel

func (client *ItemsClient) GetMLModel(ctx context.Context, workspaceID string, mlModelID string, options *ItemsClientGetMLModelOptions) (ItemsClientGetMLModelResponse, error)

GetMLModel - PERMISSIONS The caller must have viewer or higher workspace role.

REQUIRED DELEGATED SCOPES MLModel.Read.All or MLModel.ReadWrite.All or Item.Read.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • mlModelID - The machine learning model ID.
  • options - ItemsClientGetMLModelOptions contains the optional parameters for the ItemsClient.GetMLModel method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	res, err := clientFactory.NewItemsClient().GetMLModel(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	// You could use the response here. We use the blank identifier just for demo purposes.
	_ = res
	// If the HTTP response code is 200 as defined in the example definition, your response structure would look as follows. Note that all the values in the output are fake, shown just for demo purposes.
	// res.MLModel = mlmodel.MLModel{
	// 	Type: to.Ptr(mlmodel.ItemTypeMLModel),
	// 	Description: to.Ptr("A machine learning model description."),
	// 	DisplayName: to.Ptr("MLModel_1"),
	// 	ID: to.Ptr("5b218778-e7a5-4d73-8187-f10824047715"),
	// 	WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
	// }
}
Output:

func (*ItemsClient) ListMLModels

func (client *ItemsClient) ListMLModels(ctx context.Context, workspaceID string, options *ItemsClientListMLModelsOptions) ([]MLModel, error)

ListMLModels - returns an array of MLModel from all pages. This API supports pagination [/rest/api/fabric/articles/pagination].

PERMISSIONS The caller must have viewer or higher workspace role.

REQUIRED DELEGATED SCOPES Workspace.Read.All or Workspace.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

Generated from API version v1

  • workspaceID - The workspace ID.
  • options - ItemsClientListMLModelsOptions contains the optional parameters for the ItemsClient.NewListMLModelsPager method.
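ListMLModels has no generated example on this page. The sketch below adapts the NewListMLModelsPager example below to the all-pages convenience call, reusing the same fake workspace ID; it is illustrative, not generated from an example definition.

```go
package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	// ListMLModels drains every page internally and returns one slice,
	// so no pager loop is needed.
	models, err := clientFactory.NewItemsClient().ListMLModels(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	for _, m := range models {
		_ = m
	}
}
```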

func (*ItemsClient) NewListMLModelsPager

func (client *ItemsClient) NewListMLModelsPager(workspaceID string, options *ItemsClientListMLModelsOptions) *runtime.Pager[ItemsClientListMLModelsResponse]

NewListMLModelsPager - This API supports pagination [/rest/api/fabric/articles/pagination].

PERMISSIONS The caller must have viewer or higher workspace role.

REQUIRED DELEGATED SCOPES Workspace.Read.All or Workspace.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

Generated from API version v1

  • workspaceID - The workspace ID.
  • options - ItemsClientListMLModelsOptions contains the optional parameters for the ItemsClient.NewListMLModelsPager method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	pager := clientFactory.NewItemsClient().NewListMLModelsPager("cfafbeb1-8037-4d0c-896e-a46fb27ff229", &mlmodel.ItemsClientListMLModelsOptions{ContinuationToken: nil})
	for pager.More() {
		page, err := pager.NextPage(ctx)
		if err != nil {
			log.Fatalf("failed to advance page: %v", err)
		}
		for _, v := range page.Value {
			// You could use v here. We use the blank identifier just for demo purposes.
			_ = v
		}
		// If the HTTP response code is 200 as defined in the example definition, your page structure would look as follows. Note that all the values in the output are fake, shown just for demo purposes.
		// page.MLModels = mlmodel.MLModels{
		// 	Value: []mlmodel.MLModel{
		// 		{
		// 			Type: to.Ptr(mlmodel.ItemTypeMLModel),
		// 			Description: to.Ptr("A machine learning model description."),
		// 			DisplayName: to.Ptr("MLModel_1"),
		// 			ID: to.Ptr("3546052c-ae64-4526-b1a8-52af7761426f"),
		// 			WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
		// 		},
		// 		{
		// 			Type: to.Ptr(mlmodel.ItemTypeMLModel),
		// 			Description: to.Ptr("A machine learning model description."),
		// 			DisplayName: to.Ptr("MLModel_2"),
		// 			ID: to.Ptr("f2a6411d-c204-47d3-b992-5338be0d2cee"),
		// 			WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
		// 	}},
		// }
	}
}
Output:

func (*ItemsClient) UpdateMLModel

func (client *ItemsClient) UpdateMLModel(ctx context.Context, workspaceID string, mlModelID string, updateMLModelRequest UpdateMLModelRequest, options *ItemsClientUpdateMLModelOptions) (ItemsClientUpdateMLModelResponse, error)

UpdateMLModel - PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES MLModel.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS

  • MLModel display name cannot be changed.

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • mlModelID - The machine learning model ID.
  • updateMLModelRequest - Update machine learning model request payload.
  • options - ItemsClientUpdateMLModelOptions contains the optional parameters for the ItemsClient.UpdateMLModel method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/mlmodel"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := mlmodel.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	res, err := clientFactory.NewItemsClient().UpdateMLModel(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", mlmodel.UpdateMLModelRequest{
		Description: to.Ptr("A new description for machine learning model."),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	// You could use the response here. We use the blank identifier just for demo purposes.
	_ = res
	// If the HTTP response code is 200 as defined in the example definition, your response structure would look as follows. Note that all the values in the output are fake, shown just for demo purposes.
	// res.MLModel = mlmodel.MLModel{
	// 	Type: to.Ptr(mlmodel.ItemTypeMLModel),
	// 	Description: to.Ptr("A new description for machine learning model."),
	// 	DisplayName: to.Ptr("MLModel's name"),
	// 	ID: to.Ptr("5b218778-e7a5-4d73-8187-f10824047715"),
	// 	WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
	// }
}
Output:

type ItemsClientBeginCreateMLModelOptions

type ItemsClientBeginCreateMLModelOptions struct {
	// Resumes the long-running operation from the provided token.
	ResumeToken string
}

ItemsClientBeginCreateMLModelOptions contains the optional parameters for the ItemsClient.BeginCreateMLModel method.

type ItemsClientCreateMLModelResponse

type ItemsClientCreateMLModelResponse struct {
	// A machine learning model object.
	MLModel
}

ItemsClientCreateMLModelResponse contains the response from method ItemsClient.BeginCreateMLModel.

type ItemsClientDeleteMLModelOptions

type ItemsClientDeleteMLModelOptions struct {
}

ItemsClientDeleteMLModelOptions contains the optional parameters for the ItemsClient.DeleteMLModel method.

type ItemsClientDeleteMLModelResponse

type ItemsClientDeleteMLModelResponse struct {
}

ItemsClientDeleteMLModelResponse contains the response from method ItemsClient.DeleteMLModel.

type ItemsClientGetMLModelOptions

type ItemsClientGetMLModelOptions struct {
}

ItemsClientGetMLModelOptions contains the optional parameters for the ItemsClient.GetMLModel method.

type ItemsClientGetMLModelResponse

type ItemsClientGetMLModelResponse struct {
	// A machine learning model object.
	MLModel
}

ItemsClientGetMLModelResponse contains the response from method ItemsClient.GetMLModel.

type ItemsClientListMLModelsOptions

type ItemsClientListMLModelsOptions struct {
	// A token for retrieving the next page of results.
	ContinuationToken *string
}

ItemsClientListMLModelsOptions contains the optional parameters for the ItemsClient.NewListMLModelsPager method.

type ItemsClientListMLModelsResponse

type ItemsClientListMLModelsResponse struct {
	// A list of machine learning models.
	MLModels
}

ItemsClientListMLModelsResponse contains the response from method ItemsClient.NewListMLModelsPager.

type ItemsClientUpdateMLModelOptions

type ItemsClientUpdateMLModelOptions struct {
}

ItemsClientUpdateMLModelOptions contains the optional parameters for the ItemsClient.UpdateMLModel method.

type ItemsClientUpdateMLModelResponse

type ItemsClientUpdateMLModelResponse struct {
	// A machine learning model object.
	MLModel
}

ItemsClientUpdateMLModelResponse contains the response from method ItemsClient.UpdateMLModel.

type MLModel

type MLModel struct {
	// REQUIRED; The item type.
	Type *ItemType

	// The item description.
	Description *string

	// The item display name.
	DisplayName *string

	// READ-ONLY; The item ID.
	ID *string

	// READ-ONLY; The workspace ID.
	WorkspaceID *string
}

MLModel - A machine learning model object.

func (MLModel) MarshalJSON

func (m MLModel) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type MLModel.

func (*MLModel) UnmarshalJSON

func (m *MLModel) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type MLModel.

type MLModels

type MLModels struct {
	// REQUIRED; A list of machine learning models.
	Value []MLModel

	// The token for the next result set batch. If there are no more records, it's removed from the response.
	ContinuationToken *string

	// The URI of the next result set batch. If there are no more records, it's removed from the response.
	ContinuationURI *string
}

MLModels - A list of machine learning models.
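The continuation fields drive client-side paging: a caller keeps requesting with the returned token until ContinuationToken is absent, which is exactly what NewListMLModelsPager automates. A minimal stdlib-only simulation of that loop (local types and a fake fetch function, no SDK calls):

```go
package main

import "fmt"

// page is a local stand-in for mlmodel.MLModels, illustration only.
type page struct {
	Value             []string
	ContinuationToken *string
}

// fetch simulates a paged API keyed by continuation token: the first call
// (nil token) returns a token for the next batch; the last page omits it.
func fetch(token *string) page {
	if token == nil {
		next := "token-2"
		return page{Value: []string{"MLModel_1"}, ContinuationToken: &next}
	}
	// No ContinuationToken on the last page means there are no more records.
	return page{Value: []string{"MLModel_2"}}
}

func main() {
	var all []string
	var token *string
	for {
		p := fetch(token)
		all = append(all, p.Value...)
		if p.ContinuationToken == nil {
			break
		}
		token = p.ContinuationToken
	}
	fmt.Println(all) // [MLModel_1 MLModel_2]
}
```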

func (MLModels) MarshalJSON

func (m MLModels) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type MLModels.

func (*MLModels) UnmarshalJSON

func (m *MLModels) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type MLModels.

type UpdateMLModelRequest

type UpdateMLModelRequest struct {
	// The machine learning model description. Maximum length is 256 characters.
	Description *string
}

UpdateMLModelRequest - Update machine learning model request.

func (UpdateMLModelRequest) MarshalJSON

func (u UpdateMLModelRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type UpdateMLModelRequest.

func (*UpdateMLModelRequest) UnmarshalJSON

func (u *UpdateMLModelRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type UpdateMLModelRequest.
