sparkjobdefinition

package
v0.1.0-beta.11
Published: Jan 30, 2025 License: MIT Imports: 16 Imported by: 0

Documentation

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type BackgroundJobsClient

type BackgroundJobsClient struct {
	// contains filtered or unexported fields
}

BackgroundJobsClient contains the methods for the BackgroundJobs group. Don't use this type directly, use a constructor function instead.

func (*BackgroundJobsClient) RunOnDemandSparkJobDefinition

func (client *BackgroundJobsClient) RunOnDemandSparkJobDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, jobType string, options *BackgroundJobsClientRunOnDemandSparkJobDefinitionOptions) (BackgroundJobsClientRunOnDemandSparkJobDefinitionResponse, error)

RunOnDemandSparkJobDefinition - REQUIRED DELEGATED SCOPES SparkJobDefinition.Execute.All or Item.Execute.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | No |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The Spark job definition item ID.
  • jobType - The job type. The supported job type for a Spark job definition is sparkjob.
  • options - BackgroundJobsClientRunOnDemandSparkJobDefinitionOptions contains the optional parameters for the BackgroundJobsClient.RunOnDemandSparkJobDefinition method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	_, err = clientFactory.NewBackgroundJobsClient().RunOnDemandSparkJobDefinition(ctx, "4b218778-e7a5-4d73-8187-f10824047715", "431e8d7b-4a95-4c02-8ccd-6faef5ba1bd7", "sparkjob", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
}
Output:

type BackgroundJobsClientRunOnDemandSparkJobDefinitionOptions

type BackgroundJobsClientRunOnDemandSparkJobDefinitionOptions struct {
}

BackgroundJobsClientRunOnDemandSparkJobDefinitionOptions contains the optional parameters for the BackgroundJobsClient.RunOnDemandSparkJobDefinition method.

type BackgroundJobsClientRunOnDemandSparkJobDefinitionResponse

type BackgroundJobsClientRunOnDemandSparkJobDefinitionResponse struct {
	// Location contains the information returned from the Location header response.
	Location *string

	// RetryAfter contains the information returned from the Retry-After header response.
	RetryAfter *int32
}

BackgroundJobsClientRunOnDemandSparkJobDefinitionResponse contains the response from method BackgroundJobsClient.RunOnDemandSparkJobDefinition.

type ClientFactory

type ClientFactory struct {
	// contains filtered or unexported fields
}

ClientFactory is a client factory used to create any client in this module. Don't use this type directly, use NewClientFactory instead.

func NewClientFactory

func NewClientFactory(credential azcore.TokenCredential, endpoint *string, options *azcore.ClientOptions) (*ClientFactory, error)

NewClientFactory creates a new instance of ClientFactory with the specified values. The parameter values will be propagated to any client created from this factory.

  • credential - used to authorize requests. Usually a credential from azidentity.
  • endpoint - pass nil to accept the default values.
  • options - pass nil to accept the default values.
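
The generated examples below all pass nil for endpoint and options. As a minimal sketch, a custom endpoint can be supplied instead; the URL shown here is a placeholder assumption, not necessarily the service default.

package main

import (
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	// Assumption: the endpoint value below is illustrative; pass nil to accept the default.
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, to.Ptr("https://api.fabric.microsoft.com/v1"), nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	_ = clientFactory
}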

func NewClientFactoryWithClient

func NewClientFactoryWithClient(client fabric.Client) *ClientFactory

NewClientFactoryWithClient creates a new instance of ClientFactory with a sharable Client. The Client will be propagated to any client created from this factory.

  • client - Client created in the containing module: github.com/microsoft/fabric-sdk-go/fabric
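
No generated example accompanies this constructor. The following sketch only exercises the documented signature; how the shared fabric.Client is constructed is out of scope for this page, so the zero value below is a stand-in, not working setup.

package main

import (
	"github.com/microsoft/fabric-sdk-go/fabric"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

// buildFactory shows the shape of factory creation from an existing shared Client.
func buildFactory(client fabric.Client) *sparkjobdefinition.ClientFactory {
	// The shared Client is propagated to every client created from this factory.
	return sparkjobdefinition.NewClientFactoryWithClient(client)
}

func main() {
	var shared fabric.Client // stand-in; obtain a real Client from the containing module
	factory := buildFactory(shared)
	_ = factory.NewBackgroundJobsClient()
}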

func (*ClientFactory) NewBackgroundJobsClient

func (c *ClientFactory) NewBackgroundJobsClient() *BackgroundJobsClient

NewBackgroundJobsClient creates a new instance of BackgroundJobsClient.

func (*ClientFactory) NewItemsClient

func (c *ClientFactory) NewItemsClient() *ItemsClient

NewItemsClient creates a new instance of ItemsClient.

type CreateSparkJobDefinitionRequest

type CreateSparkJobDefinitionRequest struct {
	// REQUIRED; The spark job definition display name. The display name must follow naming rules according to item type.
	DisplayName *string

	// The spark job definition public definition.
	Definition *PublicDefinition

	// The spark job definition description. Maximum length is 256 characters.
	Description *string
}

CreateSparkJobDefinitionRequest - Create spark job definition request payload.

func (CreateSparkJobDefinitionRequest) MarshalJSON

func (c CreateSparkJobDefinitionRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type CreateSparkJobDefinitionRequest.

func (*CreateSparkJobDefinitionRequest) UnmarshalJSON

func (c *CreateSparkJobDefinitionRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type CreateSparkJobDefinitionRequest.

type ItemType

type ItemType string

ItemType - The type of the item. Additional item types may be added over time.

const (
	// ItemTypeDashboard - PowerBI dashboard.
	ItemTypeDashboard ItemType = "Dashboard"
	// ItemTypeDataPipeline - A data pipeline.
	ItemTypeDataPipeline ItemType = "DataPipeline"
	// ItemTypeDatamart - PowerBI datamart.
	ItemTypeDatamart ItemType = "Datamart"
	// ItemTypeEnvironment - An environment.
	ItemTypeEnvironment ItemType = "Environment"
	// ItemTypeEventhouse - An eventhouse.
	ItemTypeEventhouse ItemType = "Eventhouse"
	// ItemTypeEventstream - An eventstream.
	ItemTypeEventstream ItemType = "Eventstream"
	// ItemTypeGraphQLAPI - An API for GraphQL item.
	ItemTypeGraphQLAPI ItemType = "GraphQLApi"
	// ItemTypeKQLDashboard - A KQL dashboard.
	ItemTypeKQLDashboard ItemType = "KQLDashboard"
	// ItemTypeKQLDatabase - A KQL database.
	ItemTypeKQLDatabase ItemType = "KQLDatabase"
	// ItemTypeKQLQueryset - A KQL queryset.
	ItemTypeKQLQueryset ItemType = "KQLQueryset"
	// ItemTypeLakehouse - A lakehouse.
	ItemTypeLakehouse ItemType = "Lakehouse"
	// ItemTypeMLExperiment - A machine learning experiment.
	ItemTypeMLExperiment ItemType = "MLExperiment"
	// ItemTypeMLModel - A machine learning model.
	ItemTypeMLModel ItemType = "MLModel"
	// ItemTypeMirroredDatabase - A mirrored database.
	ItemTypeMirroredDatabase ItemType = "MirroredDatabase"
	// ItemTypeMirroredWarehouse - A mirrored warehouse.
	ItemTypeMirroredWarehouse ItemType = "MirroredWarehouse"
	// ItemTypeNotebook - A notebook.
	ItemTypeNotebook ItemType = "Notebook"
	// ItemTypePaginatedReport - PowerBI paginated report.
	ItemTypePaginatedReport ItemType = "PaginatedReport"
	// ItemTypeReflex - A Reflex.
	ItemTypeReflex ItemType = "Reflex"
	// ItemTypeReport - PowerBI report.
	ItemTypeReport ItemType = "Report"
	// ItemTypeSQLEndpoint - An SQL endpoint.
	ItemTypeSQLEndpoint ItemType = "SQLEndpoint"
	// ItemTypeSemanticModel - PowerBI semantic model.
	ItemTypeSemanticModel ItemType = "SemanticModel"
	// ItemTypeSparkJobDefinition - A spark job definition.
	ItemTypeSparkJobDefinition ItemType = "SparkJobDefinition"
	// ItemTypeWarehouse - A warehouse.
	ItemTypeWarehouse ItemType = "Warehouse"
)

func PossibleItemTypeValues

func PossibleItemTypeValues() []ItemType

PossibleItemTypeValues returns the possible values for the ItemType const type.

type ItemsClient

type ItemsClient struct {
	// contains filtered or unexported fields
}

ItemsClient contains the methods for the Items group. Don't use this type directly, use a constructor function instead.

func (*ItemsClient) BeginCreateSparkJobDefinition

func (client *ItemsClient) BeginCreateSparkJobDefinition(ctx context.Context, workspaceID string, createSparkJobDefinitionRequest CreateSparkJobDefinitionRequest, options *ItemsClientBeginCreateSparkJobDefinitionOptions) (*runtime.Poller[ItemsClientCreateSparkJobDefinitionResponse], error)

BeginCreateSparkJobDefinition - This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation]. To create a spark job definition with a public definition, refer to the Spark job definition [/rest/api/fabric/articles/item-management/definitions/spark-job-definition] article.

PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS

  • To create a spark job definition the workspace must be on a supported Fabric capacity. For more information see: Microsoft Fabric license types [/fabric/enterprise/licenses#microsoft-fabric-license-types].

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • createSparkJobDefinitionRequest - Create item request payload.
  • options - ItemsClientBeginCreateSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.BeginCreateSparkJobDefinition method.
Example (CreateASparkJobDefinitionExample)

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	poller, err := clientFactory.NewItemsClient().BeginCreateSparkJobDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", sparkjobdefinition.CreateSparkJobDefinitionRequest{
		Description: to.Ptr("A spark job definition description."),
		DisplayName: to.Ptr("SparkJobDefinition 1"),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	_, err = poller.PollUntilDone(ctx, nil)
	if err != nil {
		log.Fatalf("failed to pull the result: %v", err)
	}
}
Output:

Example (CreateASparkJobDefinitionWithPublicDefinitionExample)

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	poller, err := clientFactory.NewItemsClient().BeginCreateSparkJobDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", sparkjobdefinition.CreateSparkJobDefinitionRequest{
		Description: to.Ptr("A spark job definition description."),
		Definition: &sparkjobdefinition.PublicDefinition{
			Format: to.Ptr("SparkJobDefinitionV1"),
			Parts: []sparkjobdefinition.PublicDefinitionPart{
				{
					Path:        to.Ptr("SparkJobDefinitionV1.json"),
					Payload:     to.Ptr("eyJleGVjdXRhYmxlRm..OWRmNDhhY2ZmZTgifQ=="),
					PayloadType: to.Ptr(sparkjobdefinition.PayloadTypeInlineBase64),
				},
				{
					Path:        to.Ptr(".platform"),
					Payload:     to.Ptr("ZG90UGxhdGZvcm1CYXNlNjRTdHJpbmc="),
					PayloadType: to.Ptr(sparkjobdefinition.PayloadTypeInlineBase64),
				}},
		},
		DisplayName: to.Ptr("SparkJobDefinition 1"),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	_, err = poller.PollUntilDone(ctx, nil)
	if err != nil {
		log.Fatalf("failed to pull the result: %v", err)
	}
}
Output:

func (*ItemsClient) BeginGetSparkJobDefinitionDefinition

func (client *ItemsClient) BeginGetSparkJobDefinitionDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, options *ItemsClientBeginGetSparkJobDefinitionDefinitionOptions) (*runtime.Poller[ItemsClientGetSparkJobDefinitionDefinitionResponse], error)

BeginGetSparkJobDefinitionDefinition - This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation]. When you get a spark job definition's public definition, the sensitivity label is not a part of the definition.

PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS This API is blocked for a spark job definition with an encrypted sensitivity label.

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • options - ItemsClientBeginGetSparkJobDefinitionDefinitionOptions contains the optional parameters for the ItemsClient.BeginGetSparkJobDefinitionDefinition method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	poller, err := clientFactory.NewItemsClient().BeginGetSparkJobDefinitionDefinition(ctx, "6e335e92-a2a2-4b5a-970a-bd6a89fbb765", "cfafbeb1-8037-4d0c-896e-a46fb27ff229", &sparkjobdefinition.ItemsClientBeginGetSparkJobDefinitionDefinitionOptions{Format: nil})
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	res, err := poller.PollUntilDone(ctx, nil)
	if err != nil {
		log.Fatalf("failed to pull the result: %v", err)
	}
	// The response could be used here; the blank identifier is for demonstration only.
	_ = res
	// If the HTTP response code is 200 as defined in the example definition, the response structure looks as follows. Note that all values in the output are placeholders for demonstration purposes.
	// res.Response = sparkjobdefinition.Response{
	// 	Definition: &sparkjobdefinition.PublicDefinition{
	// 		Parts: []sparkjobdefinition.PublicDefinitionPart{
	// 			{
	// 				Path: to.Ptr("SparkJobDefinitionV1.json"),
	// 				Payload: to.Ptr("ew0KICAiZXhlY3V0YW..OWRmNDhhY2ZmZTgifQ"),
	// 				PayloadType: to.Ptr(sparkjobdefinition.PayloadTypeInlineBase64),
	// 			},
	// 			{
	// 				Path: to.Ptr(".platform"),
	// 				Payload: to.Ptr("ZG90UGxhdGZvcm1CYXNlNjRTdHJpbmc="),
	// 				PayloadType: to.Ptr(sparkjobdefinition.PayloadTypeInlineBase64),
	// 		}},
	// 	},
	// }
}
Output:

func (*ItemsClient) BeginUpdateSparkJobDefinitionDefinition

func (client *ItemsClient) BeginUpdateSparkJobDefinitionDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, updateSparkJobDefinitionRequest UpdateSparkJobDefinitionDefinitionRequest, options *ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions) (*runtime.Poller[ItemsClientUpdateSparkJobDefinitionDefinitionResponse], error)

BeginUpdateSparkJobDefinitionDefinition - This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation]. Updating the spark job definition's definition does not affect its sensitivity label.

PERMISSIONS The API caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • updateSparkJobDefinitionRequest - Update spark job definition definition request payload.
  • options - ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions contains the optional parameters for the ItemsClient.BeginUpdateSparkJobDefinitionDefinition method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	poller, err := clientFactory.NewItemsClient().BeginUpdateSparkJobDefinitionDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", sparkjobdefinition.UpdateSparkJobDefinitionDefinitionRequest{}, &sparkjobdefinition.ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions{UpdateMetadata: to.Ptr(true)})
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	_, err = poller.PollUntilDone(ctx, nil)
	if err != nil {
		log.Fatalf("failed to pull the result: %v", err)
	}
}
Output:

func (*ItemsClient) CreateSparkJobDefinition

func (client *ItemsClient) CreateSparkJobDefinition(ctx context.Context, workspaceID string, createSparkJobDefinitionRequest CreateSparkJobDefinitionRequest, options *ItemsClientBeginCreateSparkJobDefinitionOptions) (ItemsClientCreateSparkJobDefinitionResponse, error)

CreateSparkJobDefinition - returns ItemsClientCreateSparkJobDefinitionResponse in sync mode. This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation].

To create a spark job definition with a public definition, refer to the Spark job definition [/rest/api/fabric/articles/item-management/definitions/spark-job-definition] article.

PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS

  • To create a spark job definition the workspace must be on a supported Fabric capacity. For more information see: Microsoft Fabric license types [/fabric/enterprise/licenses#microsoft-fabric-license-types].

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • createSparkJobDefinitionRequest - Create item request payload.
  • options - ItemsClientBeginCreateSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.BeginCreateSparkJobDefinition method.
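
No generated example exists for this synchronous variant. The following is a minimal sketch based on the signature above; the workspace ID and display name are placeholders.

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	// Unlike BeginCreateSparkJobDefinition, this call blocks until the LRO
	// completes and returns the created item directly.
	res, err := clientFactory.NewItemsClient().CreateSparkJobDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", sparkjobdefinition.CreateSparkJobDefinitionRequest{
		DisplayName: to.Ptr("SparkJobDefinition 1"),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	_ = res // res.SparkJobDefinition holds the created item.
}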

func (*ItemsClient) DeleteSparkJobDefinition

func (client *ItemsClient) DeleteSparkJobDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, options *ItemsClientDeleteSparkJobDefinitionOptions) (ItemsClientDeleteSparkJobDefinitionResponse, error)

DeleteSparkJobDefinition - PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • options - ItemsClientDeleteSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.DeleteSparkJobDefinition method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	_, err = clientFactory.NewItemsClient().DeleteSparkJobDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
}
Output:

func (*ItemsClient) GetSparkJobDefinition

func (client *ItemsClient) GetSparkJobDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, options *ItemsClientGetSparkJobDefinitionOptions) (ItemsClientGetSparkJobDefinitionResponse, error)

GetSparkJobDefinition - PERMISSIONS The caller must have viewer or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.Read.All or SparkJobDefinition.ReadWrite.All or Item.Read.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • options - ItemsClientGetSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.GetSparkJobDefinition method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	res, err := clientFactory.NewItemsClient().GetSparkJobDefinition(ctx, "f089354e-8366-4e18-aea3-4cb4a3a50b48", "41ce06d1-d81b-4ea0-bc6d-2ce3dd2f8e87", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	// The response could be used here; the blank identifier is for demonstration only.
	_ = res
	// If the HTTP response code is 200 as defined in the example definition, the response structure looks as follows. Note that all values in the output are placeholders for demonstration purposes.
	// res.SparkJobDefinition = sparkjobdefinition.SparkJobDefinition{
	// 	Type: to.Ptr(sparkjobdefinition.ItemTypeSparkJobDefinition),
	// 	Description: to.Ptr("A spark job definition description."),
	// 	DisplayName: to.Ptr("SparkJobDefinition 1"),
	// 	ID: to.Ptr("5b218778-e7a5-4d73-8187-f10824047715"),
	// 	WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
	// 	Properties: &sparkjobdefinition.Properties{
	// 		OneLakeRootPath: to.Ptr("https://onelake.dfs.fabric.microsoft.com/f089354e-8366-4e18-aea3-4cb4a3a50b48/41ce06d1-d81b-4ea0-bc6d-2ce3dd2f8e87"),
	// 	},
	// }
}
Output:

func (*ItemsClient) GetSparkJobDefinitionDefinition

func (client *ItemsClient) GetSparkJobDefinitionDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, options *ItemsClientBeginGetSparkJobDefinitionDefinitionOptions) (ItemsClientGetSparkJobDefinitionDefinitionResponse, error)

GetSparkJobDefinitionDefinition - returns ItemsClientGetSparkJobDefinitionDefinitionResponse in sync mode. This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation].

When you get a spark job definition's public definition, the sensitivity label is not a part of the definition.

PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

LIMITATIONS This API is blocked for a spark job definition with an encrypted sensitivity label.

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • options - ItemsClientBeginGetSparkJobDefinitionDefinitionOptions contains the optional parameters for the ItemsClient.BeginGetSparkJobDefinitionDefinition method.
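
No generated example exists for this synchronous variant. A minimal sketch based on the signature above follows; the workspace and item IDs are placeholders.

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	// The sync variant blocks until the underlying LRO completes and returns
	// the definition directly.
	res, err := clientFactory.NewItemsClient().GetSparkJobDefinitionDefinition(ctx, "6e335e92-a2a2-4b5a-970a-bd6a89fbb765", "cfafbeb1-8037-4d0c-896e-a46fb27ff229", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	// res.Definition.Parts holds the base64-encoded definition parts.
	_ = res
}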

func (*ItemsClient) ListSparkJobDefinitions

func (client *ItemsClient) ListSparkJobDefinitions(ctx context.Context, workspaceID string, options *ItemsClientListSparkJobDefinitionsOptions) ([]SparkJobDefinition, error)

ListSparkJobDefinitions - returns array of SparkJobDefinition from all pages. This API supports pagination [/rest/api/fabric/articles/pagination].

PERMISSIONS The caller must have viewer or higher workspace role.

REQUIRED DELEGATED SCOPES Workspace.Read.All or Workspace.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • options - ItemsClientListSparkJobDefinitionsOptions contains the optional parameters for the ItemsClient.NewListSparkJobDefinitionsPager method.
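
No generated example exists for this helper. A minimal sketch based on the signature above follows; the workspace ID is a placeholder. Unlike the pager, this call drains every page and returns a single slice.

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	// Returns the spark job definitions from all pages as one slice.
	items, err := clientFactory.NewItemsClient().ListSparkJobDefinitions(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	for _, item := range items {
		if item.DisplayName != nil {
			log.Printf("spark job definition: %s", *item.DisplayName)
		}
	}
}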

func (*ItemsClient) NewListSparkJobDefinitionsPager

func (client *ItemsClient) NewListSparkJobDefinitionsPager(workspaceID string, options *ItemsClientListSparkJobDefinitionsOptions) *runtime.Pager[ItemsClientListSparkJobDefinitionsResponse]

NewListSparkJobDefinitionsPager - This API supports pagination [/rest/api/fabric/articles/pagination].

PERMISSIONS The caller must have viewer or higher workspace role.

REQUIRED DELEGATED SCOPES Workspace.Read.All or Workspace.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • options - ItemsClientListSparkJobDefinitionsOptions contains the optional parameters for the ItemsClient.NewListSparkJobDefinitionsPager method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	pager := clientFactory.NewItemsClient().NewListSparkJobDefinitionsPager("cfafbeb1-8037-4d0c-896e-a46fb27ff229", &sparkjobdefinition.ItemsClientListSparkJobDefinitionsOptions{ContinuationToken: nil})
	for pager.More() {
		page, err := pager.NextPage(ctx)
		if err != nil {
			log.Fatalf("failed to advance page: %v", err)
		}
		for _, v := range page.Value {
			// The page could be used here; the blank identifier is for demonstration only.
			_ = v
		}
		// If the HTTP response code is 200 as defined in the example definition, the page structure looks as follows. Note that all values in the output are placeholders for demonstration purposes.
		// page.SparkJobDefinitions = sparkjobdefinition.SparkJobDefinitions{
		// 	Value: []sparkjobdefinition.SparkJobDefinition{
		// 		{
		// 			Type: to.Ptr(sparkjobdefinition.ItemTypeSparkJobDefinition),
		// 			Description: to.Ptr("A spark job definition description."),
		// 			DisplayName: to.Ptr("SparkJobDefinition Name 1"),
		// 			ID: to.Ptr("3546052c-ae64-4526-b1a8-52af7761426f"),
		// 			WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
		// 			Properties: &sparkjobdefinition.Properties{
		// 				OneLakeRootPath: to.Ptr("https://onelake.dfs.fabric.microsoft.com/f089354e-8366-4e18-aea3-4cb4a3a50b48/41ce06d1-d81b-4ea0-bc6d-2ce3dd2f8e87"),
		// 			},
		// 		},
		// 		{
		// 			Type: to.Ptr(sparkjobdefinition.ItemTypeSparkJobDefinition),
		// 			Description: to.Ptr("A spark job definition description."),
		// 			DisplayName: to.Ptr("SparkJobDefinition Name 2"),
		// 			ID: to.Ptr("f697fb63-abd4-4399-9548-be7e3c3c0dac"),
		// 			WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
		// 			Properties: &sparkjobdefinition.Properties{
		// 				OneLakeRootPath: to.Ptr("https://onelake.dfs.fabric.microsoft.com/f089354e-8366-4e18-aea3-4cb4a3a50b48/d8f6cf16-3aac-4440-9d76-a03d86b7ae3e"),
		// 			},
		// 	}},
		// }
	}
}
Output:

func (*ItemsClient) UpdateSparkJobDefinition

func (client *ItemsClient) UpdateSparkJobDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, updateSparkJobDefinitionRequest UpdateSparkJobDefinitionRequest, options *ItemsClientUpdateSparkJobDefinitionOptions) (ItemsClientUpdateSparkJobDefinitionResponse, error)

UpdateSparkJobDefinition - PERMISSIONS The caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • updateSparkJobDefinitionRequest - Update spark job definition request payload.
  • options - ItemsClientUpdateSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.UpdateSparkJobDefinition method.
Example

Generated from example definition

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	res, err := clientFactory.NewItemsClient().UpdateSparkJobDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", sparkjobdefinition.UpdateSparkJobDefinitionRequest{
		Description: to.Ptr("SparkJobDefinition's New description"),
		DisplayName: to.Ptr("SparkJobDefinition's New name"),
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
	// The response could be used here; the blank identifier is for demonstration only.
	_ = res
	// If the HTTP response code is 200 as defined in the example definition, the response structure looks as follows. Note that all values in the output are placeholders for demonstration purposes.
	// res.SparkJobDefinition = sparkjobdefinition.SparkJobDefinition{
	// 	Type: to.Ptr(sparkjobdefinition.ItemTypeSparkJobDefinition),
	// 	Description: to.Ptr("SparkJobDefinition's New description"),
	// 	DisplayName: to.Ptr("SparkJobDefinition's New name"),
	// 	ID: to.Ptr("5b218778-e7a5-4d73-8187-f10824047715"),
	// 	WorkspaceID: to.Ptr("cfafbeb1-8037-4d0c-896e-a46fb27ff229"),
	// }
}
Output:

func (*ItemsClient) UpdateSparkJobDefinitionDefinition

func (client *ItemsClient) UpdateSparkJobDefinitionDefinition(ctx context.Context, workspaceID string, sparkJobDefinitionID string, updateSparkJobDefinitionRequest UpdateSparkJobDefinitionDefinitionRequest, options *ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions) (ItemsClientUpdateSparkJobDefinitionDefinitionResponse, error)

UpdateSparkJobDefinitionDefinition - returns ItemsClientUpdateSparkJobDefinitionDefinitionResponse in sync mode. This API supports long running operations (LRO) [/rest/api/fabric/articles/long-running-operation].

Updating the spark job definition's definition does not affect its sensitivity label.

PERMISSIONS The API caller must have contributor or higher workspace role.

REQUIRED DELEGATED SCOPES SparkJobDefinition.ReadWrite.All or Item.ReadWrite.All

MICROSOFT ENTRA SUPPORTED IDENTITIES This API supports the Microsoft identities [/rest/api/fabric/articles/identity-support] listed in this section.

| Identity | Support |
|-|-|
| User | Yes |
| Service principal [/entra/identity-platform/app-objects-and-service-principals#service-principal-object] and Managed identities [/entra/identity/managed-identities-azure-resources/overview] | Yes |

INTERFACE

If the operation fails it returns an *core.ResponseError type.

Generated from API version v1

  • workspaceID - The workspace ID.
  • sparkJobDefinitionID - The spark job definition ID.
  • updateSparkJobDefinitionRequest - Update spark job definition definition request payload.
  • options - ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions contains the optional parameters for the ItemsClient.BeginUpdateSparkJobDefinitionDefinition method.
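
No generated example exists for this synchronous variant. A minimal sketch based on the signature above follows; the IDs and payload are placeholders, and Definition is required by the request type.

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		log.Fatalf("failed to obtain a credential: %v", err)
	}
	ctx := context.Background()
	clientFactory, err := sparkjobdefinition.NewClientFactory(cred, nil, nil)
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	// The sync variant blocks until the update LRO completes.
	_, err = clientFactory.NewItemsClient().UpdateSparkJobDefinitionDefinition(ctx, "cfafbeb1-8037-4d0c-896e-a46fb27ff229", "5b218778-e7a5-4d73-8187-f10824047715", sparkjobdefinition.UpdateSparkJobDefinitionDefinitionRequest{
		Definition: &sparkjobdefinition.PublicDefinition{
			Format: to.Ptr("SparkJobDefinitionV1"),
			Parts: []sparkjobdefinition.PublicDefinitionPart{
				{
					Path:        to.Ptr("SparkJobDefinitionV1.json"),
					Payload:     to.Ptr("eyJleGVjdXRhYmxlRm..OWRmNDhhY2ZmZTgifQ=="),
					PayloadType: to.Ptr(sparkjobdefinition.PayloadTypeInlineBase64),
				},
			},
		},
	}, nil)
	if err != nil {
		log.Fatalf("failed to finish the request: %v", err)
	}
}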

type ItemsClientBeginCreateSparkJobDefinitionOptions

type ItemsClientBeginCreateSparkJobDefinitionOptions struct {
	// Resumes the long-running operation from the provided token.
	ResumeToken string
}

ItemsClientBeginCreateSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.BeginCreateSparkJobDefinition method.

type ItemsClientBeginGetSparkJobDefinitionDefinitionOptions

type ItemsClientBeginGetSparkJobDefinitionDefinitionOptions struct {
	// The format of the spark job definition public definition.
	Format *string

	// Resumes the long-running operation from the provided token.
	ResumeToken string
}

ItemsClientBeginGetSparkJobDefinitionDefinitionOptions contains the optional parameters for the ItemsClient.BeginGetSparkJobDefinitionDefinition method.

type ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions

type ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions struct {
	// Resumes the long-running operation from the provided token.
	ResumeToken string

	// When set to true and the .platform file is provided as part of the definition, the item's metadata is updated using the
	// metadata in the .platform file
	UpdateMetadata *bool
}

ItemsClientBeginUpdateSparkJobDefinitionDefinitionOptions contains the optional parameters for the ItemsClient.BeginUpdateSparkJobDefinitionDefinition method.

type ItemsClientCreateSparkJobDefinitionResponse

type ItemsClientCreateSparkJobDefinitionResponse struct {
	// A spark job definition object.
	SparkJobDefinition
}

ItemsClientCreateSparkJobDefinitionResponse contains the response from method ItemsClient.BeginCreateSparkJobDefinition.

type ItemsClientDeleteSparkJobDefinitionOptions

type ItemsClientDeleteSparkJobDefinitionOptions struct {
}

ItemsClientDeleteSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.DeleteSparkJobDefinition method.

type ItemsClientDeleteSparkJobDefinitionResponse

type ItemsClientDeleteSparkJobDefinitionResponse struct {
}

ItemsClientDeleteSparkJobDefinitionResponse contains the response from method ItemsClient.DeleteSparkJobDefinition.

type ItemsClientGetSparkJobDefinitionDefinitionResponse

type ItemsClientGetSparkJobDefinitionDefinitionResponse struct {
	// Spark job definition public definition response.
	Response
}

ItemsClientGetSparkJobDefinitionDefinitionResponse contains the response from method ItemsClient.BeginGetSparkJobDefinitionDefinition.

type ItemsClientGetSparkJobDefinitionOptions

type ItemsClientGetSparkJobDefinitionOptions struct {
}

ItemsClientGetSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.GetSparkJobDefinition method.

type ItemsClientGetSparkJobDefinitionResponse

type ItemsClientGetSparkJobDefinitionResponse struct {
	// A spark job definition object.
	SparkJobDefinition
}

ItemsClientGetSparkJobDefinitionResponse contains the response from method ItemsClient.GetSparkJobDefinition.

type ItemsClientListSparkJobDefinitionsOptions

type ItemsClientListSparkJobDefinitionsOptions struct {
	// A token for retrieving the next page of results.
	ContinuationToken *string
}

ItemsClientListSparkJobDefinitionsOptions contains the optional parameters for the ItemsClient.NewListSparkJobDefinitionsPager method.

type ItemsClientListSparkJobDefinitionsResponse

type ItemsClientListSparkJobDefinitionsResponse struct {
	// A list of spark job definitions.
	SparkJobDefinitions
}

ItemsClientListSparkJobDefinitionsResponse contains the response from method ItemsClient.NewListSparkJobDefinitionsPager.

type ItemsClientUpdateSparkJobDefinitionDefinitionResponse

type ItemsClientUpdateSparkJobDefinitionDefinitionResponse struct {
}

ItemsClientUpdateSparkJobDefinitionDefinitionResponse contains the response from method ItemsClient.BeginUpdateSparkJobDefinitionDefinition.

type ItemsClientUpdateSparkJobDefinitionOptions

type ItemsClientUpdateSparkJobDefinitionOptions struct {
}

ItemsClientUpdateSparkJobDefinitionOptions contains the optional parameters for the ItemsClient.UpdateSparkJobDefinition method.

type ItemsClientUpdateSparkJobDefinitionResponse

type ItemsClientUpdateSparkJobDefinitionResponse struct {
	// A spark job definition object.
	SparkJobDefinition
}

ItemsClientUpdateSparkJobDefinitionResponse contains the response from method ItemsClient.UpdateSparkJobDefinition.

type PayloadType

type PayloadType string

PayloadType - The type of the definition part payload. Additional payload types may be added over time.

const (
	// PayloadTypeInlineBase64 - Inline Base 64.
	PayloadTypeInlineBase64 PayloadType = "InlineBase64"
)

func PossiblePayloadTypeValues

func PossiblePayloadTypeValues() []PayloadType

PossiblePayloadTypeValues returns the possible values for the PayloadType const type.

type Properties

type Properties struct {
	// REQUIRED; OneLake path to the SparkJobDefinition root directory.
	OneLakeRootPath *string
}

Properties - The spark job definition properties.

func (Properties) MarshalJSON

func (p Properties) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type Properties.

func (*Properties) UnmarshalJSON

func (p *Properties) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type Properties.

type PublicDefinition

type PublicDefinition struct {
	// REQUIRED; A list of definition parts.
	Parts []PublicDefinitionPart

	// The format of the item definition. Supported format: SparkJobDefinitionV1.
	Format *string
}

PublicDefinition - Spark job definition public definition object. Refer to this article [/rest/api/fabric/articles/item-management/definitions/spark-job-definition] for more details on how to craft a spark job definition public definition.
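
The only payload type documented in this package is InlineBase64, so each definition part carries its content base64-encoded. A minimal sketch of building a PublicDefinition from raw bytes follows; the JSON content is a placeholder, not a complete SparkJobDefinitionV1 document.

package main

import (
	"encoding/base64"
	"fmt"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"

	"github.com/microsoft/fabric-sdk-go/fabric/sparkjobdefinition"
)

func main() {
	// Placeholder content; a real part carries a complete SparkJobDefinitionV1 JSON document.
	raw := []byte(`{"executableFile": "..."}`)
	def := sparkjobdefinition.PublicDefinition{
		Format: to.Ptr("SparkJobDefinitionV1"),
		Parts: []sparkjobdefinition.PublicDefinitionPart{
			{
				Path:        to.Ptr("SparkJobDefinitionV1.json"),
				Payload:     to.Ptr(base64.StdEncoding.EncodeToString(raw)),
				PayloadType: to.Ptr(sparkjobdefinition.PayloadTypeInlineBase64),
			},
		},
	}
	fmt.Println(*def.Parts[0].Payload)
}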

func (PublicDefinition) MarshalJSON

func (p PublicDefinition) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type PublicDefinition.

func (*PublicDefinition) UnmarshalJSON

func (p *PublicDefinition) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type PublicDefinition.

type PublicDefinitionPart

type PublicDefinitionPart struct {
	// The spark job definition public definition part path.
	Path *string

	// The spark job definition public definition part payload.
	Payload *string

	// The payload type.
	PayloadType *PayloadType
}

PublicDefinitionPart - Spark job definition definition part object.

func (PublicDefinitionPart) MarshalJSON

func (p PublicDefinitionPart) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type PublicDefinitionPart.

func (*PublicDefinitionPart) UnmarshalJSON

func (p *PublicDefinitionPart) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type PublicDefinitionPart.

type Response

type Response struct {
	// READ-ONLY; Spark job definition public definition object. Refer to this article [/rest/api/fabric/articles/item-management/definitions/spark-job-definition]
	// for more details on how to craft a spark job
	// definition public definition.
	Definition *PublicDefinition
}

Response - Spark job definition public definition response.

func (Response) MarshalJSON

func (r Response) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type Response.

func (*Response) UnmarshalJSON

func (r *Response) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type Response.

type SparkJobDefinition

type SparkJobDefinition struct {
	// REQUIRED; The item type.
	Type *ItemType

	// The item description.
	Description *string

	// The item display name.
	DisplayName *string

	// The spark job definition properties.
	Properties *Properties

	// READ-ONLY; The item ID.
	ID *string

	// READ-ONLY; The workspace ID.
	WorkspaceID *string
}

SparkJobDefinition - A spark job definition object.

func (SparkJobDefinition) MarshalJSON

func (s SparkJobDefinition) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type SparkJobDefinition.

func (*SparkJobDefinition) UnmarshalJSON

func (s *SparkJobDefinition) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type SparkJobDefinition.

type SparkJobDefinitions

type SparkJobDefinitions struct {
	// REQUIRED; A list of spark job definitions.
	Value []SparkJobDefinition

	// The token for the next result set batch. If there are no more records, it's removed from the response.
	ContinuationToken *string

	// The URI of the next result set batch. If there are no more records, it's removed from the response.
	ContinuationURI *string
}

SparkJobDefinitions - A list of spark job definitions.

func (SparkJobDefinitions) MarshalJSON

func (s SparkJobDefinitions) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type SparkJobDefinitions.

func (*SparkJobDefinitions) UnmarshalJSON

func (s *SparkJobDefinitions) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type SparkJobDefinitions.

type UpdateSparkJobDefinitionDefinitionRequest

type UpdateSparkJobDefinitionDefinitionRequest struct {
	// REQUIRED; Spark job definition public definition object. Refer to this article [/rest/api/fabric/articles/item-management/definitions/spark-job-definition]
	// for more details on how to craft a spark job
	// definition public definition.
	Definition *PublicDefinition
}

UpdateSparkJobDefinitionDefinitionRequest - Update spark job definition public definition request payload.

func (UpdateSparkJobDefinitionDefinitionRequest) MarshalJSON

func (u UpdateSparkJobDefinitionDefinitionRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type UpdateSparkJobDefinitionDefinitionRequest.

func (*UpdateSparkJobDefinitionDefinitionRequest) UnmarshalJSON

func (u *UpdateSparkJobDefinitionDefinitionRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type UpdateSparkJobDefinitionDefinitionRequest.

type UpdateSparkJobDefinitionRequest

type UpdateSparkJobDefinitionRequest struct {
	// The spark job definition description. Maximum length is 256 characters.
	Description *string

	// The spark job definition display name. The display name must follow naming rules according to item type.
	DisplayName *string
}

UpdateSparkJobDefinitionRequest - Update spark job definition request.

func (UpdateSparkJobDefinitionRequest) MarshalJSON

func (u UpdateSparkJobDefinitionRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaller interface for type UpdateSparkJobDefinitionRequest.

func (*UpdateSparkJobDefinitionRequest) UnmarshalJSON

func (u *UpdateSparkJobDefinitionRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaller interface for type UpdateSparkJobDefinitionRequest.
