Commit 715c9c8

Author: praneeth_paikray-data

Move skill-authoring into databricks-skills/ per review feedback

Reviewer asked to move the skill into .claude/ where developer skills live. Since .claude/ is gitignored (it's the install target), the correct home is databricks-skills/ alongside all other skills, with install_skills.sh handling the copy to .claude/skills/.

- Move .skill-authoring/ → databricks-skills/skill-authoring/
- Delete install_skill_authoring.sh (install_skills.sh handles it now)
- Register in install_skills.sh, install.sh, install.ps1
- Add to CORE_SKILLS (always installed for contributors)
- Update path references in CONTRIBUTING.md and README.md
- Add to skills table in databricks-skills/README.md

Co-authored-by: Isaac
1 parent 2491033 commit 715c9c8
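The arrangement the commit message describes — skills hosted in databricks-skills/ and copied into the gitignored .claude/skills/ install target by the installer — can be sketched roughly as follows. This is a minimal illustration with hypothetical variable names and an illustrative skill subset, not the actual install_skills.sh, which also wires up per-skill descriptions and extra reference files:

```shell
#!/usr/bin/env sh
# Sketch of the install flow: copy repo-hosted skills into .claude/skills/.
# SRC_DIR, DEST_DIR, and SKILLS are illustrative placeholders.
set -eu

SRC_DIR="databricks-skills"        # skills live in the repo, under version control
DEST_DIR=".claude/skills"          # gitignored install target
SKILLS="skill-authoring databricks-docs"   # illustrative subset

mkdir -p "$DEST_DIR"
for skill in $SKILLS; do
  # Only copy skills that actually exist in the repo checkout
  if [ -d "$SRC_DIR/$skill" ]; then
    cp -r "$SRC_DIR/$skill" "$DEST_DIR/$skill"
    echo "installed $skill"
  fi
done
```

Because .claude/ is only ever a copy destination, deleting it and re-running the installer is always safe, which is why the skill source itself must not live there.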

File tree

9 files changed: +16 −36 lines changed


.skill-authoring/install_skill_authoring.sh

Lines changed: 0 additions & 25 deletions
This file was deleted.

CONTRIBUTING.md

Lines changed: 3 additions & 3 deletions
@@ -92,7 +92,7 @@ Ensure your changes work with a live Databricks workspace.
 
 ### Recommended: Use the Authoring Skill
 
-The fastest way to create a high-quality skill is with the `skill-authoring` skill (available in `.skill-authoring/` when you clone the repo). Ask Claude:
+The fastest way to create a high-quality skill is with the `skill-authoring` skill (available in `databricks-skills/skill-authoring/`). Ask Claude:
 
 > "Help me create a new skill for [Databricks feature]"

@@ -154,8 +154,8 @@ Before submitting a PR for a new skill, verify:
 ### Skill Format Reference
 
 For detailed format specifications, see:
-- `.skill-authoring/references/skill-format.md` — Frontmatter rules, progressive disclosure, section conventions
-- `.skill-authoring/references/test-format.md` — ground_truth.yaml and manifest.yaml schemas
+- `databricks-skills/skill-authoring/references/skill-format.md` — Frontmatter rules, progressive disclosure, section conventions
+- `databricks-skills/skill-authoring/references/test-format.md` — ground_truth.yaml and manifest.yaml schemas
 
 ### Evaluation & Optimization

databricks-skills/README.md

Lines changed: 4 additions & 1 deletion
@@ -110,6 +110,9 @@ cp -r ai-dev-kit/databricks-skills/databricks-agent-bricks .claude/skills/
 ### 📚 Reference
 - **databricks-docs** - Documentation index via llms.txt
 
+### 🛠️ Contributing
+- **skill-authoring** - Guided workflow for creating new Databricks skills for ai-dev-kit
+
 ## How It Works
 
 ```

@@ -155,7 +158,7 @@ For contributors creating new skills, ai-dev-kit provides a guided authoring wor
 
 ### Quick Start (Contributors)
 
-1. **Use the authoring skill** — Clone the repo and ask Claude: "Help me create a new skill for [feature]". The `skill-authoring` skill (in `.skill-authoring/`) will guide you through the full workflow: interview, draft, test, validate, register.
+1. **Use the authoring skill** — Clone the repo and ask Claude: "Help me create a new skill for [feature]". The `skill-authoring` skill (in `databricks-skills/skill-authoring/`) will guide you through the full workflow: interview, draft, test, validate, register.
 
 2. **Or start manually** — Copy the template and fill in the sections:
 ```bash

databricks-skills/install_skills.sh

Lines changed: 3 additions & 1 deletion
@@ -47,7 +47,7 @@ MLFLOW_REPO_RAW_URL="https://raw.githubusercontent.com/mlflow/skills"
 MLFLOW_REPO_REF="main"
 
 # Databricks skills (hosted in this repo)
-DATABRICKS_SKILLS="databricks-agent-bricks databricks-ai-functions databricks-aibi-dashboards databricks-bundles databricks-app-python databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-python-sdk databricks-execution-compute databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest spark-python-data-source"
+DATABRICKS_SKILLS="databricks-agent-bricks databricks-ai-functions databricks-aibi-dashboards databricks-bundles databricks-app-python databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-python-sdk databricks-execution-compute databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest skill-authoring spark-python-data-source"
 
 # MLflow skills (fetched from mlflow/skills repo)
 MLFLOW_SKILLS="agent-evaluation analyze-mlflow-chat-session analyze-mlflow-trace instrumenting-with-mlflow-tracing mlflow-onboarding querying-mlflow-metrics retrieving-mlflow-traces searching-mlflow-docs"

@@ -94,6 +94,7 @@ get_skill_description() {
     "databricks-unstructured-pdf-generation") echo "Generate synthetic PDFs for RAG" ;;
     "databricks-vector-search") echo "Vector Search - endpoints, indexes, and queries for RAG" ;;
     "databricks-zerobus-ingest") echo "Zerobus Ingest - gRPC data ingestion into Delta tables" ;;
+    "skill-authoring") echo "Guided workflow for creating new Databricks skills for ai-dev-kit" ;;
     # MLflow skills (from mlflow/skills repo)
     "agent-evaluation") echo "End-to-end agent evaluation workflow" ;;
     "analyze-mlflow-chat-session") echo "Debug multi-turn conversations" ;;

@@ -132,6 +133,7 @@ get_skill_extra_files() {
     "databricks-spark-structured-streaming") echo "checkpoint-best-practices.md kafka-streaming.md merge-operations.md multi-sink-writes.md stateful-operations.md stream-static-joins.md stream-stream-joins.md streaming-best-practices.md trigger-and-cost-optimization.md" ;;
     "databricks-vector-search") echo "index-types.md end-to-end-rag.md" ;;
     "databricks-zerobus-ingest") echo "1-setup-and-authentication.md 2-python-client.md 3-multilanguage-clients.md 4-protobuf-schema.md 5-operations-and-limits.md" ;;
+    "skill-authoring") echo "references/skill-format.md references/test-format.md" ;;
     *) echo "" ;;
 esac
 }
File renamed without changes.
File renamed without changes.
File renamed without changes.

install.ps1

Lines changed: 3 additions & 3 deletions
@@ -85,7 +85,7 @@ $script:Skills = @(
     "databricks-metric-views", "databricks-mlflow-evaluation", "databricks-model-serving", "databricks-ai-functions",
     "databricks-python-sdk", "databricks-spark-declarative-pipelines", "databricks-spark-structured-streaming",
     "databricks-synthetic-data-gen", "databricks-unity-catalog", "databricks-unstructured-pdf-generation",
-    "databricks-vector-search", "databricks-zerobus-ingest", "spark-python-data-source"
+    "databricks-vector-search", "databricks-zerobus-ingest", "skill-authoring", "spark-python-data-source"
 )
 
 # MLflow skills (fetched from mlflow/skills repo)

@@ -101,7 +101,7 @@ $script:ApxSkills = @("databricks-app-apx")
 $ApxRawUrl = "https://raw.githubusercontent.com/databricks-solutions/apx/main/skills/apx"
 
 # ─── Skill profiles ──────────────────────────────────────────
-$script:CoreSkills = @("databricks-config", "databricks-docs", "databricks-python-sdk", "databricks-unity-catalog")
+$script:CoreSkills = @("databricks-config", "databricks-docs", "databricks-python-sdk", "databricks-unity-catalog", "skill-authoring")
 
 $script:ProfileDataEngineer = @(
     "databricks-spark-declarative-pipelines", "databricks-spark-structured-streaming",

@@ -1100,7 +1100,7 @@ function Invoke-PromptCustomSkills {
 
     Write-Host ""
     Write-Host "  Select individual skills" -ForegroundColor White
-    Write-Host "  Core skills (config, docs, python-sdk, unity-catalog) are always installed" -ForegroundColor DarkGray
+    Write-Host "  Core skills (config, docs, python-sdk, unity-catalog, skill-authoring) are always installed" -ForegroundColor DarkGray
 
     $items = @(
         @{ Label = "Spark Pipelines"; Value = "databricks-spark-declarative-pipelines"; State = ($preselected -contains "databricks-spark-declarative-pipelines"); Hint = "SDP/LDP, CDC, SCD Type 2" }

install.sh

Lines changed: 3 additions & 3 deletions
@@ -88,7 +88,7 @@ MIN_SDK_VERSION="0.85.0"
 G='\033[0;32m' Y='\033[1;33m' R='\033[0;31m' BL='\033[0;34m' B='\033[1m' D='\033[2m' N='\033[0m'
 
 # Databricks skills (bundled in repo)
-SKILLS="databricks-agent-bricks databricks-ai-functions databricks-aibi-dashboards databricks-app-python databricks-bundles databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-python-sdk databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest spark-python-data-source"
+SKILLS="databricks-agent-bricks databricks-ai-functions databricks-aibi-dashboards databricks-app-python databricks-bundles databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-python-sdk databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest skill-authoring spark-python-data-source"
 
 # MLflow skills (fetched from mlflow/skills repo)
 MLFLOW_SKILLS="agent-evaluation analyze-mlflow-chat-session analyze-mlflow-trace instrumenting-with-mlflow-tracing mlflow-onboarding querying-mlflow-metrics retrieving-mlflow-traces searching-mlflow-docs"

@@ -100,7 +100,7 @@ APX_RAW_URL="https://raw.githubusercontent.com/databricks-solutions/apx/main/ski
 
 # ─── Skill profiles ──────────────────────────────────────────
 # Core skills always installed regardless of profile selection
-CORE_SKILLS="databricks-config databricks-docs databricks-python-sdk databricks-unity-catalog"
+CORE_SKILLS="databricks-config databricks-docs databricks-python-sdk databricks-unity-catalog skill-authoring"
 
 # Profile definitions (non-core skills only — core skills are always added)
 PROFILE_DATA_ENGINEER="databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-jobs databricks-bundles databricks-dbsql databricks-iceberg databricks-zerobus-ingest spark-python-data-source databricks-metric-views databricks-synthetic-data-gen"

@@ -898,7 +898,7 @@ prompt_custom_skills() {
 
     echo ""
     echo -e "  ${B}Select individual skills${N}"
-    echo -e "  ${D}Core skills (config, docs, python-sdk, unity-catalog) are always installed${N}"
+    echo -e "  ${D}Core skills (config, docs, python-sdk, unity-catalog, skill-authoring) are always installed${N}"
 
     local selected
     selected=$(checkbox_select \

0 commit comments
