# geogfm.tasks
## Course Roadmap Mapping

This week's work in the broader GFM plan.
| Week | Stage | Focus | You will build (geogfm) | Library tools | Outcome |
|---|---|---|---|---|---|
| 8 | Stage 3: Apply & Deploy | Task Fine-tuning | `tasks/classification.py`, `tasks/segmentation.py` (light heads) | `torch.nn.CrossEntropyLoss`; `timm` optional | Head swap on frozen encoder; small-dataset demo |
## Weekly goals
- Implement a simple classifier/segmentation head
- Fine-tune with frozen encoder; evaluate on a tiny dataset
- Discuss efficient strategies (LoRA/prompting as concepts)
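Since LoRA is only treated as a concept this week, a minimal sketch may make the idea concrete: keep a pretrained linear layer frozen and learn only a low-rank additive update. The `LoRALinear` class, rank, and scaling below are illustrative assumptions, not part of the geogfm package.

```python
# Conceptual LoRA sketch: freeze a linear layer, learn a low-rank update x @ A @ B.
# Rank, scaling, and layer sizes are illustrative; not part of the geogfm package.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # frozen pretrained weight
        self.lora_a = nn.Parameter(torch.randn(base.in_features, rank) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(rank, base.out_features))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus a trainable low-rank correction.
        return self.base(x) + (x @ self.lora_a @ self.lora_b) * self.scale

layer = LoRALinear(nn.Linear(64, 64), rank=4)
y = layer(torch.randn(2, 64))  # only lora_a / lora_b receive gradients
```

Only the two small low-rank matrices are trained, which is why such adapters are cheap compared with full fine-tuning.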
## Session Outline (and Tangled Code)

- Concepts → Components mapping:
  - Classification/segmentation heads → `tasks/*.py`
  - Freezing encoder and training head → usage snippets
  - Classification/segmentation heads → Package inits
## 1) Classification Head
```python
from __future__ import annotations

import torch
import torch.nn as nn


class ClassificationHead(nn.Module):
    """Linear classifier over mean-pooled transformer tokens."""

    def __init__(self, embed_dim: int, num_classes: int):
        super().__init__()
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, D). Mean-pool over the N tokens, then project to class logits.
        x = tokens.mean(dim=1)
        return self.fc(x)
```
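A quick shape check confirms that mean pooling collapses the token dimension before the linear projection; the batch size, token count, and embedding dimension below are illustrative.

```python
# Illustrative shape check (values are arbitrary).
import torch

head = ClassificationHead(embed_dim=768, num_classes=5)
tokens = torch.randn(4, 196, 768)  # (B=4, N=196 patch tokens, D=768)
logits = head(tokens)              # mean-pool over N, then linear -> (4, 5)
print(logits.shape)                # torch.Size([4, 5])
```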
## 2) Segmentation Head (token-wise classifier)
```python
from __future__ import annotations

import torch
import torch.nn as nn


class SegmentationHead(nn.Module):
    def __init__(self, embed_dim: int, num_classes: int):
        super().__init__()
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, D) -> per-token class logits (B, N, C)
        return self.fc(tokens)
```
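One possible way to turn the per-token logits into a dense prediction is to reshape them onto the patch grid and upsample. The sketch below assumes the N tokens form a square 14×14 grid of 16×16-pixel patches with no CLS token; those numbers are illustrative, not fixed by the head itself.

```python
# Illustrative sketch: per-token logits -> coarse label map (assumed 14x14 patch grid).
import torch
import torch.nn.functional as F

head = SegmentationHead(embed_dim=768, num_classes=6)
tokens = torch.randn(2, 196, 768)                     # (B=2, N=196, D=768)
logits = head(tokens)                                 # (2, 196, 6) token-wise class scores
grid = logits.transpose(1, 2).reshape(2, 6, 14, 14)   # (B, C, H_patches, W_patches)
full = F.interpolate(grid, scale_factor=16, mode="bilinear", align_corners=False)
print(full.shape)                                     # torch.Size([2, 6, 224, 224])
```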
## Usage snippet (non-tangled)
```python
# Example of freezing the encoder and training only a head:
# encoder = GeoViTBackbone(cfg)
# for p in encoder.parameters():
#     p.requires_grad = False
# head = ClassificationHead(embed_dim=cfg.embed_dim, num_classes=5)
# logits = head(encoder(images))   # encoder outputs tokens of shape (B, N, D)
# loss = torch.nn.functional.cross_entropy(logits, labels)
```
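To make the freeze-and-train pattern concrete, here is a small runnable sketch on synthetic data; the tiny linear "encoder" and random tensors are stand-ins for a pretrained backbone (e.g. GeoViTBackbone) and a real labelled dataset.

```python
# Minimal, runnable sketch of head-only fine-tuning on synthetic data.
import torch
import torch.nn as nn

embed_dim, num_classes, num_patches = 64, 5, 16
encoder = nn.Linear(8, embed_dim)              # stand-in for a pretrained token encoder
for p in encoder.parameters():
    p.requires_grad = False                    # freeze the backbone

head = ClassificationHead(embed_dim=embed_dim, num_classes=num_classes)
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)  # optimise the head only

for step in range(10):                         # tiny synthetic "dataset"
    patches = torch.randn(8, num_patches, 8)   # (B, N, raw feature dim)
    labels = torch.randint(0, num_classes, (8,))
    with torch.no_grad():
        tokens = encoder(patches)              # (B, N, D) frozen features
    logits = head(tokens)
    loss = torch.nn.functional.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because only `head.parameters()` are passed to the optimizer and the encoder runs under `torch.no_grad()`, the backbone weights stay fixed while the head adapts to the small labelled set.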