Chapter 10: Domain Knowledge Acquisition
You've built the architecture (Chapter 8) and designed the interface (Chapter 9). But without deep domain knowledge, you're just building a generic tool. This chapter shows how to systematically acquire domain expertise and encode it into your system.
The Knowledge Challenge: Domain experts know what documents they need, but can't articulate the underlying patterns. Developers know patterns, but don't understand domain nuances. Bridging this gap is the critical path to success.
The Goal: Transform tacit knowledge ("I just know what a good report card looks like") into explicit specifications that guide implementation.
10.1 The Knowledge Acquisition Process
Domain knowledge acquisition follows a six-phase cycle:
1. DISCOVER   →   2. DOCUMENT   →   3. MODEL
(Interviews,      (Sample docs,     (Entity model,
 observation)      workflows)        patterns)
     ↑                                   ↓
6. ITERATE    ←   5. VALIDATE   ←   4. IMPLEMENT
(Refine based     (User testing,    (Build templates,
 on feedback)      expert review)    business rules)
Each phase builds on the previous. Skip steps and you'll build the wrong thing.
10.2 Phase 1: Discovery Through Interviews
Goal: Understand the domain from the experts' perspective
10.2.1 Finding the Right Experts
Not all domain users are experts. You need people who:
- Have deep experience (3+ years in the role)
- Understand why things are done, not just what
- Can articulate problems with the current process
- Represent different perspectives (coordinator, instructor, parent)

For homeschool co-ops:
- Interview 5-7 coordinators from different co-ops
- Include experienced (10+ years) and newer (2-3 years) coordinators
- Talk to instructors who receive documents
- Talk to parents who consume documents
10.2.2 Interview Structure
Don't ask: "What documents do you create?" You'll only get a list without context.
Better approach: walk through the calendar year.
Interview Script Example:
"Walk me through your year. Let's start with summer - what happens
before the school year starts?"
Expert: "In July, I start planning classes for fall. I need to see
which instructors are returning, what they want to teach..."
"What documents do you create during this phase?"
Expert: "I create a class planning sheet - shows proposed classes,
instructor, day/time, capacity..."
"Show me an example. What information is on this sheet?"
Expert: "Here's last year's... it has the class title, instructor name,
their email and phone, which day they prefer, how many students they
can take, and any special requirements like 'needs science lab'..."
"How do you use this document?"
Expert: "I share it with the board for approval, then with families
for enrollment planning..."
"What's frustrating about creating this document?"
Expert: "I have to update it constantly as instructors change their
minds. And I'm copying the same instructor info over and over - name,
email, phone - across multiple documents..."
Why This Works:
- Chronological flow reveals the natural workflow
- "Show me" gets concrete examples, not abstractions
- "How do you use it?" reveals document purpose and audience
- "What's frustrating?" uncovers pain points and automation opportunities
10.2.3 Key Discovery Questions
About Documents:
- "Walk me through creating [document]. What steps do you follow?"
- "Where does the data for this document come from?"
- "Who receives this document? What do they do with it?"
- "How often does this document change? Why?"
- "What happens if this document has an error?"

About Workflows:
- "What triggers you to create this document?"
- "What approvals are needed before you send it?"
- "What other documents depend on this one?"
- "What's the deadline pressure like?"

About Data:
- "What information do you track about students/classes/instructors?"
- "What do you wish you could track but don't?"
- "How do you know if information is current?"
- "What data entry mistakes happen most often?"

About Relationships:
- "Can a student be in multiple classes?"
- "Can a class have multiple instructors?"
- "What happens when a student withdraws mid-semester?"
- "How do you handle siblings? Same parent contact info?"
10.2.4 Interview Output
After 5-7 expert interviews, you should have:
Document List (with metadata):
1. Class Planning Sheet
- Created: July (pre-semester)
- Frequency: Updated weekly until finalized
- Purpose: Planning and board approval
- Audience: Board members, instructors
- Pain points: Constant updates, duplicate data entry
2. Student Roster (by class)
- Created: September (start of semester)
- Frequency: Once per semester, occasional updates
- Purpose: Instructor reference, emergency contacts
- Audience: Instructors, parents
- Pain points: Photo collection, keeping current
[... 18 more documents]
Workflow Map:
JULY-AUGUST (Pre-semester)
→ Plan classes
→ Recruit instructors
→ Create class planning sheet
→ Get board approval
→ Open enrollment
AUGUST-SEPTEMBER (Enrollment)
→ Families register students
→ Create class rosters
→ Create student schedules
→ Distribute to instructors and families
SEPTEMBER-DECEMBER (Active semester)
→ Track attendance
→ Record grades
→ Mid-semester progress reports
DECEMBER (End of semester)
→ Calculate final grades
→ Generate report cards
→ Certificates for completed classes
[... full year cycle]
Entity List (preliminary):
- Student (name, grade level, parent, photo, allergies...)
- Parent (name, contact, emergency contact flag...)
- Instructor (name, contact, bio, classes taught...)
- Class (title, instructor, schedule, capacity...)
- Enrollment (student in class, with grade/attendance...)
- Semester (name, dates, active flag...)
[... more entities identified]
10.3 Phase 2: Document Collection & Analysis
Goal: Understand document structure and patterns
10.3.1 Collecting Samples
Ask experts for:
- Real documents from past semesters (anonymize student names)
- Multiple variations of the same document type (different semesters, coordinators)
- Edge cases (a student with special needs, a class with co-instructors)
- Current templates (the Word docs and Excel files they use now)

What to collect:
- A minimum of 3 examples per document type
- Different data scenarios (2 students vs. 20 students in a class)
- Failures (documents that had errors, and what went wrong)
10.3.2 Document Analysis Template
For each document type, complete this analysis:
DOCUMENT: Student Roster by Class
────────────────────────────────────────────────────
PURPOSE: Provide instructors with student information
PATTERN: Directory (Grid layout with photos)
FREQUENCY: Once per semester + updates when students join/drop
DATA SOURCES:
- Student table (name, grade, photo, allergies)
- Parent table (contact info, emergency contact)
- Enrollment table (which students in which classes)
- Class table (class name, instructor, schedule)
LAYOUT:
┌────────────────────────────────────────────────────┐
│             RIVERSIDE HOMESCHOOL CO-OP             │
│            Student Roster: Biology 101             │
│     Instructor: Dr. Sarah Johnson | Fall 2024      │
├────────────────────────────────────────────────────┤
│ [Photo] Emma Anderson, Grade 5                     │
│         Parent: Michael Anderson (555-0101)        │
│         Allergies: Peanuts                         │
│                                                    │
│ [Photo] Noah Baker, Grade 5                        │
│         Parent: Jennifer Baker (555-0102)          │
│         Allergies: None                            │
│ [... more students]                                │
└────────────────────────────────────────────────────┘
BUSINESS RULES:
1. Only show active students
2. Sort by last name
3. Highlight students with allergies (red text)
4. Include emergency contact with ★ indicator
5. Students without photos show placeholder
VARIATIONS OBSERVED:
- Some coordinators include birth dates (privacy concern?)
- Some show both parents if different
- Some highlight new students ("New this semester")
PAIN POINTS (from interviews):
- Collecting and inserting photos is tedious
- Keeping emergency contacts current
- Knowing which parent is primary contact
AUTOMATION OPPORTUNITY:
- Photos stored once, reused across all documents
- Parent data linked (update once, reflect everywhere)
- Flag for incomplete data (missing photo, no emergency contact)
Complete this for all 20 document types. Time-consuming but critical.
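Business rules like the roster's translate almost directly into code. Here is a minimal sketch of rules 1 and 2 from the analysis above (only active students, sorted by last name); the field names follow the entity list from Section 10.2.4, but the function itself is an illustration, not the system's actual implementation.

```javascript
// Sketch: applying the roster's data-preparation rules before rendering.
function prepareRosterStudents(students) {
  return students
    .filter((s) => s.status === 'Active')  // Rule 1: only show active students
    .sort((a, b) => a.last_name.localeCompare(b.last_name)); // Rule 2: sort by last name
}

const roster = prepareRosterStudents([
  { first_name: 'Noah', last_name: 'Baker', status: 'Active' },
  { first_name: 'Liam', last_name: 'Carter', status: 'Inactive' },
  { first_name: 'Emma', last_name: 'Anderson', status: 'Active' },
]);
// roster: Anderson first, then Baker; the inactive Carter is excluded
```

The remaining rules (allergy highlighting, emergency-contact markers, photo placeholders) live in the template and CSS, as Section 10.5.1 shows.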
10.3.3 Pattern Recognition
After analyzing documents, group by pattern:
DIRECTORY PATTERNS (9 documents):
- Student Roster by Class
- Master Student Directory
- Instructor Directory
- Emergency Contact Sheet
- Parent Contact List
[... 4 more]
MASTER-DETAIL PATTERNS (6 documents):
- Report Card (student → grades by class)
- Class Schedule (student → classes enrolled)
- Progress Report (student → current performance)
[... 3 more]
ATOMIC PATTERNS (5 documents):
- Completion Certificate
- Achievement Award
[... 3 more]
This grouping reveals reusable template structures.
10.4 Phase 3: Modeling the Domain
Goal: Create a formal ontology (an entity-relationship model)
10.4.1 Entity Identification
From interviews and documents, extract entities:
Start with the nouns: Student, Parent, Instructor, Class, Subject, Semester, Enrollment, Grade, Attendance, Assignment, Event, Permission, Announcement.
For each entity, define:
ENTITY: Student
────────────────────────────────────────────────────
ATTRIBUTES:
- student_id (unique identifier)
- first_name (required)
- last_name (required)
- grade_level (0-12)
- birth_date (for age calculation)
- photo (optional, but encouraged)
- allergies (important for instructors)
- special_needs (accommodations needed)
- status (Active, Inactive, Alumni)
RELATIONSHIPS:
- Has many: Enrollments
- Has many: Grades (through Enrollments)
- Has many: Attendance records
- Belongs to many: Parents (M:N - siblings share parents)
BUSINESS RULES:
- Must have at least one parent
- At least one parent must be emergency contact
- Cannot enroll in class if already at capacity
- Grade level determines age-appropriate classes
CALCULATED FIELDS:
- full_name = first_name + " " + last_name
- age = derived from birth_date (subtract the birth year; subtract 1 more if this year's birthday hasn't occurred yet)
- current_gpa = weighted average of current grades
Repeat for all 15 entities.
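As one illustration of where this modeling leads, the Student spec above could become a class. The attribute and calculated-field names follow the spec; the class shape and the `ageOn` helper are assumptions for demonstration, not the book's actual code.

```javascript
// Sketch: the Student entity spec translated to a class (illustrative shape).
class Student {
  constructor({ student_id, first_name, last_name, grade_level, birth_date, photo = null }) {
    this.student_id = student_id;   // unique, format "S-###"
    this.first_name = first_name;   // required
    this.last_name = last_name;     // required
    this.grade_level = grade_level; // 0-12
    this.birth_date = birth_date;   // Date, for age calculation
    this.photo = photo;             // optional, but encouraged
    this.status = 'Active';         // Active, Inactive, Alumni
  }

  // Calculated field: full_name = first_name + " " + last_name
  get full_name() {
    return `${this.first_name} ${this.last_name}`;
  }

  // Calculated field: age, adjusted for whether the birthday has passed
  ageOn(date = new Date()) {
    const age = date.getFullYear() - this.birth_date.getFullYear();
    const hadBirthday =
      date.getMonth() > this.birth_date.getMonth() ||
      (date.getMonth() === this.birth_date.getMonth() &&
       date.getDate() >= this.birth_date.getDate());
    return hadBirthday ? age : age - 1;
  }
}

const emma = new Student({
  student_id: 'S-001', first_name: 'Emma', last_name: 'Anderson',
  grade_level: 5, birth_date: new Date(2014, 2, 15), // March 15, 2014
});
// emma.full_name === 'Emma Anderson'
```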
10.4.2 Relationship Mapping
Key relationships determine document complexity:
SIMPLE (1:N):
Student → Attendance records (one student, many dates)
Class → Assignments (one class, many assignments)
MODERATE (N:1):
Enrollments → Class (many students enrolled in one class)
Enrollments → Student (one student in many classes)
COMPLEX (M:N):
Student ↔ Class (via Enrollment junction table)
→ Student can be in multiple classes
→ Class has multiple students
Parent ↔ Student (via Parent_Student junction table)
→ Parent can have multiple children
→ Student can have multiple parents (blended families)
HIERARCHICAL:
Subject → Subject (self-referential)
→ Science → Biology → Marine Biology
Department → Department (organizational hierarchy)
→ Elementary → Grade 5 → Mrs. Smith's Class
Document complexity correlates with relationship complexity:
- Simple relationships → simple documents (rosters)
- Complex relationships → complex documents (report cards with cross-referenced data)
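To make the M:N case concrete, here is a sketch of resolving Student ↔ Class through Enrollment junction records using in-memory data. The field names (`student_id`, `class_id`) mirror the entity list; the data shapes are illustrative.

```javascript
// Sketch: resolving a many-to-many relationship via a junction table.
function classesForStudent(studentId, enrollments, classes) {
  const classIds = enrollments
    .filter((e) => e.student_id === studentId) // this student's junction rows
    .map((e) => e.class_id);
  return classes.filter((c) => classIds.includes(c.class_id));
}

const enrollments = [
  { student_id: 'S-001', class_id: 'C-01' },
  { student_id: 'S-001', class_id: 'C-02' },
  { student_id: 'S-002', class_id: 'C-01' }, // same class, different student
];
const classes = [
  { class_id: 'C-01', title: 'Biology 101' },
  { class_id: 'C-02', title: 'Creative Writing' },
];
const emmaClasses = classesForStudent('S-001', enrollments, classes);
// emmaClasses: Biology 101 and Creative Writing
```

The same traversal, run in the other direction, produces a class roster; that symmetry is why the junction table shows up in so many documents.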
10.4.3 Validation Rules
Schema rules (data format):
student_id: String, required, unique, format "S-###"
grade_level: Integer, required, 0-12
email: String, optional, must be valid email format
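A sketch of how these schema rules might be enforced, assuming "S-###" means an S- prefix followed by three digits; the validator's shape is an illustration, not a prescribed API.

```javascript
// Sketch: schema-level validation for a student record.
function validateStudentRecord(record) {
  const errors = [];
  // student_id: required, unique (uniqueness needs the full dataset), format "S-###"
  if (!/^S-\d{3}$/.test(record.student_id || '')) {
    errors.push('student_id must match format "S-###"');
  }
  // grade_level: required integer, 0-12
  if (!Number.isInteger(record.grade_level) ||
      record.grade_level < 0 || record.grade_level > 12) {
    errors.push('grade_level must be an integer 0-12');
  }
  // email: optional, but must look like an email when present
  if (record.email && !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(record.email)) {
    errors.push('email must be a valid email address');
  }
  return errors;
}

validateStudentRecord({ student_id: 'S-042', grade_level: 5 }); // []
validateStudentRecord({ student_id: '42', grade_level: 14 });   // two errors
```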
Business rules (domain logic):
Rule: Student must have parent
IF Student.exists()
THEN Parent_Student.exists(student_id)
Rule: Emergency contact required
IF Student.status = 'Active'
THEN Parents.any(emergency_contact = true)
Rule: Class capacity
IF Class.current_enrollment >= Class.capacity
THEN Enrollment.new(class_id).reject("Class full")
Rule: Grade-appropriate enrollment
IF Class.grade_min > Student.grade_level
OR Class.grade_max < Student.grade_level
THEN Warning("Student may not be age-appropriate")
These rules become validation code in your system.
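For example, the class-capacity and grade-range rules above could become guard functions like the following sketch. The field names (`capacity`, `grade_min`, `grade_max`) come from the rules as written; the function shape is illustrative.

```javascript
// Sketch: business rules as an enrollment guard.
function checkEnrollment(student, klass) {
  // Hard rule: reject when the class is at capacity
  if (klass.current_enrollment >= klass.capacity) {
    return { ok: false, error: 'Class full' };
  }
  // Soft rule: warn (don't block) when the student is outside the grade range
  const warnings = [];
  if (klass.grade_min > student.grade_level || klass.grade_max < student.grade_level) {
    warnings.push('Student may not be age-appropriate');
  }
  return { ok: true, warnings };
}

const biology = { capacity: 15, current_enrollment: 15, grade_min: 4, grade_max: 6 };
checkEnrollment({ grade_level: 5 }, biology);
// { ok: false, error: 'Class full' }
```

Note the distinction the rules themselves draw: capacity violations are hard errors, while grade-range mismatches are warnings the coordinator can override.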
10.5 Phase 4: Implementation & Encoding Knowledge
Goal: Transform domain knowledge into working code
10.5.1 Template Creation
From document analysis to template:
Take the Student Roster analysis (Section 10.3.2) and turn it into a template:
{{! Student Roster Template }}

{{! Header }}
<div class="header">
  <h1>{{co_op_name}}</h1>
  <h2>Student Roster: {{class.title}}</h2>
  <p>Instructor: {{class.instructor.name}} | {{semester.name}}</p>
</div>

{{! Student List }}
{{#each students}}
<div class="student-card">
  <div class="photo">
    {{#if photo}}
      <img src="{{photo}}" alt="{{full_name}}">
    {{else}}
      <img src="/images/placeholder.png" alt="No photo">
    {{/if}}
  </div>
  <div class="info">
    <h3>{{full_name}}, Grade {{grade_level}}</h3>

    {{! Parents }}
    <p><strong>Parent:</strong>
      {{#each parents}}
        {{name}} ({{phone}})
        {{#if emergency_contact}}★{{/if}}
        {{#unless @last}}, {{/unless}}
      {{/each}}
    </p>

    {{! Allergies - highlighted if present }}
    {{#if allergies}}
      <p class="allergies"><strong>⚠️ Allergies:</strong> {{allergies}}</p>
    {{/if}}
  </div>
</div>
{{/each}}
CSS for highlighting (domain knowledge encoded):
/* Business rule: highlight allergies for safety */
.allergies {
  color: #dc3545;            /* red text */
  font-weight: bold;
  background-color: #fff3cd; /* yellow highlight */
  padding: 4px;
}

/* Business rule: emergency contact indicator */
.emergency-contact::before {
  content: "★ ";
  color: #ffc107;
}
Domain expertise is now executable code.
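For intuition about what the template engine does with this file: the roster template relies on Handlebars-style helpers ({{#each}}, {{#if}}, @last) that require a real engine, but the core idea, substituting data into placeholders, fits in a few lines. This toy renderer handles only flat {{key}} fields and is purely illustrative.

```javascript
// Sketch: the simplest flavor of template rendering -- flat {{key}} substitution.
// (A real system would use Handlebars or similar for loops and conditionals.)
function renderTemplate(source, data) {
  return source.replace(/\{\{(\w+)\}\}/g, (_, key) =>
    data[key] != null ? String(data[key]) : '');
}

const header = renderTemplate('<h1>{{co_op_name}}</h1>', {
  co_op_name: 'Riverside Homeschool Co-op',
});
// header === '<h1>Riverside Homeschool Co-op</h1>'
```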
10.5.2 Business Logic Implementation
From business rules to code:
class ReportCardGenerator {
  async validateStudentForReportCard(student_id, semester_id) {
    const errors = [];
    const warnings = [];

    // Fetch the student record (needed for the name and photo checks below)
    const student = await Student.findById(student_id);

    // Domain rule: must have enrollments
    const enrollments = await Enrollment.findByStudent(student_id, semester_id);
    if (enrollments.length === 0) {
      errors.push({
        rule: 'has_enrollments',
        message: `Student ${student.full_name} not enrolled in any classes`,
        fix: 'Add enrollments in enrollments.csv'
      });
    }

    // Domain rule: all enrolled classes must have a final grade
    for (const enrollment of enrollments) {
      if (!enrollment.final_grade) {
        errors.push({
          rule: 'grades_complete',
          message: `No final grade for ${enrollment.class.title}`,
          fix: 'Enter final grade in grades.csv'
        });
      }
    }

    // Domain rule: should have a photo (warning, not error)
    if (!student.photo) {
      warnings.push({
        rule: 'photo_recommended',
        message: 'Student has no photo',
        fix: 'Upload photo for better presentation'
      });
    }

    return { errors, warnings };
  }

  async calculateGPA(student_id, semester_id) {
    const enrollments = await Enrollment.findByStudent(student_id, semester_id);

    // Domain rule: GPA calculation method
    // (Coordinator specified: weighted average, A=4.0, A-=3.7, B+=3.3, etc.)
    let totalPoints = 0;
    let totalCredits = 0;

    for (const enrollment of enrollments) {
      const gradePoints = this.letterGradeToPoints(enrollment.final_grade);
      const credits = enrollment.class.credits || 1.0;
      totalPoints += gradePoints * credits;
      totalCredits += credits;
    }

    return totalCredits > 0 ? totalPoints / totalCredits : 0;
  }

  letterGradeToPoints(grade) {
    // Domain knowledge: the co-op's grading scale
    const scale = {
      'A': 4.0, 'A-': 3.7,
      'B+': 3.3, 'B': 3.0, 'B-': 2.7,
      'C+': 2.3, 'C': 2.0, 'C-': 1.7,
      'D': 1.0, 'F': 0.0
    };
    return scale[grade] || 0;
  }
}
Every method encodes domain expertise learned from coordinators.
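Because rules like the weighted GPA carry real domain meaning, it pays to check them in isolation. This sketch repeats the scale and loop from calculateGPA() with in-memory data and no database calls; the credit weighting reflects what coordinators said about lab classes.

```javascript
// Sketch: the weighted-GPA rule, checked with in-memory data.
const SCALE = { 'A': 4.0, 'A-': 3.7, 'B+': 3.3, 'B': 3.0, 'B-': 2.7,
                'C+': 2.3, 'C': 2.0, 'C-': 1.7, 'D': 1.0, 'F': 0.0 };

function weightedGPA(enrollments) {
  let totalPoints = 0;
  let totalCredits = 0;
  for (const e of enrollments) {
    const credits = e.credits || 1.0;              // default: 1 credit
    totalPoints += (SCALE[e.final_grade] || 0) * credits;
    totalCredits += credits;
  }
  return totalCredits > 0 ? totalPoints / totalCredits : 0;
}

// A lab class weighted at 1.5 credits pulls the GPA toward its grade:
const gpa = weightedGPA([
  { final_grade: 'A', credits: 1.5 }, // lab class
  { final_grade: 'B', credits: 1.0 },
]);
// gpa = (4.0*1.5 + 3.0*1.0) / 2.5 = 3.6
```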
10.5.3 Default Values & Conventions
Domain conventions become smart defaults:
const DOMAIN_DEFAULTS = {
  // Learned: most co-ops operate on a semester system
  semester_length_weeks: 16,

  // Learned: the typical meeting day is Friday
  default_meeting_day: 'Friday',

  // Learned: most classes are 1 credit
  default_class_credits: 1.0,

  // Learned: typical class size
  default_class_capacity: 15,

  // Learned: grade scale preference
  passing_grade: 'D',
  honor_roll_gpa: 3.5,

  // Learned: document preferences
  default_output_format: 'PDF',
  default_page_orientation: 'Portrait',

  // Learned: coordinator workflow
  current_semester_default: true, // 90% of the time, use the current semester
  active_students_only: true      // default filter
};
These defaults reduce clicks and cognitive load.
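A common way to apply such defaults (a sketch, not the book's actual code) is a shallow merge in which user-supplied options override the domain defaults:

```javascript
// Sketch: merging user options over domain defaults.
function withDefaults(options, defaults) {
  return { ...defaults, ...options }; // later keys (the user's) win
}

const settings = withDefaults(
  { default_class_capacity: 20 },                     // user override
  { default_class_capacity: 15, passing_grade: 'D' }  // domain defaults
);
// settings: { default_class_capacity: 20, passing_grade: 'D' }
```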
10.6 Phase 5: Validation with Users
Goal: Verify that encoded knowledge matches reality
10.6.1 Expert Review Sessions
Show, don't tell:
1. Generate documents using real data
2. Have experts review them for accuracy
3. Ask: "What's wrong? What's missing? What's confusing?"
Review Script:
"Here's a report card I generated for Emma Anderson using your data.
Take a look and tell me what you notice."
[Expert reviews document]
"On a scale of 1-10, how close is this to what you'd create manually?"
Expert: "It's about an 8. The GPA calculation is right, the layout
is good, but I'd include the attendance summary at the top, not the
bottom. And the teacher comments should wrap better..."
"Show me where you'd put attendance."
[Expert marks up document]
"Got it. What else?"
Expert: "The grade for Biology shows 'B+' but I want to show it as
'B+ (89%)' - both letter and percentage."
"Okay, so format change. Anything about the data itself?"
Expert: "Well, this shows Emma's GPA as 3.2, but I calculate it as
3.3. How are you calculating it?"
"I'm using this formula: [explain]. How do you calculate it?"
Expert: "Oh, I weight lab classes as 1.5 credits, not 1.0..."
Key Discovery: Domain nuances you'd never learn without showing real output.
10.6.2 Usability Testing
Watch users actually use the system:
1. Give them a task: "Generate report cards for 5th graders"
2. Don't help unless they're completely stuck
3. Note where they struggle
4. Ask them to think aloud: "What are you looking for? What would you click?"

Common discoveries:
- They don't understand technical terms ("What's 'schema validation'?")
- They expect features in a different order
- They're confused by error messages
- They want confirmation before bulk operations
Fix issues immediately and test again.
10.6.3 Pilot Deployment
Deploy to 2-3 friendly users for a semester:
- Real data, real deadlines
- Weekly check-ins: "What worked? What didn't?"
- Track support requests (what questions do they ask?)
- Measure time savings (compare to the manual process)

Success metrics:
- Users successfully generate all needed documents
- <5% error rate in generated documents
- Users prefer the system to manual methods
- Users recommend it to other coordinators
10.7 Phase 6: Iteration & Refinement
Goal: Continuously improve based on real usage
10.7.1 Feedback Loops
Built-in feedback mechanisms:
┌─────────────────────────────────────────────┐
│ ✓ Report card generated successfully        │
├─────────────────────────────────────────────┤
│                                             │
│ Was this document what you expected?        │
│ 👍 Yes    👎 No, here's what was wrong      │
└─────────────────────────────────────────────┘
Track common issues:
- "23 users clicked 'Generate' but got validation errors" → improve pre-flight validation
- "Users preview the same document 5 times on average" → add confidence-building tips
- "15% of users don't understand the 'semester' dropdown" → change the label to "School term"
10.7.2 Domain Evolution
Domains change over time:
- New regulations (privacy laws, reporting requirements)
- New needs (pandemic → virtual learning documents)
- Growing user sophistication (advanced features requested)

Quarterly review:
1. Interview power users: "What's changed? What's needed?"
2. Review support tickets: which questions are repeated?
3. Check competitor offerings: what features are they adding?
4. Plan the roadmap: what to build next?
10.7.3 Scaling Knowledge Acquisition
As you grow, systematize knowledge capture:
Knowledge Base:
/docs
  /domain
    - homeschool-co-ops-overview.md
    - grading-systems.md
    - semester-vs-year-round.md
    - enrollment-workflows.md
  /patterns
    - report-card-patterns.md
    - roster-patterns.md
    - certificate-patterns.md
  /decisions
    - why-we-calculate-gpa-this-way.md
    - why-we-default-to-current-semester.md
Decision Log:
DECISION: GPA Calculation Method
DATE: 2024-03-15
CONTEXT: Users asked for GPA on report cards
OPTIONS CONSIDERED:
1. Unweighted (all classes count equally)
2. Weighted (lab classes 1.5x, regular 1.0x)
DECISION: Weighted, because coordinators told us lab classes
require more work and should count more
CONSEQUENCES: Need to track class.credits field, slightly more
complex calculation
New team members can read knowledge base and understand domain.
10.8 Common Knowledge Acquisition Mistakes
Mistake 1: Assuming You Know
- Symptom: "I've seen rosters before, I know how they work"
- Result: You build a generic roster and miss domain-specific needs
- Fix: Always interview, even for "obvious" documents

Mistake 2: Interviewing Only One Person
- Symptom: The system works perfectly for one user and confuses everyone else
- Result: A narrow solution that doesn't generalize
- Fix: A minimum of 5 experts, with varied perspectives

Mistake 3: Asking Leading Questions
- Symptom: "Wouldn't it be great if we could..." biases responses
- Result: You build features users don't actually need
- Fix: Ask open-ended questions; observe actual behavior

Mistake 4: Ignoring Edge Cases
- Symptom: "Most students have two parents, we'll assume that"
- Result: The system breaks for single parents and blended families
- Fix: Ask "What are the exceptions? What's unusual?"

Mistake 5: Building Before Validating
- Symptom: The complete system is built before users see anything
- Result: Major rework when users say "this isn't what we meant"
- Fix: Show mockups and samples early and often

Mistake 6: Not Documenting Decisions
- Symptom: Six months later, "Why did we do it this way?"
- Result: Accidental changes break domain logic
- Fix: Maintain a decision log
10.9 Knowledge Acquisition Toolkit
Essential tools for domain knowledge capture:
Interview Tools:
- Audio recorder (with permission) so you don't miss details
- Screen sharing (watch users work)
- Collaborative whiteboard (draw entity relationships together)

Document Analysis Tools:
- A spreadsheet for the document catalog
- A drawing tool for wireframes (Figma, Sketch, even PowerPoint)
- A sample document repository (organized by type)

Modeling Tools:
- An ER diagram tool (draw.io, Lucidchart, or pen and paper)
- A spreadsheet for entity attributes
- A mind mapping tool for concept relationships

Validation Tools:
- Usability testing software (UserTesting, Lookback)
- Analytics (track what users actually do)
- Feedback forms (thumbs up/down, comment boxes)
10.10 Chapter Summary
This chapter established systematic approaches to domain knowledge acquisition:
The Six-Phase Process:
1. Discover: Interview 5-7 experts, walk through the annual calendar, uncover pain points
2. Document: Collect real samples, analyze structure, identify patterns
3. Model: Create an entity-relationship model, define business rules, map relationships
4. Implement: Build templates, encode business logic, set smart defaults
5. Validate: Expert review, usability testing, pilot deployment
6. Iterate: Feedback loops, track issues, evolve with the domain

Interview Techniques:
- Chronological walkthrough (more natural than a feature list)
- "Show me" (concrete examples over abstractions)
- "What's frustrating?" (reveals automation opportunities)
- Open-ended questions (avoid leading responses)

Document Analysis:
- Collect 3+ examples per type
- Analyze purpose, pattern, frequency, and data sources
- Identify business rules and variations
- Group by shared patterns

Domain Modeling:
- Extract entities from nouns
- Map relationships (1:N, M:N, hierarchical)
- Define validation rules (schema + business logic)
- Document calculated fields and conventions

Knowledge Encoding:
- Templates encode document structure
- Business logic encodes domain rules
- Defaults encode common conventions
- All domain knowledge becomes executable code

Validation & Iteration:
- Show, don't tell (real documents with real data)
- Watch users work (usability testing)
- Pilot with friendly users (real-world validation)
- Continuous feedback loops (the system evolves with the domain)
Key Insight: Domain knowledge isn't something you acquire once - it's an ongoing conversation with users that continuously refines the system.
Further Reading
On Knowledge Engineering:
- Feigenbaum, Edward A., and Pamela McCorduck. The Fifth Generation. Addison-Wesley, 1983. (Early knowledge engineering)
- Schreiber, Guus, et al. Knowledge Engineering and Management: The CommonKADS Methodology. MIT Press, 1999.

On Domain-Driven Design:
- Evans, Eric. Domain-Driven Design: Tackling Complexity in the Heart of Software. Addison-Wesley, 2003. (The definitive DDD book)
- Vernon, Vaughn. Implementing Domain-Driven Design. Addison-Wesley, 2013. (Practical implementation)
- Vernon, Vaughn. Domain-Driven Design Distilled. Addison-Wesley, 2016. (Concise introduction)

On Ethnographic Methods:
- Beyer, Hugh, and Karen Holtzblatt. Contextual Design. Morgan Kaufmann, 1997. (Contextual inquiry methodology)
- Spinuzzi, Clay. "The Methodology of Participatory Design." Technical Communication 52 (2005): 163-174.
- Blomberg, Jeanette, and Helena Karasti. "Ethnography: Positioning Ethnography within Participatory Design." In Routledge Handbook of Participatory Design. Routledge, 2012.

On Interview Techniques:
- Portigal, Steve. Interviewing Users: How to Uncover Compelling Insights. Rosenfeld Media, 2013.
- "How to Conduct User Interviews." Nielsen Norman Group. https://www.nngroup.com/articles/user-interviews/

On Participatory Design:
- Schuler, Douglas, and Aki Namioka, eds. Participatory Design: Principles and Practices. CRC Press, 1993.
- Sanders, Elizabeth B.-N., and Pieter Jan Stappers. Convivial Toolbox: Generative Research for the Front End of Design. BIS Publishers, 2012.

On Requirements Engineering:
- Wiegers, Karl, and Joy Beatty. Software Requirements, 3rd Edition. Microsoft Press, 2013. (Comprehensive requirements guide)
- Robertson, Suzanne, and James Robertson. Mastering the Requirements Process. Addison-Wesley, 2012.

On Community-Driven Development:
- Raymond, Eric S. The Cathedral and the Bazaar. O'Reilly, 1999. (Open source development models)
- "Building Community." Open Source Guides. https://opensource.guide/building-community/ (Community building practices)

Related Patterns in This Trilogy:
- Volume 2, Pattern 11 (Historical Pattern Matching): Learning from similar past cases
- Volume 2, Pattern 16 (Automated Pattern Mining): Discovering patterns in domain data
- All patterns in all volumes emerged from domain knowledge acquisition

Tools for Knowledge Capture:
- Miro: https://miro.com/ (Visual collaboration for domain modeling)
- Notion: https://www.notion.so/ (Knowledge base and documentation)
- Roam Research: https://roamresearch.com/ (Networked thought for domain knowledge)