# Research Methodologies

## Domain-Specific Research Approaches

### Technical Research Methodologies

#### Technology Assessment Framework

- **Capability Analysis**: Feature sets, performance characteristics, scalability limits
- **Implementation Evaluation**: Complexity, learning curve, integration requirements
- **Ecosystem Assessment**: Community support, documentation quality, maintenance status
- **Performance Benchmarking**: Speed, resource usage, throughput comparisons
- **Security Analysis**: Vulnerability assessment, security model evaluation
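
To make the five dimensions comparable across candidate technologies, they can be rolled up into a weighted score. The sketch below is one minimal way to do that in Python; the 1–5 scale, the weights, and the candidate names are illustrative assumptions, not values the framework prescribes.

```python
from dataclasses import dataclass

# Illustrative weights for the five assessment dimensions (assumed, not prescribed).
WEIGHTS = {
    "capability": 0.30,
    "implementation": 0.20,
    "ecosystem": 0.20,
    "performance": 0.15,
    "security": 0.15,
}

@dataclass
class TechnologyAssessment:
    """One candidate technology, rated 1-5 on each dimension."""
    name: str
    capability: int
    implementation: int
    ecosystem: int
    performance: int
    security: int

    def weighted_score(self) -> float:
        # Weighted average, so strong capability cannot hide a weak security story.
        return sum(WEIGHTS[dim] * getattr(self, dim) for dim in WEIGHTS)

candidates = [
    TechnologyAssessment("framework-a", 5, 3, 4, 4, 3),
    TechnologyAssessment("framework-b", 4, 4, 5, 3, 4),
]
for tech in sorted(candidates, key=TechnologyAssessment.weighted_score, reverse=True):
    print(f"{tech.name}: {tech.weighted_score():.2f}")
```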

#### Technical Source Priorities

1. **Official Documentation**: Primary source for capabilities and limitations
2. **GitHub Repositories**: Code quality, activity level, issue resolution patterns
3. **Technical Blogs**: Implementation experiences, best practices, lessons learned
4. **Stack Overflow**: Common problems, community solutions, adoption challenges
5. **Benchmark Studies**: Performance comparisons, scalability test results

### Market Research Methodologies

#### Market Analysis Framework

- **Market Sizing**: TAM/SAM/SOM analysis, growth rate assessment (see the worked example after this list)
- **Competitive Landscape**: Player mapping, market share analysis, positioning
- **Customer Segmentation**: Demographics, psychographics, behavioral patterns
- **Trend Analysis**: Market direction, disruption potential, timing factors
- **Opportunity Assessment**: Market gaps, underserved segments, entry barriers
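
To make the TAM/SAM/SOM funnel in the first bullet concrete, here is a minimal worked example; every figure in it is invented for illustration.

```python
# Hypothetical sizing for a developer-tools product; all numbers are invented.
total_developers = 27_000_000   # TAM population: all professional developers
avg_annual_spend = 200.0        # assumed spend per developer per year (USD)

tam = total_developers * avg_annual_spend  # Total Addressable Market
sam = tam * 0.15  # Serviceable Addressable: the 15% reachable with current product and channels
som = sam * 0.05  # Serviceable Obtainable: a 5% realistic near-term share of SAM

for label, value in [("TAM", tam), ("SAM", sam), ("SOM", som)]:
    print(f"{label}: ${value:,.0f}")
# TAM: $5,400,000,000 / SAM: $810,000,000 / SOM: $40,500,000
```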

#### Market Source Priorities

1. **Industry Reports**: Analyst research, market studies, trend analyses
2. **Financial Data**: Public company reports, funding announcements, valuations
3. **Survey Data**: Customer research, market studies, adoption surveys
4. **Trade Publications**: Industry news, expert opinions, market insights
5. **Government Data**: Economic indicators, regulatory information, statistics

### User Research Methodologies

#### User-Centered Research Framework

- **Behavioral Analysis**: User journey mapping, interaction patterns, pain points
- **Needs Assessment**: Jobs-to-be-done analysis, unmet needs identification
- **Experience Evaluation**: Usability assessment, satisfaction measurement
- **Preference Research**: Feature prioritization, willingness to pay, adoption factors
- **Context Analysis**: Use case scenarios, environmental factors, constraints

#### User Research Source Priorities

1. **User Studies**: Direct research, surveys, interviews, focus groups
2. **Product Reviews**: Customer feedback, ratings, detailed experiences
3. **Social Media**: User discussions, complaints, feature requests
4. **Support Forums**: Common issues, user questions, community solutions
5. **Analytics Data**: Usage patterns, conversion rates, engagement metrics

### Competitive Research Methodologies

#### Competitive Intelligence Framework

- **Feature Comparison**: Capability matrices, feature gap analysis (sketched after this list)
- **Strategic Analysis**: Business model evaluation, positioning assessment
- **Performance Benchmarking**: Speed, reliability, user experience comparisons
- **Market Position**: Share analysis, customer perception, brand strength
- **Innovation Tracking**: Product roadmaps, patent filings, investment areas
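
A capability matrix is, at its simplest, a product-by-feature table from which gaps fall out directly. A minimal sketch follows; the product and feature names are placeholders, not real competitors.

```python
# Capability matrix: which product supports which feature (placeholder names).
matrix = {
    "our-product":  {"sso", "audit-log", "api"},
    "competitor-x": {"sso", "audit-log", "api", "self-host"},
    "competitor-y": {"sso", "api"},
}

# The full feature space is the union of everything any product offers.
all_features = set().union(*matrix.values())

# Feature gap analysis: what each product lacks relative to the field.
for product, features in matrix.items():
    gaps = sorted(all_features - features)
    print(f"{product}: missing {gaps or 'nothing'}")
```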

#### Competitive Source Priorities

1. **Competitor Websites**: Product information, pricing, positioning messages
2. **Product Demos**: Hands-on evaluation, feature testing, user experience
3. **Press Releases**: Strategic announcements, product launches, partnerships
4. **Analyst Reports**: Third-party assessments, market positioning studies
5. **Customer Feedback**: Reviews comparing competitors, switching reasons

### Scientific Research Methodologies

#### Scientific Analysis Framework

- **Literature Review**: Peer-reviewed research, citation analysis, consensus building
- **Methodology Assessment**: Research design quality, statistical validity, reproducibility
- **Evidence Evaluation**: Study quality, sample sizes, control factors
- **Consensus Analysis**: Scientific agreement levels, controversial areas
- **Application Assessment**: Practical implications, implementation feasibility

#### Scientific Source Priorities

1. **Peer-Reviewed Journals**: Primary research, systematic reviews, meta-analyses
2. **Academic Databases**: Research repositories, citation networks, preprints
3. **Conference Proceedings**: Latest research, emerging trends, expert presentations
4. **Expert Opinions**: Thought leader insights, expert interviews, panel discussions
5. **Research Institutions**: University studies, lab reports, institutional research

## Research Quality Standards

### Source Credibility Assessment

#### Primary Source Evaluation

- **Authority**: Expertise of authors, institutional affiliation, credentials
- **Accuracy**: Fact-checking, peer review process, error correction mechanisms
- **Objectivity**: Bias assessment, funding sources, conflict of interest disclosure
- **Currency**: Publication date, information recency, update frequency
- **Coverage**: Scope comprehensiveness, detail level, methodology transparency
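
One way to make these five criteria actionable is to rate each on a small ordinal scale and average the result. A minimal sketch; the 0–2 scale and the equal weighting are assumptions for illustration, not a standard the method mandates.

```python
CRITERIA = ("authority", "accuracy", "objectivity", "currency", "coverage")

def credibility_score(ratings: dict[str, int]) -> float:
    """Average of 0-2 ratings (0 = weak, 1 = adequate, 2 = strong) across criteria."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)

# Hypothetical rating of a vendor blog post: current but not objective.
vendor_blog = {"authority": 1, "accuracy": 1, "objectivity": 0, "currency": 2, "coverage": 1}
print(credibility_score(vendor_blog))  # 1.0 -> treat as a mid-tier source
```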

#### Secondary Source Validation

- **Citation Quality**: Primary source references, citation accuracy, source diversity
- **Synthesis Quality**: Analysis depth, logical coherence, balanced perspective
- **Author Expertise**: Subject matter knowledge, track record, reputation
- **Publication Standards**: Editorial process, fact-checking procedures, corrections policy
- **Bias Assessment**: Perspective limitations, stakeholder influences, agenda identification

### Information Synthesis Approaches

#### Multi-Perspective Integration

- **Convergence Analysis**: Identify areas where sources agree consistently
- **Divergence Documentation**: Note significant disagreements and analyze causes
- **Confidence Weighting**: Assign confidence levels based on source quality and consensus (see the sketch after this list)
- **Gap Identification**: Recognize areas lacking sufficient information or research
- **Uncertainty Quantification**: Document limitations and areas of unclear evidence
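
Confidence weighting can be sketched as a weighted vote: each source contributes its credibility weight for or against a claim, and the claim's confidence is the share of weighted support. The weights and source descriptions below are illustrative.

```python
# Each finding: (supports_claim, source_credibility_weight); values are illustrative.
evidence = [
    (True, 2.0),   # peer-reviewed study agrees
    (True, 1.5),   # analyst report agrees
    (False, 0.5),  # single blog post disagrees
]

support = sum(weight for agrees, weight in evidence if agrees)
total = sum(weight for _, weight in evidence)
confidence = support / total

print(f"weighted agreement: {confidence:.0%}")  # 88% -> leans toward high confidence
```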

#### Evidence Hierarchy

1. **High Confidence**: Multiple credible sources, recent information, expert consensus
2. **Medium Confidence**: Some credible sources, mixed consensus, moderate currency
3. **Low Confidence**: Limited sources, significant disagreement, dated information
4. **Speculative**: Minimal evidence, high uncertainty, expert opinion only
5. **Unknown**: Insufficient information available for assessment
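
The hierarchy can be approximated as a decision rule over source count, weighted agreement, and recency. The cut-offs below are illustrative assumptions, not canonical thresholds.

```python
def evidence_tier(credible_sources: int, agreement: float, age_years: float) -> str:
    """Classify a claim into the five-tier hierarchy above.

    agreement: credibility-weighted share of sources that support the claim.
    age_years: age of the most recent credible source.
    All thresholds are illustrative assumptions.
    """
    if credible_sources == 0:
        return "Unknown"
    if credible_sources == 1:
        return "Speculative"
    if agreement >= 0.8 and age_years <= 2:
        return "High Confidence"
    if agreement >= 0.6 and age_years <= 5:
        return "Medium Confidence"
    return "Low Confidence"

print(evidence_tier(credible_sources=4, agreement=0.88, age_years=1))  # High Confidence
print(evidence_tier(credible_sources=2, agreement=0.55, age_years=6))  # Low Confidence
```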

## Domain-Specific Analysis Frameworks

### Technical Analysis Framework

- **Feasibility Assessment**: Technical viability, implementation complexity, resource requirements
- **Scalability Analysis**: Performance under load, growth accommodation, architectural limits
- **Integration Evaluation**: Compatibility assessment, integration complexity, ecosystem fit
- **Maintenance Considerations**: Support requirements, update frequency, long-term viability
- **Risk Assessment**: Technical risks, dependency risks, obsolescence potential

### Business Analysis Framework

- **Value Proposition**: Customer value delivery, competitive advantage, market differentiation
- **Financial Impact**: Cost analysis, revenue potential, ROI assessment, budget implications (a worked ROI example follows this list)
- **Strategic Alignment**: Goal consistency, priority alignment, resource allocation fit
- **Implementation Feasibility**: Resource requirements, timeline considerations, capability gaps
- **Risk-Benefit Analysis**: Potential rewards versus implementation risks and costs
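
ROI assessment reduces to a short calculation once costs and benefits have been estimated; the figures below are invented to show the shape of it, not to suggest typical values.

```python
# Invented figures for a one-year horizon (USD).
implementation_cost = 120_000.0  # build + rollout
annual_run_cost = 30_000.0       # hosting, support, maintenance
annual_benefit = 250_000.0       # revenue gain + cost savings attributed to the change

net_gain = annual_benefit - annual_run_cost - implementation_cost
roi = net_gain / (implementation_cost + annual_run_cost)

print(f"first-year ROI: {roi:.0%}")  # 67%
```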

### User Impact Framework

- **User Experience**: Ease of use, learning curve, satisfaction factors, accessibility
- **Adoption Factors**: Barriers to adoption, motivation drivers, change management needs
- **Value Delivery**: User benefit realization, problem solving effectiveness, outcome achievement
- **Support Requirements**: Training needs, documentation requirements, ongoing support
- **Success Metrics**: User satisfaction measures, adoption rates, outcome indicators

## Research Coordination Best Practices

### Multi-Researcher Coordination

- **Perspective Assignment**: Clear domain boundaries, minimal overlap, comprehensive coverage
- **Communication Protocols**: Regular check-ins, conflict resolution processes, coordination methods
- **Quality Standards**: Consistent source credibility requirements, analysis depth expectations
- **Timeline Management**: Milestone coordination, dependency management, delivery synchronization
- **Integration Planning**: Synthesis approach design, conflict resolution strategies, gap handling

### Research Efficiency Optimization

- **Source Sharing**: Avoid duplicate source evaluation across researchers (see the registry sketch after this list)
- **Finding Coordination**: Share relevant discoveries between perspectives
- **Quality Checks**: Cross-validation of key findings, source verification collaboration
- **Scope Management**: Prevent research scope creep, maintain focus on objectives
- **Resource Optimization**: Leverage each researcher's domain expertise most effectively
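
Source sharing amounts to deduplicating against a shared registry before a researcher invests time in an evaluation. A minimal sketch, assuming sources are keyed by a normalized URL; the class and its normalization rule are illustrative, not part of any BMAD tooling.

```python
from urllib.parse import urlsplit

class SourceRegistry:
    """Shared registry so two researchers never evaluate the same source twice."""

    def __init__(self) -> None:
        self._claimed: dict[str, str] = {}  # normalized URL -> researcher

    @staticmethod
    def _normalize(url: str) -> str:
        # Treat scheme, host case, and trailing slashes as irrelevant.
        parts = urlsplit(url)
        return f"{parts.netloc.lower()}{parts.path.rstrip('/')}"

    def claim(self, url: str, researcher: str) -> bool:
        """Return True if this researcher is the first to claim the source."""
        key = self._normalize(url)
        if key in self._claimed:
            return False
        self._claimed[key] = researcher
        return True

registry = SourceRegistry()
print(registry.claim("https://example.com/report/", "market-analyst"))  # True
print(registry.claim("https://EXAMPLE.com/report", "tech-analyst"))     # False (duplicate)
```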