sf-scraper

Scrape employee data from a logged-in SAP SuccessFactors browser session using browser automation. Use when: user provides an employee ID and wants employee details (name, email, department, manager, etc.) scraped directly from the SuccessFactors UI — NOT via OData/API. Requires the user to have SuccessFactors open and logged in via Chrome with the OpenClaw Browser Relay extension attached. Triggers on: "get employee name", "look up employee", "scrape SF", "find employee in SuccessFactors", or any request combining an employee ID with SuccessFactors data lookup.

Safety Notice

This listing is from the official public ClawHub registry. Review SKILL.md and referenced scripts before running.

Install skill "sf-scraper" with this command: npx skills add venkatalokesh-dot/sf-scrapper

SF Scraper — SuccessFactors Browser Scraping Skill

Scrape employee data from a live, logged-in SAP SuccessFactors session via browser automation.

Prerequisites

  • User must have SAP SuccessFactors open and logged in on a Chrome tab.
  • OpenClaw Browser Relay Chrome extension must be active (badge ON) on that tab.
  • Use profile="chrome" for all browser calls.

Workflow

1. Discover the SuccessFactors Base URL

Take a snapshot of the attached Chrome tab to identify the current SF domain:

browser(action="snapshot", profile="chrome", compact=true)

Extract the base URL (e.g., https://<company>.successfactors.com or https://pmsalesdemo8.successfactors.com).

2. Navigate to Employee Search / People Profile

SuccessFactors supports deep-link URLs. Use the following pattern to go directly to an employee's profile:

Primary pattern (People Profile / Live Profile):

{base_url}/sf/liveprofile?selected_user={employee_id}

Fallback patterns if the primary doesn't work:

{base_url}/xi/ui/peopleprofile/pages/index.xhtml?selected_user={employee_id}
{base_url}/sf/home?selected_user={employee_id}

Navigate using:

browser(action="navigate", profile="chrome", targetUrl="{constructed_url}")

Wait briefly for the page to load, then snapshot.
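The deep-link patterns above can be collected into an ordered candidate list and tried first to last. A sketch under that assumption; `candidate_profile_urls` is a hypothetical helper, not an existing API:

```python
def candidate_profile_urls(base_url: str, employee_id: str) -> list[str]:
    """Primary deep link first, then the fallbacks, tried in order."""
    base = base_url.rstrip("/")
    return [
        f"{base}/sf/liveprofile?selected_user={employee_id}",
        f"{base}/xi/ui/peopleprofile/pages/index.xhtml?selected_user={employee_id}",
        f"{base}/sf/home?selected_user={employee_id}",
    ]
```

Navigate to each URL in turn, snapshot, and stop at the first one that renders a profile.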

3. If Deep Link Fails — Use Search

If the deep link lands on a generic page or errors, fall back to the global search:

  1. Snapshot the page to find the search bar (usually top-right, role: searchbox or textbox with name containing "Search").
  2. Click the search box, type the employee ID, press Enter.
  3. Snapshot results, click the matching employee profile link.

4. Scrape Employee Data

Once on the profile page, take a snapshot:

browser(action="snapshot", profile="chrome", compact=true)

Extract the following fields from the rendered accessibility tree:

| Field | Where to Look |
| --- | --- |
| Name | Page heading (heading role), or prominent text near the avatar |
| Employee ID | Usually in a details section, or in the URL itself |
| Email | A link with `mailto:`, or text containing `@` |
| Job Title | Near the name, often under the heading |
| Department | In the profile details / info card |
| Manager | In the profile details, often a clickable link |
| Location | In the profile details section |
| Phone | In the contact info section |

Not all fields will always be visible — return what's available.
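Extraction from the snapshot can be sketched as best-effort pattern matching over the flattened accessibility text. This assumes `Label: value` lines, which a real SF layout may not use; `extract_fields` is illustrative only, not part of the skill:

```python
import re

def extract_fields(snapshot_text: str) -> dict[str, str]:
    """Best-effort extraction; only fields actually present are returned."""
    fields: dict[str, str] = {}
    # Email: any mailto-style address anywhere in the snapshot.
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", snapshot_text)
    if email:
        fields["Email"] = email.group(0)
    # Labeled lines like "Department: Engineering" or "Manager - Jane Smith".
    for label in ("Employee ID", "Job Title", "Department",
                  "Manager", "Location", "Phone"):
        m = re.search(rf"{re.escape(label)}\s*[:\-]\s*(.+)", snapshot_text)
        if m:
            fields[label] = m.group(1).strip()
    return fields
```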

5. If Profile Page Uses Tabs/Sections

SuccessFactors profiles often have tabs (Personal Info, Employment Info, Job Info, etc.). If needed data isn't visible:

  1. Snapshot to find tab elements.
  2. Click the relevant tab (e.g., "Personal Information", "Job Information", "Employment Details").
  3. Snapshot again and extract.

6. Return Results

Format results clearly:

Employee: John Doe
ID: 12345
Email: john.doe@company.com
Title: Senior Developer
Department: Engineering
Manager: Jane Smith
Location: Bangalore, India

Only include fields that were actually found on the page. Do not guess or fabricate data.
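The "only include fields found" rule can be sketched as an ordered formatter that skips missing keys; `format_result` is a hypothetical name:

```python
def format_result(fields: dict[str, str]) -> str:
    """Render only the fields that were found, in a stable order."""
    order = ["Employee", "ID", "Email", "Title",
             "Department", "Manager", "Location", "Phone"]
    return "\n".join(f"{key}: {fields[key]}" for key in order if key in fields)
```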

Configuration

The user should create a config entry in TOOLS.md (or elsewhere in the workspace) with:

### SuccessFactors
- Base URL: https://yourcompany.successfactors.com

If no base URL is configured, discover it from the currently open tab.

Error Handling

  • Not logged in: If snapshot shows a login page, tell the user to log in first.
  • Access denied / no profile found: Report clearly — the user may lack permissions for that employee.
  • Page timeout: Retry snapshot once after 3 seconds. If still loading, inform the user.
  • Search returns multiple results: List them and ask the user to clarify.
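The timeout rule above (retry the snapshot once after 3 seconds) might look like the following sketch; `take_snapshot` stands in for the actual `browser(action="snapshot", ...)` call, and the "still loading" check is a placeholder heuristic:

```python
import time

def snapshot_with_retry(take_snapshot, retry_delay: float = 3.0) -> str:
    """Snapshot once; if the page still appears to be loading,
    wait and retry exactly once before giving up."""
    snap = take_snapshot()
    if "loading" in snap.lower():
        time.sleep(retry_delay)
        snap = take_snapshot()
    return snap
```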

Batch Mode

If the user provides multiple employee IDs, iterate through them sequentially using the same workflow. Collect the results and present them as a table.
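A minimal sketch of the batch loop, where `scrape_one` stands in for the full single-employee workflow above and `as_table` is an illustrative plain-text renderer:

```python
def scrape_batch(employee_ids, scrape_one) -> dict:
    """Run the single-employee workflow for each ID in turn."""
    return {emp_id: scrape_one(emp_id) for emp_id in employee_ids}

def as_table(results: dict) -> str:
    """Render collected results as a simple two-column text table."""
    rows = ["ID | Name", "---|----"]
    for emp_id, fields in results.items():
        rows.append(f"{emp_id} | {fields.get('Name', '(not found)')}")
    return "\n".join(rows)
```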

Important Notes

  • Never use OData API, REST endpoints, or any programmatic API. This skill is purely browser-based scraping.
  • Always use profile="chrome" — never profile="openclaw" (we need the user's authenticated session).
  • Be patient with page loads — SF can be slow. Use snapshots to verify page state before extracting.
  • Respect the user's session — don't navigate away from SF without warning.

