

fdy-scraping

fdy-scraping is a versatile HTTP client designed for making API requests with support for proxy configuration, debugging, and detailed error handling. It utilizes the got-scraping library for HTTP operations.

Installation

To use fdy-scraping, install the package via npm:

npm install fdy-scraping

Usage

Importing the Client

Import fdy-scraping into your Node.js application:

const fdy = require("fdy-scraping");

or, using ES modules:

import fdy from "fdy-scraping";

Creating a Client Instance

Create a new client instance with custom options:

create(options?: FetchClientOptions, debug?: boolean): FdyFetchClient;

const client = fdy.create(
  {
    headers: { Authorization: "Bearer your-token" },
    proxy: {
      ip: "127.0.0.1",
      port: 8080,
      protocol: "http",
      username: "username",
      password: "password",
    },
    baseUrl: "https://api.example.com",
  },
  true // set debug mode to true
);

Making HTTP Requests

You can use the client to make various types of HTTP requests:

Request

request(url, method, body = undefined, headers = undefined, options = undefined)

client
  .request("/endpoint", "POST", undefined, { Accept: "application/json" })
  .then((response) => {
    console.log(response.data); // Response data
  })
  .catch((error) => {
    console.error("Error:", error.message);
  });

GET Request

get(url, headers = {}, options = {}, enableDebug = false)

client
  .get("/endpoint", { Accept: "application/json" })
  .then((response) => {
    console.log(response.data); // Response data
  })
  .catch((error) => {
    console.error("Error:", error.message);
  });

POST Request

post(url, body = undefined, headers = {}, options = {}, enableDebug = false)

client
  .post("/endpoint", JSON.stringify({ key: "value" }), {
    "Content-Type": "application/json",
  })
  .then((response) => {
    console.log(response.data); // Response data
  })
  .catch((error) => {
    console.error("Error:", error.message);
  });

PUT Request

put(url, body = undefined, headers = {}, options = {}, enableDebug = false)

client
  .put("/endpoint", JSON.stringify({ key: "new-value" }), {
    "Content-Type": "application/json",
  })
  .then((response) => {
    console.log(response.data); // Response data
  })
  .catch((error) => {
    console.error("Error:", error.message);
  });

DELETE Request

delete(url, headers = {}, options = {}, enableDebug = false)

client
  .delete("/endpoint", { Accept: "application/json" })
  .then((response) => {
    console.log(response.data); // Response data
  })
  .catch((error) => {
    console.error("Error:", error.message);
  });
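
Each of these methods returns a promise, so the same requests can also be written with async/await. Below is a brief sketch, assuming the client created earlier and a placeholder /endpoint route:

async function fetchEndpoint() {
  try {
    // Equivalent to the .then/.catch GET example above
    const response = await client.get("/endpoint", { Accept: "application/json" });
    console.log(response.data); // Response data
  } catch (error) {
    console.error("Error:", error.message);
  }
}

fetchEndpoint();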

Error Handling

fdy-scraping provides detailed error information through the FdyFetchClientError class. Errors include the status code, response data, and request configuration.

client.get("/invalid-endpoint").catch((error) => {
  if (error instanceof fdy.FdyFetchClientError) {
    console.error("Custom Error Details:");
    console.error("Status:", error.status);
    console.error("Response:", error.response);
    console.error("Config:", error.config);
  } else {
    console.error("General Error:", error.message);
  }
});

Configuration Options

FetchClientOptions

  • headers: Optional object containing default headers for requests.
  • proxy: Optional object for proxy configuration.
    • ip: IP address of the proxy server.
    • port: Port of the proxy server.
    • protocol: Protocol used by the proxy (http or https).
    • username: Optional username for proxy authentication.
    • password: Optional password for proxy authentication.
  • baseUrl: Optional base URL for requests.
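
All of these fields are optional, so a minimal configuration can omit the proxy entirely. A short sketch with placeholder values:

// Minimal configuration: default headers and a base URL, no proxy
const minimalClient = fdy.create({
  headers: { Accept: "application/json" },
  baseUrl: "https://api.example.com",
});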

Debug Mode

Debug mode can be enabled by passing true as the second argument to fdy.create. When enabled, additional error information is logged to the console.
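
For example, a sketch with a placeholder base URL:

// The second argument enables debug mode; extra error details are logged to the console
const debugClient = fdy.create({ baseUrl: "https://api.example.com" }, true);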

License

This project is licensed under the MIT License - see the LICENSE file for details.


Feel free to adjust the paths, options, or methods according to your actual implementation and package setup.