VYGA Data is the foundational engine of our cross-chain product discovery platform, transforming data sourced from various web applications into purchasable items in the form of NFTs (non-fungible tokens).
Our mission is to simplify the entire product discovery journey, from extracting data from third-party web applications to presenting our clients with a comprehensive overview of the best products.
This data is securely housed within a central database.
Once stored, VYGA Data converts this data into a unique, customizable NFT collection tailored to specific requirements.
The products displayed can be purchased directly on VYGA, with transactions settled in HBAR.
Our platform offers an end-to-end experience, spanning data acquisition, product discovery, user-created collections, order management, and user engagement.
VYGA Data lies at the core of our commitment to delivering innovative data solutions for businesses, developers, and individuals seeking to unlock the full potential of data for both profit and progress on their Web3 journey.
The linked guide covers both installation and uninstallation; follow the installation instructions.
virtualenv env
source env/bin/activate
Clone this project and install the dependencies listed in requirements.txt via pip:
pip install -r requirements.txt
Create a .env file with the following variables:
SQL_DATABASE=
TEST_DATABASE_NAME=
X_FRAME_OPTIONS=
server=
port=
username=
password=
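How the Django settings module consumes these variables is not shown here; the following is a minimal sketch, assuming a PostgreSQL backend and reading `os.environ` directly (in practice a loader such as python-dotenv would populate the environment from the `.env` file). All values below are hypothetical placeholders.

```python
import os

# Hypothetical placeholder values for illustration only; real values
# belong in the .env file, never in source control.
os.environ.setdefault("SQL_DATABASE", "vyga")
os.environ.setdefault("server", "localhost")
os.environ.setdefault("port", "5432")
os.environ.setdefault("username", "vyga_user")
os.environ.setdefault("password", "change-me")

# A settings module might assemble Django's DATABASES entry
# from these environment variables like so:
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",  # assumed backend
        "NAME": os.environ["SQL_DATABASE"],
        "HOST": os.environ["server"],
        "PORT": os.environ["port"],
        "USER": os.environ["username"],
        "PASSWORD": os.environ["password"],
    }
}
```

Keeping credentials in the environment rather than in settings files means the same codebase can run against the development, test, and production databases without modification.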
Run the database migrations:
python manage.py makemigrations --settings=Aggregators.test_settings
python manage.py migrate --settings=Aggregators.test_settings
Create an admin user:
python manage.py createsuperuser --settings=Aggregators.test_settings
The Django setup is now complete; you can verify it by running the development server:
python manage.py runserver --settings=Aggregators.test_settings
To add a new package, add it to the requirements.in file, then run ./requirements/update_requirements.sh
For more info, check here: pip-tools
For more details on the scraping panel, please check out this document.
Thanks