Compare commits
16 commits
busti/prot...stable
9acf5a97e2
8d64a3c528
0fd49bc023
6b5ccc2be6
bbc2faeb7b
48b9e595ff
f8dacef309
c4c49931a4
bea56f101a
8cf0897ec5
f2db5d9dad
b3bae6f5ad
52dbe93d3c
f4894d3a8c
93335b2776
0d394d531b
163 changed files with 2803 additions and 59448 deletions
.build.yml, .gitignore, README.md, docker-compose.yml
backend/: .idea, authentication, configure.py, hostadmin, toolshed
deploy/: dev, Dockerfile.backend, Dockerfile.dns, Dockerfile.frontend, Dockerfile.proxy, Dockerfile.wiki, dns_server.py, docker-compose.override.yml, instance_a, instance_b, zone.json
docs/
frontend/: Dockerfile, fullchain.pem, nginx.conf, node_modules, package-lock.json, package.json, privkey.pem
frontend/public/assets/img/: avatars, photos
frontend/src/: App.vue, assets/css, assets/fonts
frontend/src/assets/fonts/: Inter-Black.woff, Inter-Black.woff2, Inter-BlackItalic.woff, Inter-BlackItalic.woff2, Inter-Bold.woff, Inter-Bold.woff2, Inter-BoldItalic.woff, Inter-BoldItalic.woff2, Inter-ExtraBold.woff, Inter-ExtraBold.woff2, Inter-ExtraBoldItalic.woff, Inter-ExtraBoldItalic.woff2, Inter-ExtraLight-BETA.woff, Inter-ExtraLight-BETA.woff2, Inter-ExtraLightItalic-BETA.woff, Inter-ExtraLightItalic-BETA.woff2, Inter-Italic.woff, Inter-Italic.woff2, Inter-Light-BETA.woff, Inter-Light-BETA.woff2, Inter-LightItalic-BETA.woff, Inter-LightItalic-BETA.woff2, Inter-Medium.woff, Inter-Medium.woff2, Inter-MediumItalic.woff, Inter-MediumItalic.woff2, Inter-Regular.woff, Inter-Regular.woff2, Inter-SemiBold.woff, Inter-SemiBold.woff2, Inter-SemiBoldItalic.woff, Inter-SemiBoldItalic.woff2, Inter-Thin-BETA.woff, Inter-Thin-BETA.woff2, Inter-ThinItalic-BETA.woff, Inter-ThinItalic-BETA.woff2
@@ -13,4 +13,5 @@ steps:
 - apk add --no-cache gcc musl-dev python3-dev
 - pip install --upgrade pip && pip install -r requirements.txt
 - python3 configure.py
-- coverage run --parallel-mode --concurrency=multiprocessing manage.py test --parallel=$(nproc) && coverage report
+- coverage run manage.py test
+- coverage report
.gitignore (vendored): 4 changes
@@ -130,5 +130,5 @@ dmypy.json
 
 staticfiles/
 userfiles/
-backend/templates/
-backend/testdata.py
+testdata.py
+*.sqlite3
README.md: 43 changes
@@ -1,6 +1,6 @@
 # toolshed
 
-## Installation / Development
+## Development
 
 ``` bash
 git clone https://github.com/gr4yj3d1/toolshed.git
@@ -12,7 +12,10 @@ or
 git clone https://git.neulandlabor.de/j3d1/toolshed.git
 ```
 
-### Backend
+all following development mode commands support auto-reloading and hot-reloading where applicable, they do not need to be
+restarted after changes.
 
+### Backend only
+
 ``` bash
 cd toolshed/backend
@@ -26,7 +29,7 @@ python manage.py runserver 0.0.0.0:8000 --insecure
 to run this properly in production, you need to configure a webserver to serve the static files and proxy the
 requests to the backend, then run the backend with just `python manage.py runserver` without the `--insecure` flag.
 
-### Frontend
+### Frontend only
 
 ``` bash
 cd toolshed/frontend
@@ -34,13 +37,45 @@ npm install
 npm run dev
 ```
 
-### Docs
+### Docs only
 
 ``` bash
 cd toolshed/docs
 mkdocs serve
 ```
 
+### Full stack
+
+``` bash
+cd toolshed
+docker-compose -f deploy/docker-compose.override.yml up --build
+```
+
+## Deployment
+
+### Requirements
+
+- python3
+- python3-pip
+- python3-venv
+- wget
+- unzip
+- nginx
+- uwsgi
+
+### Installation
+
+* Get the latest release from
+  `https://git.neulandlabor.de/j3d1/toolshed/releases/download/<version>/toolshed.zip` or
+  `https://github.com/gr4yj3d1/toolshed/archive/refs/tags/<version>.zip`.
+* Unpack it to `/var/www` or wherever you want to install toolshed.
+* Create a virtual environment and install the requirements.
+* Then run the configuration script.
+* Configure your webserver to serve the static files and proxy the requests to the backend.
+* Configure your webserver to run the backend with uwsgi.
+
+for detailed instructions see [docs](/docs/deployment.md).
+
 ## CLI Client
 
 ### Requirements
backend/.idea/.gitignore (generated, vendored): 2 changes
@@ -6,3 +6,5 @@
 # Datasource local storage ignored files
 /dataSources/
 /dataSources.local.xml
+# GitHub Copilot persisted chat sessions
+/copilot/chatSessions
@@ -60,6 +60,8 @@ def verify_incoming_friend_request(request, raw_request_body):
         befriender_key = request.data['befriender_key']
     except KeyError:
         return False
+    if not befriender or not befriender_key:
+        return False
     if username + "@" + domain != befriender:
         return False
     if len(befriender_key) != 64:
@@ -8,9 +8,18 @@ import dotenv
 from django.db import transaction, IntegrityError
 
 
-def yesno(prompt, default=False):
-    if not sys.stdin.isatty():
+class CmdCtx:
+
+    def __init__(self, args):
+        self.args = args
+
+    def yesno(self, prompt, default=False):
+        if not sys.stdin.isatty() or self.args.noninteractive:
             return default
+        elif self.args.yes:
+            return True
+        elif self.args.no:
+            return False
         yes = {'yes', 'y', 'ye'}
         no = {'no', 'n'}
 
@@ -31,9 +40,9 @@ def yesno(prompt, default=False):
             print('Please respond with "yes" or "no"')
 
 
-def configure():
+def configure(ctx):
     if not os.path.exists('.env'):
-        if not yesno("the .env file does not exist, do you want to create it?", default=True):
+        if not ctx.yesno("the .env file does not exist, do you want to create it?", default=True):
             print('Aborting')
             exit(0)
         if not os.path.exists('.env.dist'):
@@ -56,7 +65,7 @@ def configure():
     current_hosts = os.getenv('ALLOWED_HOSTS')
     print('Current ALLOWED_HOSTS: {}'.format(current_hosts))
 
-    if yesno("Do you want to add ALLOWED_HOSTS?"):
+    if ctx.yesno("Do you want to add ALLOWED_HOSTS?"):
         hosts = input("Enter a comma-separated list of allowed hosts: ")
         joined_hosts = current_hosts + ',' + hosts if current_hosts else hosts
         dotenv.set_key('.env', 'ALLOWED_HOSTS', joined_hosts)
@@ -67,26 +76,29 @@ def configure():
     django.setup()
 
     if not os.path.exists('db.sqlite3'):
-        if not yesno("No database found, do you want to create one?", default=True):
+        if not ctx.yesno("No database found, do you want to create one?", default=True):
             print('Aborting')
             exit(0)
 
     from django.core.management import call_command
     call_command('migrate')
 
-    if yesno("Do you want to create a superuser?"):
+    if ctx.yesno("Do you want to create a superuser?"):
         from django.core.management import call_command
         call_command('createsuperuser')
 
     call_command('collectstatic', '--no-input')
 
-    if yesno("Do you want to import all categories, properties and tags contained in this repository?", default=True):
+    if ctx.yesno("Do you want to import all categories, properties and tags contained in this repository?",
+                 default=True):
         from hostadmin.serializers import CategorySerializer, PropertySerializer, TagSerializer
         from hostadmin.models import ImportedIdentifierSets
+        from hashlib import sha256
         if not os.path.exists('shared_data'):
             os.mkdir('shared_data')
         files = os.listdir('shared_data')
         idsets = {}
+        hashes = {}
         for file in files:
             if file.endswith('.json'):
                 name = "git:" + file[:-5]
@@ -94,6 +106,8 @@ def configure():
                 try:
                     idset = json.load(f)
                     idsets[name] = idset
+                    f.seek(0)
+                    hashes[name] = sha256(f.read().encode()).hexdigest()
                 except json.decoder.JSONDecodeError:
                     print('Error: invalid JSON in file {}'.format(file))
         imported_sets = ImportedIdentifierSets.objects.all()
@@ -108,9 +122,13 @@ def configure():
             unmet_deps = [dep for dep in idset['depends'] if not imported_sets.filter(name=dep).exists()]
             if unmet_deps:
                 if all([dep in idsets.keys() for dep in unmet_deps]):
+                    if all([dep in queue for dep in unmet_deps]):
                         print('Not all dependencies for {} are imported, postponing'.format(name))
                         queue.append(name)
                         continue
+                    else:
+                        print('Error: unresolvable dependencies for {}: {}'.format(name, unmet_deps))
+                        continue
                 else:
                     print('unknown dependencies for {}: {}'.format(name, unmet_deps))
                     continue
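The import loop above postpones an identifier set while its unmet dependencies are still waiting in the queue and gives up once a dependency can never be satisfied. A framework-free sketch of that control flow; the set names and contents below are made up for illustration:

```python
# Simplified stand-in for the shared_data import loop's dependency handling.
idsets = {
    "git:base": {"depends": []},
    "git:tools": {"depends": ["git:base"]},
    "git:broken": {"depends": ["git:missing"]},  # can never be satisfied
}

imported = set()
queue = list(idsets)
while queue:
    name = queue.pop(0)
    unmet = [dep for dep in idsets[name]["depends"] if dep not in imported]
    if unmet:
        if all(dep in queue for dep in unmet):
            queue.append(name)  # postpone: its dependencies are still pending
            continue
        print("unresolvable dependencies for", name, unmet)
        continue
    imported.add(name)

print(sorted(imported))  # ['git:base', 'git:tools']
```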
@@ -131,10 +149,15 @@ def configure():
                     serializer = TagSerializer(data=tag)
                     if serializer.is_valid():
                         serializer.save(origin=name)
-                imported_sets.create(name=name)
+                imported_sets.create(name=name, hash=hashes[name])
             except IntegrityError:
                 print('Error: integrity error while importing {}\n\tmight be cause by name conflicts with existing'
                       ' categories, properties or tags'.format(name))
+                transaction.set_rollback(True)
+                continue
+            except Exception as e:
+                print('Error: {}'.format(e))
+                transaction.set_rollback(True)
                 continue
 
 
@@ -183,6 +206,7 @@ def main():
     parser = ArgumentParser(description='Toolshed Server Configuration')
     parser.add_argument('--yes', '-y', help='Answer yes to all questions', action='store_true')
     parser.add_argument('--no', '-n', help='Answer no to all questions', action='store_true')
+    parser.add_argument('--noninteractive', '-x', help="Run in noninteractive mode", action='store_true')
     parser.add_argument('cmd', help='Command', default='configure', nargs='?')
     args = parser.parse_args()
 
@@ -190,12 +214,16 @@ def main():
         print('Error: --yes and --no are mutually exclusive')
         exit(1)
 
+    ctx = CmdCtx(args)
+
     if args.cmd == 'configure':
-        configure()
+        configure(ctx)
     elif args.cmd == 'reset':
         reset()
     elif args.cmd == 'testdata':
         testdata()
+    elif args.cmd == 'migrate':
+        print('not implemented yet')
     else:
         print('Unknown command: {}'.format(args.cmd))
         exit(1)
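Taken together, these configure.py hunks route every yes/no prompt through a CmdCtx built from the parsed flags, so --yes, --no and the new --noninteractive switch decide answers even without a TTY. A condensed, runnable sketch of that pattern, with the prompt handling simplified and only the flag names taken from the diff:

```python
# Illustrative sketch, not the project file itself.
import sys
from argparse import ArgumentParser


class CmdCtx:
    """Answers yes/no prompts according to the parsed command-line flags."""

    def __init__(self, args):
        self.args = args

    def yesno(self, prompt, default=False):
        # Non-interactive runs (no TTY or -x) fall back to the default answer.
        if not sys.stdin.isatty() or self.args.noninteractive:
            return default
        if self.args.yes:
            return True
        if self.args.no:
            return False
        return input(prompt + ' [y/n] ').strip().lower() in {'y', 'ye', 'yes'}


parser = ArgumentParser()
parser.add_argument('--yes', '-y', action='store_true')
parser.add_argument('--no', '-n', action='store_true')
parser.add_argument('--noninteractive', '-x', action='store_true')

ctx = CmdCtx(parser.parse_args(['-x']))
print(ctx.yesno("create .env?", default=True))  # True: the default wins in noninteractive mode
```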
@@ -1,6 +1,6 @@
 from django.contrib import admin
 
-from .models import Domain
+from .models import Domain, ImportedIdentifierSets
 
 
 class DomainAdmin(admin.ModelAdmin):
@@ -9,3 +9,11 @@ class DomainAdmin(admin.ModelAdmin):
 
 
 admin.site.register(Domain, DomainAdmin)
+
+
+class ImportedIdentifierSetsAdmin(admin.ModelAdmin):
+    list_display = ('name', 'hash', 'created_at')
+    list_filter = ('name', 'hash', 'created_at')
+
+
+admin.site.register(ImportedIdentifierSets, ImportedIdentifierSetsAdmin)
@@ -0,0 +1,39 @@
+# Generated by Django 4.2.2 on 2024-03-11 15:19
+import os
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+    dependencies = [
+        ('hostadmin', '0002_importedidentifiersets'),
+    ]
+
+    def calculate_hash(apps, schema_editor):
+        from hostadmin.models import ImportedIdentifierSets
+        for identifier_set in ImportedIdentifierSets.objects.all():
+            if not identifier_set.hash:
+                print("update", identifier_set.name)
+                filename = "shared_data/" + identifier_set.name.strip('git:') + ".json"
+                if not os.path.exists(filename):
+                    continue
+                from hashlib import sha256
+                with open(filename, 'r') as file:
+                    data = file.read()
+                identifier_set.hash = sha256(data.encode()).hexdigest()
+                identifier_set.save()
+
+    operations = [
+        migrations.AddField(
+            model_name='importedidentifiersets',
+            name='hash',
+            field=models.CharField(blank=True, max_length=255, null=True),
+
+        ),
+        migrations.RunPython(calculate_hash),
+        migrations.AlterField(
+            model_name='importedidentifiersets',
+            name='hash',
+            field=models.CharField(max_length=255, unique=True),
+        ),
+    ]
@@ -0,0 +1,17 @@
+# Generated by Django 4.2.2 on 2024-03-14 16:33
+
+from django.db import migrations
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('hostadmin', '0003_importedidentifiersets_hash'),
+    ]
+
+    operations = [
+        migrations.AlterModelOptions(
+            name='importedidentifiersets',
+            options={'verbose_name_plural': 'imported identifier sets'},
+        ),
+    ]
@@ -12,4 +12,8 @@ class Domain(models.Model):
 
 class ImportedIdentifierSets(models.Model):
     name = models.CharField(max_length=255, unique=True)
+    hash = models.CharField(max_length=255, unique=True)
     created_at = models.DateTimeField(auto_now_add=True)
+
+    class Meta:
+        verbose_name_plural = 'imported identifier sets'
@@ -5,6 +5,32 @@ from hostadmin.models import Domain
 from toolshed.models import Category, Property, Tag
 
 
+class SlugPathField(serializers.SlugRelatedField):
+    def to_internal_value(self, data):
+        path = data.split('/') if '/' in data else [data]
+        candidates = self.get_queryset().filter(name=path[-1])
+        if len(candidates) == 1:
+            return candidates.first()
+        if len(candidates) == 0:
+            raise serializers.ValidationError(
+                "No {} with name '{}' found".format(self.queryset.model.__name__, path[-1]))
+        if len(candidates) > 1 and len(path) == 1:
+            raise serializers.ValidationError("Multiple {}s with name '{}' found, please specify the parent".format(
+                self.queryset.model.__name__, path[-1]))
+        parent = self.to_internal_value('/'.join(path[:-1]))
+        candidates = self.get_queryset().filter(name=path[-1], parent=parent)
+        if len(candidates) == 1:
+            return candidates.first()
+        if len(candidates) == 0:
+            raise serializers.ValidationError(
+                "No {} with name '{}' found".format(self.queryset.model.__name__, path[-1]))
+
+    def to_representation(self, value):
+        source = getattr(value, self.field_name, None)  # should this use self.source?
+        prefix = self.to_representation(source) + '/' if source else ''
+        return prefix + getattr(value, self.slug_field)
+
+
 class DomainSerializer(serializers.ModelSerializer):
     owner = OwnerSerializer(read_only=True)
 
@@ -12,12 +38,21 @@ class DomainSerializer(serializers.ModelSerializer):
         model = Domain
         fields = ['name', 'owner', 'open_registration']
 
-    def create(self, validated_data):
-        return super().create(validated_data)
-
 
 class CategorySerializer(serializers.ModelSerializer):
-    parent = serializers.SlugRelatedField(slug_field='name', queryset=Category.objects.all(), required=False)
+    parent = SlugPathField(slug_field='name', queryset=Category.objects.all(), required=False)
+
+    def validate(self, attrs):
+        if 'name' in attrs:
+            if '/' in attrs['name']:
+                raise serializers.ValidationError("Category name cannot contain '/'")
+        return attrs
+
+    def create(self, validated_data):
+        try:
+            return Category.objects.create(**validated_data)
+        except Exception as e:
+            raise serializers.ValidationError(e)
 
     class Meta:
         model = Category
@@ -27,7 +62,19 @@ class CategorySerializer(serializers.ModelSerializer):
 
 
 class PropertySerializer(serializers.ModelSerializer):
-    category = serializers.SlugRelatedField(slug_field='name', queryset=Category.objects.all(), required=False)
+    category = SlugPathField(slug_field='name', queryset=Category.objects.all(), required=False)
+
+    def validate(self, attrs):
+        if 'name' in attrs:
+            if '/' in attrs['name']:
+                raise serializers.ValidationError("Property name cannot contain '/'")
+        return attrs
+
+    def create(self, validated_data):
+        try:
+            return Property.objects.create(**validated_data)
+        except Exception as e:
+            raise serializers.ValidationError(e)
 
     class Meta:
         model = Property
@@ -38,7 +85,19 @@ class PropertySerializer(serializers.ModelSerializer):
 
 
 class TagSerializer(serializers.ModelSerializer):
-    category = serializers.SlugRelatedField(slug_field='name', queryset=Category.objects.all(), required=False)
+    category = SlugPathField(slug_field='name', queryset=Category.objects.all(), required=False)
+
+    def validate(self, attrs):
+        if 'name' in attrs:
+            if '/' in attrs['name']:
+                raise serializers.ValidationError("Tag name cannot contain '/'")
+        return attrs
+
+    def create(self, validated_data):
+        try:
+            return Tag.objects.create(**validated_data)
+        except Exception as e:
+            raise serializers.ValidationError(e)
 
     class Meta:
         model = Tag
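The SlugPathField introduced above resolves a slash-separated path such as cat1/subcat1 by matching the last segment and, when that segment is ambiguous, recursing on the parent path. A simplified, framework-free sketch of that lookup; the in-memory rows stand in for the Django queryset and are illustrative only:

```python
# Tiny stand-in "table" of (id, name, parent) rows, not project data.
ROWS = [
    {"id": 1, "name": "cat1", "parent": None},
    {"id": 2, "name": "subcat1", "parent": 1},
    {"id": 3, "name": "subcat1", "parent": 2},  # same name under a different parent
]


def resolve(path):
    parts = path.split("/")
    candidates = [r for r in ROWS if r["name"] == parts[-1]]
    if len(candidates) == 1:
        return candidates[0]
    if not candidates:
        raise LookupError("no category named %r" % parts[-1])
    if len(parts) == 1:
        raise LookupError("%r is ambiguous, qualify it with a parent" % parts[-1])
    parent = resolve("/".join(parts[:-1]))  # recurse on the parent path
    candidates = [r for r in candidates if r["parent"] == parent["id"]]
    if len(candidates) == 1:
        return candidates[0]
    raise LookupError("no category %r under %r" % (parts[-1], parent["name"]))


print(resolve("cat1/subcat1")["id"])          # 2
print(resolve("cat1/subcat1/subcat1")["id"])  # 3
```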
@@ -100,7 +100,8 @@ class CategoryApiTestCase(UserTestMixin, CategoryTestMixin, ToolshedTestCase):
         response = client.get('/api/categories/', self.f['local_user1'])
         self.assertEqual(response.status_code, 200)
         self.assertEqual(response.json(),
-                         ["cat1", "cat2", "cat3", "cat1/subcat1", "cat1/subcat2", "cat1/subcat1/subcat3"])
+                         ["cat1", "cat2", "cat3", "cat1/subcat1",
+                          "cat1/subcat2", "cat1/subcat1/subcat1", "cat1/subcat1/subcat2"])
 
     def test_admin_get_categories_fail(self):
         response = client.get('/admin/categories/', self.f['local_user1'])
@@ -109,7 +110,7 @@ class CategoryApiTestCase(UserTestMixin, CategoryTestMixin, ToolshedTestCase):
     def test_admin_get_categories(self):
         response = client.get('/admin/categories/', self.f['admin'])
         self.assertEqual(response.status_code, 200)
-        self.assertEqual(len(response.json()), 6)
+        self.assertEqual(len(response.json()), 7)
         self.assertEqual(response.json()[0]['name'], 'cat1')
         self.assertEqual(response.json()[1]['name'], 'cat2')
         self.assertEqual(response.json()[2]['name'], 'cat3')
@@ -117,10 +118,12 @@ class CategoryApiTestCase(UserTestMixin, CategoryTestMixin, ToolshedTestCase):
         self.assertEqual(response.json()[3]['parent'], 'cat1')
         self.assertEqual(response.json()[4]['name'], 'subcat2')
         self.assertEqual(response.json()[4]['parent'], 'cat1')
-        self.assertEqual(response.json()[5]['name'], 'subcat3')
-        self.assertEqual(response.json()[5]['parent'], 'subcat1')
+        self.assertEqual(response.json()[5]['name'], 'subcat1')
+        self.assertEqual(response.json()[5]['parent'], 'cat1/subcat1')
+        self.assertEqual(response.json()[6]['name'], 'subcat2')
+        self.assertEqual(response.json()[6]['parent'], 'cat1/subcat1')
 
-    def test_admin_create_category(self):
+    def test_admin_post_category(self):
         response = client.post('/admin/categories/', self.f['admin'], {'name': 'cat4'})
         self.assertEqual(response.status_code, 201)
         self.assertEqual(response.json()['name'], 'cat4')
@@ -128,6 +131,40 @@ class CategoryApiTestCase(UserTestMixin, CategoryTestMixin, ToolshedTestCase):
         self.assertEqual(response.json()['parent'], None)
         self.assertEqual(response.json()['origin'], 'api')
 
+    def test_admin_post_category_duplicate(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'cat3'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_post_category_invalid(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'cat/4'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_post_category_parent_not_found(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat4', 'parent': 'cat4'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_post_category_parent_ambiguous(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat4', 'parent': 'subcat1'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_post_category_parent_subcategory(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat4', 'parent': 'cat1/subcat1'})
+        self.assertEqual(response.status_code, 201)
+        self.assertEqual(response.json()['name'], 'subcat4')
+        self.assertEqual(response.json()['description'], None)
+        self.assertEqual(response.json()['parent'], 'cat1/subcat1')
+        self.assertEqual(response.json()['origin'], 'api')
+
+    def test_admin_post_category_parent_subcategory_not_found(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat4', 'parent': 'cat2/subcat1'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_post_category_parent_subcategory_ambiguous(self):
+        from toolshed.models import Category
+        self.f['subcat111'] = Category.objects.create(name='subcat1', parent=self.f['subcat11'], origin='test')
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat4', 'parent': 'subcat1/subcat1'})
+        self.assertEqual(response.status_code, 400)
+
     def test_admin_post_subcategory(self):
         response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat4', 'parent': 'cat1'})
         self.assertEqual(response.status_code, 201)
@@ -136,6 +173,18 @@ class CategoryApiTestCase(UserTestMixin, CategoryTestMixin, ToolshedTestCase):
         self.assertEqual(response.json()['parent'], 'cat1')
         self.assertEqual(response.json()['origin'], 'api')
 
+    def test_admin_post_subcategory_duplicate(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat2', 'parent': 'cat1'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_post_subcategory_distinct_duplicate(self):
+        response = client.post('/admin/categories/', self.f['admin'], {'name': 'subcat2', 'parent': 'cat2'})
+        self.assertEqual(response.status_code, 201)
+        self.assertEqual(response.json()['name'], 'subcat2')
+        self.assertEqual(response.json()['description'], None)
+        self.assertEqual(response.json()['parent'], 'cat2')
+        self.assertEqual(response.json()['origin'], 'api')
+
     def test_admin_put_category(self):
         response = client.put('/admin/categories/1/', self.f['admin'], {'name': 'cat5'})
         self.assertEqual(response.status_code, 200)
@@ -188,6 +237,14 @@ class TagApiTestCase(UserTestMixin, CategoryTestMixin, TagTestMixin, ToolshedTestCase):
         self.assertEqual(response.json()['origin'], 'api')
         self.assertEqual(response.json()['category'], None)
 
+    def test_admin_create_tag_duplicate(self):
+        response = client.post('/admin/tags/', self.f['admin'], {'name': 'tag3'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_create_tag_invalid(self):
+        response = client.post('/admin/tags/', self.f['admin'], {'name': 'tag/4'})
+        self.assertEqual(response.status_code, 400)
+
     def test_admin_put_tag(self):
         response = client.put('/admin/tags/1/', self.f['admin'], {'name': 'tag5'})
         self.assertEqual(response.status_code, 200)
@@ -250,7 +307,13 @@ class PropertyApiTestCase(UserTestMixin, CategoryTestMixin, PropertyTestMixin, ToolshedTestCase):
         self.assertEqual(response.json()['base2_prefix'], False)
         self.assertEqual(response.json()['dimensions'], 1)
 
-        # self.assertEqual(response.json()['sort_lexicographically'], False)
+    def test_admin_create_property_duplicate(self):
+        response = client.post('/admin/properties/', self.f['admin'], {'name': 'prop3', 'category': 'cat1'})
+        self.assertEqual(response.status_code, 400)
+
+    def test_admin_create_property_invalid(self):
+        response = client.post('/admin/properties/', self.f['admin'], {'name': 'prop/4'})
+        self.assertEqual(response.status_code, 400)
 
     def test_admin_put_property(self):
         response = client.put('/admin/properties/1/', self.f['admin'], {'name': 'prop5'})
@@ -265,8 +328,6 @@ class PropertyApiTestCase(UserTestMixin, CategoryTestMixin, PropertyTestMixin, ToolshedTestCase):
         self.assertEqual(response.json()['base2_prefix'], False)
         self.assertEqual(response.json()['dimensions'], 1)
 
-        # self.assertEqual(response.json()['sort_lexicographically'], False)
-
     def test_admin_patch_property(self):
         response = client.patch('/admin/properties/1/', self.f['admin'], {'name': 'prop5'})
         self.assertEqual(response.status_code, 200)
@@ -1,6 +1,6 @@
 from django.contrib import admin
 
-from toolshed.models import InventoryItem, Property, Tag, ItemProperty, ItemTag
+from toolshed.models import InventoryItem, Property, Tag, Category
 
 
 class InventoryItemAdmin(admin.ModelAdmin):
@@ -12,16 +12,24 @@ admin.site.register(InventoryItem, InventoryItemAdmin)
 
 
 class PropertyAdmin(admin.ModelAdmin):
-    list_display = ('name',)
-    search_fields = ('name',)
+    list_display = ('name', 'description', 'category', 'unit_symbol', 'base2_prefix', 'dimensions', 'origin')
+    search_fields = ('name', 'description', 'category', 'unit_symbol', 'base2_prefix', 'dimensions', 'origin')
 
 
 admin.site.register(Property, PropertyAdmin)
 
 
 class TagAdmin(admin.ModelAdmin):
-    list_display = ('name',)
-    search_fields = ('name',)
+    list_display = ('name', 'description', 'category', 'origin')
+    search_fields = ('name', 'description', 'category', 'origin')
 
 
 admin.site.register(Tag, TagAdmin)
+
+
+class CategoryAdmin(admin.ModelAdmin):
+    list_display = ('name', 'description', 'parent', 'origin')
+    search_fields = ('name', 'description', 'parent', 'origin')
+
+
+admin.site.register(Category, CategoryAdmin)
@@ -64,15 +64,23 @@ class FriendsRequests(APIView, ViewSetMixin):
         befriendee_username, befriendee_domain = split_userhandle_or_throw(request.data['befriendee'])
         if befriender_domain == befriendee_domain and befriender_username == befriendee_username:
             return Response(status=status.HTTP_400_BAD_REQUEST, data={'status': 'cannot befriend yourself'})
-        if user := authenticate_request_against_local_users(request, raw_request):
+        if user := authenticate_request_against_local_users(request, raw_request):  # befriender is local
             secret = secrets.token_hex(64)
             befriendee_user = ToolshedUser.objects.filter(username=befriendee_username, domain=befriendee_domain)
-            if befriendee_user.exists():
+            if befriendee_user.exists():  # befriendee is local (both are local)
+                if user.friends.filter(username=befriendee_username, domain=befriendee_domain).exists():
+                    return Response(status=status.HTTP_208_ALREADY_REPORTED, data={'status': "exists"})
+                existing_request = FriendRequestIncoming.objects.filter(
+                    befriender_username=befriender_username,
+                    befriender_domain=befriender_domain,
+                    befriendee_user=befriendee_user.get())
+                if existing_request.exists():
+                    return Response(status=status.HTTP_208_ALREADY_REPORTED, data={'status': "exists"})
                 FriendRequestIncoming.objects.create(
                     befriender_username=befriender_username,
                     befriender_domain=befriender_domain,
                     befriender_public_key=user.public_identity.public_key,
-                    secret=secret,  # request.data['secret'] # TODO ??
+                    secret=secret,
                     befriendee_user=befriendee_user.get(),
                 )
                 return Response(status=status.HTTP_201_CREATED, data={'secret': secret, 'status': "pending"})
@@ -81,7 +89,7 @@ class FriendsRequests(APIView, ViewSetMixin):
                     befriender_user=user,
                     befriendee_username=befriendee_username,
                     befriendee_domain=befriendee_domain,
-                    secret=secret,  # request.data['secret'] # TODO ??
+                    secret=secret,
                 )
                 return Response(status=status.HTTP_201_CREATED, data={'secret': secret, 'status': "pending"})
         elif verify_incoming_friend_request(request, raw_request):
@@ -5,7 +5,7 @@ from rest_framework.response import Response
 
 from hostadmin.models import Domain
 from authentication.signature_auth import SignatureAuthentication
-from toolshed.models import Tag, Property, Category
+from toolshed.models import Tag, Property, Category, InventoryItem
 from toolshed.serializers import CategorySerializer, PropertySerializer
 from backend.settings import TOOLSHED_VERSION
 
@@ -51,8 +51,7 @@ def list_categories(request, format=None):  # /categories/
 @permission_classes([IsAuthenticated])
 @authentication_classes([SignatureAuthentication])
 def list_availability_policies(request, format=None):  # /availability_policies/
-    policies = ['private', 'friends', 'internal', 'public']
-    return Response(policies)
+    return Response(InventoryItem.AVAILABILITY_POLICY_CHOICES)
 
 
 @api_view(['GET'])
@@ -62,9 +61,11 @@ def combined_info(request, format=None):  # /info/
     tags = [tag.name for tag in Tag.objects.all()]
     properties = PropertySerializer(Property.objects.all(), many=True).data
     categories = [str(category) for category in Category.objects.all()]
-    policies = ['private', 'friends', 'internal', 'public']
+    policies = InventoryItem.AVAILABILITY_POLICY_CHOICES
     domains = [domain.name for domain in Domain.objects.filter(open_registration=True)]
-    return Response({'tags': tags, 'properties': properties, 'availability_policies': policies, 'categories': categories, 'domains': domains})
+    return Response(
+        {'tags': tags, 'properties': properties, 'availability_policies': policies, 'categories': categories,
+         'domains': domains})
 
 
 urlpatterns = [
@@ -7,8 +7,8 @@ from rest_framework.response import Response
 
 from authentication.models import ToolshedUser, KnownIdentity
 from authentication.signature_auth import SignatureAuthentication
-from toolshed.models import InventoryItem
-from toolshed.serializers import InventoryItemSerializer
+from toolshed.models import InventoryItem, StorageLocation
+from toolshed.serializers import InventoryItemSerializer, StorageLocationSerializer
 
 router = routers.SimpleRouter()
 
@@ -24,6 +24,7 @@ def inventory_items(identity):
     for friend in identity.friends.all():
         if friend_user := friend.user.get():
             for item in friend_user.inventory_items.all():
+                if item.availability_policy != 'private':
                     yield item
 
 
@@ -61,7 +62,19 @@ def search_inventory_items(request):
         return Response({'error': 'No query provided.'}, status=400)
 
 
+class StorageLocationViewSet(viewsets.ModelViewSet):
+    serializer_class = StorageLocationSerializer
+    authentication_classes = [SignatureAuthentication]
+    permission_classes = [IsAuthenticated]
+
+    def get_queryset(self):
+        if type(self.request.user) == KnownIdentity and self.request.user.user.exists():
+            return StorageLocation.objects.filter(owner=self.request.user.user.get())
+        return StorageLocation.objects.none()
+
+
 router.register(r'inventory_items', InventoryItemViewSet, basename='inventory_items')
+router.register(r'storage_locations', StorageLocationViewSet, basename='storage_locations')
 
 urlpatterns = router.urls + [
     path('search/', search_inventory_items, name='search_inventory_items'),
@@ -0,0 +1,32 @@
+# Generated by Django 4.2.2 on 2024-02-20 13:50
+
+from django.conf import settings
+from django.db import migrations, models
+import django.db.models.deletion
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
+        ('toolshed', '0003_inventoryitem_files_and_more'),
+    ]
+
+    operations = [
+        migrations.CreateModel(
+            name='StorageLocation',
+            fields=[
+                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+                ('name', models.CharField(max_length=255)),
+                ('description', models.TextField(blank=True, null=True)),
+                ('category', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='storage_locations', to='toolshed.category')),
+                ('owner', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='storage_locations', to=settings.AUTH_USER_MODEL)),
+                ('parent', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='children', to='toolshed.storagelocation')),
+            ],
+        ),
+        migrations.AddField(
+            model_name='inventoryitem',
+            name='storage_location',
+            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='inventory_items', to='toolshed.storagelocation'),
+        ),
+    ]
@@ -0,0 +1,18 @@
+# Generated by Django 4.2.2 on 2024-02-20 15:15
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('toolshed', '0004_storagelocation_inventoryitem_storage_location'),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name='inventoryitem',
+            name='availability_policy',
+            field=models.CharField(choices=[('sell', 'Sell'), ('rent', 'Rent'), ('lend', 'Lend'), ('share', 'Share'), ('private', 'Private')], default='private', max_length=20),
+        ),
+    ]
@@ -0,0 +1,67 @@
+# Generated by Django 4.2.2 on 2024-03-14 16:54
+
+from django.db import migrations, models
+import django.db.models.deletion
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('toolshed', '0005_alter_inventoryitem_availability_policy'),
+    ]
+
+    operations = [
+        migrations.AlterModelOptions(
+            name='tag',
+            options={'verbose_name_plural': 'tags'},
+        ),
+        migrations.AlterField(
+            model_name='category',
+            name='name',
+            field=models.CharField(max_length=255),
+        ),
+        migrations.AlterField(
+            model_name='category',
+            name='parent',
+            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='children', to='toolshed.category'),
+        ),
+        migrations.AlterField(
+            model_name='inventoryitem',
+            name='category',
+            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='inventory_items', to='toolshed.category'),
+        ),
+        migrations.AlterField(
+            model_name='property',
+            name='category',
+            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='properties', to='toolshed.category'),
+        ),
+        migrations.AlterField(
+            model_name='tag',
+            name='category',
+            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='tags', to='toolshed.category'),
+        ),
+        migrations.AddConstraint(
+            model_name='category',
+            constraint=models.UniqueConstraint(condition=models.Q(('parent__isnull', False)), fields=('name', 'parent'), name='category_unique_name_parent'),
+        ),
+        migrations.AddConstraint(
+            model_name='category',
+            constraint=models.UniqueConstraint(condition=models.Q(('parent__isnull', True)), fields=('name',), name='category_unique_name_no_parent'),
+        ),
+        migrations.AddConstraint(
+            model_name='property',
+            constraint=models.UniqueConstraint(condition=models.Q(('category__isnull', False)), fields=('name', 'category'), name='property_unique_name_category'),
+        ),
+        migrations.AddConstraint(
+            model_name='property',
+            constraint=models.UniqueConstraint(condition=models.Q(('category__isnull', True)), fields=('name',), name='property_unique_name_no_category'),
+        ),
+        migrations.AddConstraint(
+            model_name='tag',
+            constraint=models.UniqueConstraint(condition=models.Q(('category__isnull', False)), fields=('name', 'category'), name='tag_unique_name_category'),
+        ),
+        migrations.AddConstraint(
+            model_name='tag',
+            constraint=models.UniqueConstraint(condition=models.Q(('category__isnull', True)), fields=('name',), name='tag_unique_name_no_category'),
+        ),
+    ]
@@ -8,13 +8,19 @@ from files.models import File
 
 
 class Category(SoftDeleteModel):
-    name = models.CharField(max_length=255, unique=True)
+    name = models.CharField(max_length=255)
     description = models.TextField(null=True, blank=True)
-    parent = models.ForeignKey('self', on_delete=models.CASCADE, null=True, blank=True, related_name='children')
+    parent = models.ForeignKey('self', on_delete=models.CASCADE, null=True, related_name='children')
     origin = models.CharField(max_length=255, null=False, blank=False)
 
     class Meta:
         verbose_name_plural = 'categories'
+        constraints = [
+            models.UniqueConstraint(fields=['name', 'parent'], condition=models.Q(parent__isnull=False),
+                                    name='category_unique_name_parent'),
+            models.UniqueConstraint(fields=['name'], condition=models.Q(parent__isnull=True),
+                                    name='category_unique_name_no_parent')
+        ]
 
     def __str__(self):
         parent = str(self.parent) + "/" if self.parent else ""
@@ -24,7 +30,7 @@ class Category(SoftDeleteModel):
 class Property(models.Model):
     name = models.CharField(max_length=255)
     description = models.TextField(null=True, blank=True)
-    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, blank=True, related_name='properties')
+    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, related_name='properties')
     unit_symbol = models.CharField(max_length=16, null=True, blank=True)
     unit_name = models.CharField(max_length=255, null=True, blank=True)
     unit_name_plural = models.CharField(max_length=255, null=True, blank=True)
@@ -34,6 +40,12 @@ class Property(models.Model):
 
     class Meta:
         verbose_name_plural = 'properties'
+        constraints = [
+            models.UniqueConstraint(fields=['name', 'category'], condition=models.Q(category__isnull=False),
+                                    name='property_unique_name_category'),
+            models.UniqueConstraint(fields=['name'], condition=models.Q(category__isnull=True),
+                                    name='property_unique_name_no_category')
+        ]
 
     def __str__(self):
         return self.name
@@ -42,26 +54,44 @@ class Property(models.Model):
 class Tag(models.Model):
     name = models.CharField(max_length=255)
     description = models.TextField(null=True, blank=True)
-    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, blank=True, related_name='tags')
+    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, related_name='tags')
     origin = models.CharField(max_length=255, null=False, blank=False)
 
+    class Meta:
+        verbose_name_plural = 'tags'
+        constraints = [
+            models.UniqueConstraint(fields=['name', 'category'], condition=models.Q(category__isnull=False),
+                                    name='tag_unique_name_category'),
+            models.UniqueConstraint(fields=['name'], condition=models.Q(category__isnull=True),
+                                    name='tag_unique_name_no_category')
+        ]
+
     def __str__(self):
         return self.name
 
 
 class InventoryItem(SoftDeleteModel):
+    AVAILABILITY_POLICY_CHOICES = (
+        ('sell', 'Sell'),
+        ('rent', 'Rent'),
+        ('lend', 'Lend'),
+        ('share', 'Share'),
+        ('private', 'Private'),
+    )
+
     published = models.BooleanField(default=False)
     name = models.CharField(max_length=255, null=True, blank=True)
     description = models.TextField(null=True, blank=True)
-    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, blank=True,
-                                 related_name='inventory_items')
-    availability_policy = models.CharField(max_length=255, default="private")
+    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, related_name='inventory_items')
+    availability_policy = models.CharField(max_length=20, choices=AVAILABILITY_POLICY_CHOICES, default='private')
     owned_quantity = models.IntegerField(default=1, validators=[MinValueValidator(0)])
     owner = models.ForeignKey(ToolshedUser, on_delete=models.CASCADE, related_name='inventory_items')
     created_at = models.DateTimeField(auto_now_add=True)
     tags = models.ManyToManyField(Tag, through='ItemTag', related_name='inventory_items')
     properties = models.ManyToManyField(Property, through='ItemProperty')
     files = models.ManyToManyField(File, related_name='connected_items')
+    storage_location = models.ForeignKey('StorageLocation', on_delete=models.CASCADE, null=True, blank=True,
+                                         related_name='inventory_items')
 
     def clean(self):
         if (self.name is None or self.name == "") and self.files.count() == 0:
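AVAILABILITY_POLICY_CHOICES is a standard Django choices tuple: the first element of each pair is the stored and validated value, the second is the display label, and the API endpoints above now return the tuple directly. A small sketch of how such a tuple supports validation and the "skip private items for friends" filter seen in the inventory API hunk; the item data is made up:

```python
# Illustrative only; no Django needed to show the shape of a choices tuple.
AVAILABILITY_POLICY_CHOICES = (
    ('sell', 'Sell'),
    ('rent', 'Rent'),
    ('lend', 'Lend'),
    ('share', 'Share'),
    ('private', 'Private'),
)

items = [{"name": "drill", "availability_policy": "lend"},
         {"name": "diary", "availability_policy": "private"}]

valid_values = {value for value, _label in AVAILABILITY_POLICY_CHOICES}
assert all(item["availability_policy"] in valid_values for item in items)

shared = [item["name"] for item in items if item["availability_policy"] != 'private']
print(shared)  # ['drill']: private items stay hidden from friends
```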
@@ -77,3 +107,16 @@ class ItemProperty(models.Model):
 class ItemTag(models.Model):
     tag = models.ForeignKey(Tag, on_delete=models.CASCADE)
     inventory_item = models.ForeignKey(InventoryItem, on_delete=models.CASCADE)
+
+
+class StorageLocation(models.Model):
+    name = models.CharField(max_length=255)
+    description = models.TextField(null=True, blank=True)
+    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True, blank=True,
+                                 related_name='storage_locations')
+    parent = models.ForeignKey('self', on_delete=models.CASCADE, null=True, blank=True, related_name='children')
+    owner = models.ForeignKey(ToolshedUser, on_delete=models.CASCADE, related_name='storage_locations')
+
+    def __str__(self):
+        parent = str(self.parent) + "/" if self.parent else ""
+        return parent + self.name
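The `__str__` above composes a slash-separated path by walking up the `parent` chain. A minimal sketch of that behaviour, assuming an existing `ToolshedUser` instance named `user` (the location names are made up for illustration):

``` python
# Illustrative only: nested locations render as "<parent path>/<name>".
shelf = StorageLocation.objects.create(name='shelf', owner=user)            # top level -> "shelf"
box = StorageLocation.objects.create(name='box', parent=shelf, owner=user)  # nested under shelf
assert str(box) == 'shelf/box'  # the parent path is prepended recursively
```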
@@ -3,7 +3,7 @@ from authentication.models import KnownIdentity, ToolshedUser, FriendRequestInco
 from authentication.serializers import OwnerSerializer
 from files.models import File
 from files.serializers import FileSerializer
-from toolshed.models import Category, Property, ItemProperty, InventoryItem, Tag
+from toolshed.models import Category, Property, ItemProperty, InventoryItem, Tag, StorageLocation


 class FriendSerializer(serializers.ModelSerializer):

@@ -11,7 +11,7 @@ class FriendSerializer(serializers.ModelSerializer):

     class Meta:
         model = KnownIdentity
-        fields = ['username', 'public_key']
+        fields = ['id', 'username', 'public_key']

     def get_username(self, obj):
         return obj.username + '@' + obj.domain

@@ -48,6 +48,23 @@ class CategorySerializer(serializers.ModelSerializer):
         return Category.objects.get(name=data.split("/")[-1])


+class StorageLocationSerializer(serializers.ModelSerializer):
+    owner = OwnerSerializer(read_only=True)
+    category = CategorySerializer(required=False, allow_null=True)
+    path = serializers.SerializerMethodField()
+
+    class Meta:
+        model = StorageLocation
+        fields = ['id', 'name', 'description', 'path', 'category', 'owner']
+        read_only_fields = ['path']
+
+    @staticmethod
+    def get_path(obj):
+        if obj.parent:
+            return StorageLocationSerializer.get_path(obj.parent) + "/" + obj.name
+        return obj.name
+
+
 class ItemPropertySerializer(serializers.ModelSerializer):
     property = PropertySerializer(read_only=True)

@@ -74,7 +91,7 @@ class InventoryItemSerializer(serializers.ModelSerializer):
     class Meta:
         model = InventoryItem
         fields = ['id', 'name', 'description', 'owner', 'category', 'availability_policy', 'owned_quantity', 'owner',
-                  'tags', 'properties', 'files']
+                  'tags', 'properties', 'files', 'storage_location']

     def to_internal_value(self, data):
         files = data.pop('files', [])
@@ -1,4 +1,4 @@
-from toolshed.models import Category, Tag, Property, InventoryItem, ItemProperty
+from toolshed.models import Category, Tag, Property, InventoryItem, ItemProperty, StorageLocation


 class CategoryTestMixin:

@@ -8,13 +8,16 @@ class CategoryTestMixin:
         self.f['cat3'] = Category.objects.create(name='cat3', origin='test')
         self.f['subcat1'] = Category.objects.create(name='subcat1', parent=self.f['cat1'], origin='test')
         self.f['subcat2'] = Category.objects.create(name='subcat2', parent=self.f['cat1'], origin='test')
-        self.f['subcat3'] = Category.objects.create(name='subcat3', parent=self.f['subcat1'], origin='test')
+        self.f['subcat11'] = Category.objects.create(name='subcat1', parent=self.f['subcat1'], origin='test')
+        self.f['subcat12'] = Category.objects.create(name='subcat2', parent=self.f['subcat1'], origin='test')


 class TagTestMixin:
     def prepare_tags(self):
-        self.f['tag1'] = Tag.objects.create(name='tag1', description='tag1 description', category=self.f['cat1'], origin='test')
-        self.f['tag2'] = Tag.objects.create(name='tag2', description='tag2 description', category=self.f['cat1'], origin='test')
+        self.f['tag1'] = Tag.objects.create(name='tag1', description='tag1 description', category=self.f['cat1'],
+                                            origin='test')
+        self.f['tag2'] = Tag.objects.create(name='tag2', description='tag2 description', category=self.f['cat1'],
+                                            origin='test')
         self.f['tag3'] = Tag.objects.create(name='tag3', origin='test')


@@ -41,3 +44,13 @@ class InventoryTestMixin(CategoryTestMixin, TagTestMixin, PropertyTestMixin):
         self.f['item2'].tags.add(self.f['tag2'], through_defaults={})
         ItemProperty.objects.create(inventory_item=self.f['item2'], property=self.f['prop1'], value='value1').save()
         ItemProperty.objects.create(inventory_item=self.f['item2'], property=self.f['prop2'], value='value2').save()
+
+
+class LocationTestMixin:
+    def prepare_locations(self):
+        self.f['loc1'] = StorageLocation.objects.create(name='loc1', owner=self.f['local_user1'])
+        self.f['loc2'] = StorageLocation.objects.create(name='loc2', owner=self.f['local_user1'],
+                                                        category=self.f['cat1'])
+        self.f['loc3'] = StorageLocation.objects.create(name='loc3', owner=self.f['local_user1'], parent=self.f['loc1'])
+        self.f['loc4'] = StorageLocation.objects.create(name='loc4', owner=self.f['local_user1'], parent=self.f['loc1'],
+                                                        category=self.f['cat1'])
@@ -43,7 +43,8 @@ class CombinedApiTestCase(UserTestMixin, CategoryTestMixin, TagTestMixin, Proper
     def test_policy_api(self):
         response = client.get('/api/availability_policies/', self.f['local_user1'])
         self.assertEqual(response.status_code, 200)
-        self.assertEqual(response.json(), ['private', 'friends', 'internal', 'public'])
+        self.assertEqual(response.json(), [['sell', 'Sell'], ['rent', 'Rent'], ['lend', 'Lend'], ['share', 'Share'],
+                                           ['private', 'Private']])

     def test_combined_api_anonymous(self):
         response = anonymous_client.get('/api/info/')

@@ -52,9 +53,11 @@ class CombinedApiTestCase(UserTestMixin, CategoryTestMixin, TagTestMixin, Proper
     def test_combined_api(self):
         response = client.get('/api/info/', self.f['local_user1'])
         self.assertEqual(response.status_code, 200)
-        self.assertEqual(response.json()['availability_policies'], ['private', 'friends', 'internal', 'public'])
+        self.assertEqual(response.json()['availability_policies'], [['sell', 'Sell'], ['rent', 'Rent'], ['lend', 'Lend'],
+                                                                    ['share', 'Share'], ['private', 'Private']])
         self.assertEqual(response.json()['categories'],
-                         ['cat1', 'cat2', 'cat3', 'cat1/subcat1', 'cat1/subcat2', 'cat1/subcat1/subcat3'])
+                         ['cat1', 'cat2', 'cat3', 'cat1/subcat1', 'cat1/subcat2', 'cat1/subcat1/subcat1',
+                          'cat1/subcat1/subcat2'])
         self.assertEqual(response.json()['tags'], ['tag1', 'tag2', 'tag3'])
         self.assertEqual([p['name'] for p in response.json()['properties']], ['prop1', 'prop2', 'prop3'])
         self.assertEqual(response.json()['domains'], ['example.com'])
@@ -17,10 +17,11 @@ class CategoryTestCase(CategoryTestMixin, UserTestMixin, ToolshedTestCase):
         self.assertEqual(self.f['cat1'].children.last(), self.f['subcat2'])
         self.assertEqual(self.f['subcat1'].parent, self.f['cat1'])
         self.assertEqual(self.f['subcat2'].parent, self.f['cat1'])
-        self.assertEqual(self.f['subcat1'].children.count(), 1)
+        self.assertEqual(self.f['subcat1'].children.count(), 2)
         self.assertEqual(str(self.f['subcat1']), 'cat1/subcat1')
         self.assertEqual(str(self.f['subcat2']), 'cat1/subcat2')
-        self.assertEqual(str(self.f['subcat3']), 'cat1/subcat1/subcat3')
+        self.assertEqual(str(self.f['subcat11']), 'cat1/subcat1/subcat1')
+        self.assertEqual(str(self.f['subcat12']), 'cat1/subcat1/subcat2')


 class CategoryApiTestCase(CategoryTestMixin, UserTestMixin, ToolshedTestCase):

@@ -33,10 +34,12 @@ class CategoryApiTestCase(CategoryTestMixin, UserTestMixin, ToolshedTestCase):
     def test_get_categories(self):
         reply = client.get('/api/categories/', self.f['local_user1'])
         self.assertEqual(reply.status_code, 200)
-        self.assertEqual(len(reply.json()), 6)
+        self.assertEqual(len(reply.json()), 7)
         self.assertEqual(reply.json()[0], 'cat1')
         self.assertEqual(reply.json()[1], 'cat2')
         self.assertEqual(reply.json()[2], 'cat3')
         self.assertEqual(reply.json()[3], 'cat1/subcat1')
         self.assertEqual(reply.json()[4], 'cat1/subcat2')
-        self.assertEqual(reply.json()[5], 'cat1/subcat1/subcat3')
+        self.assertEqual(reply.json()[5], 'cat1/subcat1/subcat1')
+        self.assertEqual(reply.json()[6], 'cat1/subcat1/subcat2')
@@ -210,6 +210,17 @@ class FriendRequestIncomingTestCase(UserTestMixin, ToolshedTestCase):
         })
         self.assertEqual(reply.status_code, 400)

+    def test_post_request_missing_key_none(self):
+        befriender = self.f['ext_user1']
+        befriendee = self.f['local_user1']
+        reply = client.post('/api/friendrequests/', befriender, {
+            'befriender': str(befriender),
+            'befriendee': str(befriendee),
+            'befriender_key': None,
+            'secret': 'secret2'
+        })
+        self.assertEqual(reply.status_code, 400)
+
     def test_post_request_breaking_key(self):
         befriender = self.f['ext_user1']
         befriendee = self.f['local_user1']

@@ -357,3 +368,43 @@ class FriendRequestOutgoingTestCase(UserTestMixin, ToolshedTestCase):
         self.assertEqual(befriendee.friends.count(), 1)
         self.assertEqual(befriendee.friends.first().username, befriender.username)
         self.assertEqual(befriendee.friends.first().domain, befriender.domain)
+
+
+class FriendRequestCombinedTestCase(UserTestMixin, ToolshedTestCase):
+
+    def setUp(self):
+        super().setUp()
+        self.prepare_users()
+
+    def test_friend_request_combined(self):
+        befriender = self.f['local_user1']
+        befriendee = self.f['local_user2']
+        reply1 = client.post('/api/friendrequests/', befriender, {
+            'befriender': str(befriender),
+            'befriendee': str(befriendee),
+        })
+        secret = reply1.json()['secret']
+        reply2 = client.post('/api/friendrequests/', befriender, {
+            'befriender': str(befriender),
+            'befriender_key': befriender.public_key(),
+            'befriendee': str(befriendee),
+            'secret': secret
+        })
+
+        self.assertEqual(reply1.status_code, 201)
+        self.assertEqual(reply2.status_code, 208)
+        self.assertEqual(reply1.json()['status'], 'pending')
+        self.assertEqual(reply2.json()['status'], 'exists')
+        self.assertEqual(FriendRequestIncoming.objects.count(), 1)
+
+    def test_friend_request_already_friends(self):
+        befriender = self.f['local_user1']
+        befriendee = self.f['local_user2']
+        befriender.friends.add(befriendee.public_identity)
+        reply1 = client.post('/api/friendrequests/', befriender, {
+            'befriender': str(befriender),
+            'befriendee': str(befriendee),
+        })
+        self.assertEqual(reply1.status_code, 208)
+        self.assertEqual(reply1.json()['status'], 'exists')
+        self.assertEqual(FriendRequestIncoming.objects.count(), 0)
@@ -38,7 +38,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):

     def test_post_new_item(self):
         reply = client.post('/api/inventory_items/', self.f['local_user1'], {
-            'availability_policy': 'friends',
+            'availability_policy': 'rent',
             'category': 'cat2',
             'name': 'test3',
             'description': 'test',

@@ -50,7 +50,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):
         self.assertEqual(reply.status_code, 201)
         self.assertEqual(InventoryItem.objects.count(), 3)
         item = InventoryItem.objects.get(name='test3')
-        self.assertEqual(item.availability_policy, 'friends')
+        self.assertEqual(item.availability_policy, 'rent')
         self.assertEqual(item.category, Category.objects.get(name='cat2'))
         self.assertEqual(item.name, 'test3')
         self.assertEqual(item.description, 'test')

@@ -61,7 +61,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):

     def test_post_new_item2(self):
         reply = client.post('/api/inventory_items/', self.f['local_user1'], {
-            'availability_policy': 'friends',
+            'availability_policy': 'share',
             'name': 'test3',
             'description': 'test',
             'owned_quantity': 1,

@@ -70,7 +70,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):
         self.assertEqual(reply.status_code, 201)
         self.assertEqual(InventoryItem.objects.count(), 3)
         item = InventoryItem.objects.get(name='test3')
-        self.assertEqual(item.availability_policy, 'friends')
+        self.assertEqual(item.availability_policy, 'share')
         self.assertEqual(item.category, None)
         self.assertEqual(item.name, 'test3')
         self.assertEqual(item.description, 'test')

@@ -80,7 +80,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):

     def test_post_new_item_empty(self):
         reply = client.post('/api/inventory_items/', self.f['local_user1'], {
-            'availability_policy': 'friends',
+            'availability_policy': 'rent',
             'owned_quantity': 1,
             'image': '',
         })

@@ -89,7 +89,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):

     def test_post_new_item3(self):
         reply = client.post('/api/inventory_items/', self.f['local_user1'], {
-            'availability_policy': 'friends',
+            'availability_policy': 'private',
             'name': 'test3',
             'description': 'test',
             'owned_quantity': 1,

@@ -99,7 +99,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):
         self.assertEqual(reply.status_code, 201)
         self.assertEqual(InventoryItem.objects.count(), 3)
         item = InventoryItem.objects.get(name='test3')
-        self.assertEqual(item.availability_policy, 'friends')
+        self.assertEqual(item.availability_policy, 'private')
         self.assertEqual(item.category, None)
         self.assertEqual(item.name, 'test3')
         self.assertEqual(item.description, 'test')

@@ -109,7 +109,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):

     def test_put_item(self):
         reply = client.put('/api/inventory_items/1/', self.f['local_user1'], {
-            'availability_policy': 'friends',
+            'availability_policy': 'sell',
             'name': 'test4',
             'description': 'new description',
             'owned_quantity': 100,

@@ -121,7 +121,7 @@ class InventoryApiTestCase(UserTestMixin, InventoryTestMixin, ToolshedTestCase):
         self.assertEqual(reply.status_code, 200)
         self.assertEqual(InventoryItem.objects.count(), 2)
         item = InventoryItem.objects.get(id=1)
-        self.assertEqual(item.availability_policy, 'friends')
+        self.assertEqual(item.availability_policy, 'sell')
         self.assertEqual(item.category, None)
         self.assertEqual(item.name, 'test4')
         self.assertEqual(item.description, 'new description')
backend/toolshed/tests/test_locations.py (new file, 71 lines)
@@ -0,0 +1,71 @@
+from authentication.tests import SignatureAuthClient, UserTestMixin, ToolshedTestCase
+from files.tests import FilesTestMixin
+from toolshed.models import InventoryItem, Category
+from toolshed.tests import InventoryTestMixin, LocationTestMixin
+
+client = SignatureAuthClient()
+
+
+class LocationApiTestCase(UserTestMixin, InventoryTestMixin, LocationTestMixin, ToolshedTestCase):
+
+    def setUp(self):
+        super().setUp()
+        self.prepare_users()
+        self.prepare_categories()
+        self.prepare_tags()
+        self.prepare_properties()
+        self.prepare_locations()
+        self.prepare_inventory()
+
+    def test_locations(self):
+        self.assertEqual("loc1", str(self.f['loc1']))
+        self.assertEqual("loc1", self.f['loc1'].name)
+        self.assertEqual("loc2", str(self.f['loc2']))
+        self.assertEqual("loc2", self.f['loc2'].name)
+        self.assertEqual("loc1/loc3", str(self.f['loc3']))
+        self.assertEqual("loc3", self.f['loc3'].name)
+        self.assertEqual(self.f['loc1'], self.f['loc3'].parent)
+        self.assertEqual("loc1/loc4", str(self.f['loc4']))
+        self.assertEqual("loc4", self.f['loc4'].name)
+        self.assertEqual(self.f['loc1'], self.f['loc4'].parent)
+
+    def test_get_inventory(self):
+        reply = client.get('/api/inventory_items/', self.f['local_user1'])
+        self.assertEqual(reply.status_code, 200)
+        self.assertEqual(len(reply.json()), 2)
+        self.assertEqual(reply.json()[0]['name'], 'test1')
+        self.assertEqual(reply.json()[0]['description'], 'test')
+        self.assertEqual(reply.json()[0]['owned_quantity'], 1)
+        self.assertEqual(reply.json()[0]['tags'], [])
+        self.assertEqual(reply.json()[0]['properties'], [])
+        self.assertEqual(reply.json()[0]['category'], 'cat1')
+        self.assertEqual(reply.json()[0]['availability_policy'], 'friends')
+        self.assertEqual(reply.json()[1]['name'], 'test2')
+        self.assertEqual(reply.json()[1]['description'], 'test2')
+        self.assertEqual(reply.json()[1]['owned_quantity'], 1)
+        self.assertEqual(reply.json()[1]['tags'], ['tag1', 'tag2'])
+        self.assertEqual(reply.json()[1]['properties'],
+                         [{'name': 'prop1', 'value': 'value1'}, {'name': 'prop2', 'value': 'value2'}])
+        self.assertEqual(reply.json()[1]['category'], 'cat1')
+        self.assertEqual(reply.json()[1]['availability_policy'], 'friends')
+
+    def test_get_inventory_item(self):
+        reply = client.get('/api/storage_locations/', self.f['local_user1'])
+        self.assertEqual(reply.status_code, 200)
+        self.assertEqual(len(reply.json()), 4)
+        self.assertEqual(reply.json()[0]['name'], 'loc1')
+        self.assertEqual(reply.json()[0]['description'], None)
+        self.assertEqual(reply.json()[0]['category'], None)
+        self.assertEqual(reply.json()[0]['path'], 'loc1')
+        self.assertEqual(reply.json()[1]['name'], 'loc2')
+        self.assertEqual(reply.json()[1]['description'], None)
+        self.assertEqual(reply.json()[1]['category'], 'cat1')
+        self.assertEqual(reply.json()[1]['path'], 'loc2')
+        self.assertEqual(reply.json()[2]['name'], 'loc3')
+        self.assertEqual(reply.json()[2]['description'], None)
+        self.assertEqual(reply.json()[2]['category'], None)
+        self.assertEqual(reply.json()[2]['path'], 'loc1/loc3')
+        self.assertEqual(reply.json()[3]['name'], 'loc4')
+        self.assertEqual(reply.json()[3]['description'], None)
+        self.assertEqual(reply.json()[3]['category'], 'cat1')
+        self.assertEqual(reply.json()[3]['path'], 'loc1/loc4')
deploy/dev/Dockerfile.backend (new file, 16 lines)
@@ -0,0 +1,16 @@
+# Use an official Python runtime as instance_a parent image
+FROM python:3.9
+
+# Set environment variables
+ENV PYTHONDONTWRITEBYTECODE 1
+ENV PYTHONUNBUFFERED 1
+
+# Set work directory
+WORKDIR /code
+
+# Install dependencies
+COPY requirements.txt /code/
+RUN pip install --no-cache-dir -r requirements.txt
+
+# Run the application
+CMD ["python", "manage.py", "runserver", "0.0.0.0:8000", "--insecure"]
deploy/dev/Dockerfile.dns (new file, 16 lines)
@@ -0,0 +1,16 @@
+# Use an official Python runtime as instance_a parent image
+FROM python:3.9
+
+# Set environment variables
+ENV PYTHONDONTWRITEBYTECODE 1
+ENV PYTHONUNBUFFERED 1
+
+# Set work directory
+WORKDIR /dns
+
+COPY dns_server.py /dns/
+
+RUN pip install dnslib
+
+# Run the application
+CMD ["python", "dns_server.py"]
deploy/dev/Dockerfile.frontend (new file, 13 lines)
@@ -0,0 +1,13 @@
+# Use an official Node.js runtime as instance_a parent image
+FROM node:14
+
+# Set work directory
+WORKDIR /app
+
+# Install app dependencies
+# A wildcard is used to ensure both package.json AND package-lock.json are copied
+COPY package.json ./
+
+RUN npm install
+
+CMD [ "npm", "run", "dev", "--", "--host"]
deploy/dev/Dockerfile.proxy (new file, 14 lines)
@@ -0,0 +1,14 @@
+FROM nginx:bookworm
+
+# snakeoil for localhost
+
+RUN apt-get update && \
+    apt-get install -y openssl && \
+    openssl genrsa -des3 -passout pass:x -out server.pass.key 2048 && \
+    openssl rsa -passin pass:x -in server.pass.key -out server.key && \
+    rm server.pass.key && \
+    openssl req -new -key server.key -out server.csr \
+        -subj "/C=US/ST=Denial/L=Springfield/O=Dis/CN=localhost" && \
+    openssl x509 -req -days 365 -in server.csr -signkey server.key -out server.crt &&\
+    mv server.crt /etc/nginx/nginx.crt && \
+    mv server.key /etc/nginx/nginx.key \
deploy/dev/Dockerfile.wiki (new file, 15 lines)
@@ -0,0 +1,15 @@
+# Use an official Python runtime as instance_a parent image
+FROM python:3.9
+
+# Set environment variables
+ENV PYTHONDONTWRITEBYTECODE 1
+ENV PYTHONUNBUFFERED 1
+
+# Set work directory
+WORKDIR /wiki
+
+# Install dependencies
+RUN pip install --no-cache-dir mkdocs
+
+# Run the application
+CMD ["mkdocs", "serve", "--dev-addr=0.0.0.0:8001"]
deploy/dev/dns_server.py (new file, 72 lines)
@@ -0,0 +1,72 @@
+import http.server
+import socketserver
+import urllib.parse
+import dnslib
+import base64
+
+try:
+
+    def resolve(zone, qname, qtype):
+        for record in zone:
+            if record["name"] == qname and record["type"] == qtype and "value" in record:
+                return record["value"]
+
+
+    class DnsHttpRequestHandler(http.server.BaseHTTPRequestHandler):
+        def do_GET(self):
+            try:
+                with open("/dns/zone.json", "r") as f:
+                    import json
+                    zone = json.load(f)
+
+                url = urllib.parse.urlparse(self.path)
+                if url.path != "/dns-query":
+                    self.send_response(404)
+                    return
+                query = urllib.parse.parse_qs(url.query)
+                if "dns" not in query:
+                    self.send_response(400)
+                    return
+                query_base64 = query["dns"][0]
+                padded = query_base64 + "=" * (4 - len(query_base64) % 4)
+                raw = base64.b64decode(padded)
+                dns = dnslib.DNSRecord.parse(raw)
+
+                response = dnslib.DNSRecord(dnslib.DNSHeader(id=dns.header.id, qr=1, aa=1, ra=1), q=dns.q)
+
+                record = resolve(zone, dns.q.qname, dnslib.QTYPE[dns.q.qtype])
+                if record:
+                    if dns.q.qtype == dnslib.QTYPE.SRV:
+                        print("SRV record")
+                        reply = dnslib.SRV(record["priority"], record["weight"], record["port"], record["target"])
+                        response.add_answer(dnslib.RR(dns.q.qname, dns.q.qtype, rdata=reply))
+                else:
+                    response.header.rcode = dnslib.RCODE.NXDOMAIN
+
+                print(response)
+
+                self.send_response(200)
+                self.send_header("Content-type", "application/dns-message")
+                self.end_headers()
+                pack = response.pack()
+                self.wfile.write(pack)
+                return
+            except Exception as e:
+                print(f"Error: {e}")
+                self.send_response(500)
+                self.send_header("Content-type", "text/html")
+                self.end_headers()
+                self.wfile.write(b"Internal Server Error")
+
+
+    handler_object = DnsHttpRequestHandler
+
+    PORT = 8053
+    my_server = socketserver.TCPServer(("", PORT), handler_object)
+
+    # Start the server
+    print(f"Starting server on port {PORT}")
+    my_server.serve_forever()
+
+except Exception as e:
+    print(f"Error: {e}")
deploy/dev/instance_a/a.env (new file, 8 lines)
@@ -0,0 +1,8 @@
+
+# SECURITY WARNING: don't run with debug turned on in production!
+DEBUG=True
+
+# SECURITY WARNING: keep the secret key used in production secret!
+SECRET_KEY='e*lm&*!j0_stqaiod$1zob(vs@aq6+n-i$1%!rek)_v9n^ue$3'
+
+ALLOWED_HOSTS="*"
deploy/dev/instance_a/dns.json (new file, 3 lines)
@@ -0,0 +1,3 @@
+[
+    "127.0.0.3:5353"
+]
deploy/dev/instance_a/domains.json (new file, 3 lines)
@@ -0,0 +1,3 @@
+[
+    "a.localhost"
+]
deploy/dev/instance_a/nginx-a.dev.conf (new file, 96 lines)
@@ -0,0 +1,96 @@
+events {}
+
+http {
+    upstream backend {
+        server backend-a:8000;
+    }
+
+    upstream frontend {
+        server frontend:5173;
+    }
+
+    upstream wiki {
+        server wiki:8001;
+    }
+
+    upstream dns {
+        server dns:8053;
+    }
+
+    server {
+
+        listen 8080 ssl;
+        server_name localhost;
+
+        ssl_certificate /etc/nginx/nginx.crt;
+        ssl_certificate_key /etc/nginx/nginx.key;
+
+        location /api {
+            proxy_set_header Host $host:$server_port;
+            proxy_set_header X-Real-IP $remote_addr;
+            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+            proxy_set_header X-Forwarded-Proto $scheme;
+            proxy_set_header X-Forwarded-Host $host:$server_port;
+            proxy_set_header X-Forwarded-Port $server_port;
+            proxy_pass http://backend;
+        }
+
+        location /auth {
+            proxy_set_header Host $host:$server_port;
+            proxy_set_header X-Real-IP $remote_addr;
+            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+            proxy_set_header X-Forwarded-Proto $scheme;
+            proxy_set_header X-Forwarded-Host $host:$server_port;
+            proxy_set_header X-Forwarded-Port $server_port;
+            proxy_pass http://backend;
+        }
+
+        location /docs {
+            proxy_pass http://backend/docs;
+        }
+
+        location /static {
+            proxy_pass http://backend/static;
+        }
+
+        location /wiki {
+            proxy_pass http://wiki/wiki;
+        }
+
+        location /livereload {
+            proxy_pass http://wiki/livereload;
+        }
+
+        location /local/ {
+            alias /var/www/;
+            try_files $uri.json =404;
+            add_header Content-Type application/json;
+        }
+
+        location / {
+            proxy_http_version 1.1;
+            proxy_set_header Upgrade $http_upgrade;
+            proxy_set_header Connection "Upgrade";
+            proxy_set_header Host $host;
+            proxy_pass http://frontend;
+        }
+
+    }
+
+    # DoH server
+    server {
+        listen 5353 ssl;
+        server_name localhost;
+
+        ssl_certificate /etc/nginx/nginx.crt;
+        ssl_certificate_key /etc/nginx/nginx.key;
+
+        location /dns-query {
+            proxy_pass http://dns;
+            # allow any origin
+            add_header 'Access-Control-Allow-Origin' '*';
+            add_header 'Access-Control-Allow-Methods' 'GET, OPTIONS';
+
+        }
+    }
+}
deploy/dev/instance_b/b.env (new file, 7 lines)
@@ -0,0 +1,7 @@
+# SECURITY WARNING: don't run with debug turned on in production!
+DEBUG=True
+
+# SECURITY WARNING: keep the secret key used in production secret!
+SECRET_KEY='7ccxjje%q@@0*z+r&-$fy3(rj9n)%$!sk-k++-&rb=_u(wpjbe'
+
+ALLOWED_HOSTS="*"
deploy/dev/instance_b/nginx-b.dev.conf (new file, 46 lines)
@@ -0,0 +1,46 @@
+events {}
+
+http {
+    upstream backend {
+        server backend-b:8000;
+    }
+
+    server {
+
+        listen 8080 ssl;
+        server_name localhost;
+
+        ssl_certificate /etc/nginx/nginx.crt;
+        ssl_certificate_key /etc/nginx/nginx.key;
+
+        location /api {
+            #proxy_set_header X-Forwarded-For "$http_x_forwarded_for, $realip_remote_addr";
+            proxy_set_header Host $host:$server_port;
+            proxy_set_header X-Real-IP $remote_addr;
+            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+            proxy_set_header X-Forwarded-Proto $scheme;
+            proxy_set_header X-Forwarded-Host $host:$server_port;
+            proxy_set_header X-Forwarded-Port $server_port;
+            proxy_pass http://backend;
+        }
+
+        location /auth {
+            #proxy_set_header X-Forwarded-For "$http_x_forwarded_for, $realip_remote_addr";
+            proxy_set_header Host $host:$server_port;
+            proxy_set_header X-Real-IP $remote_addr;
+            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+            proxy_set_header X-Forwarded-Proto $scheme;
+            proxy_set_header X-Forwarded-Host $host:$server_port;
+            proxy_set_header X-Forwarded-Port $server_port;
+            proxy_pass http://backend;
+        }
+
+        location /docs {
+            proxy_pass http://backend/docs;
+        }
+
+        location /static {
+            proxy_pass http://backend/static;
+        }
+    }
+}
deploy/dev/zone.json (new file, 24 lines)
@@ -0,0 +1,24 @@
+[
+  {
+    "name": "_toolshed-server._tcp.a.localhost.",
+    "type": "SRV",
+    "ttl": 60,
+    "value": {
+      "priority": 0,
+      "weight": 5,
+      "port": 8080,
+      "target": "127.0.0.1."
+    }
+  },
+  {
+    "name": "_toolshed-server._tcp.b.localhost.",
+    "type": "SRV",
+    "ttl": 60,
+    "value": {
+      "priority": 0,
+      "weight": 5,
+      "port": 8080,
+      "target": "127.0.0.2."
+    }
+  }
+]
deploy/docker-compose.override.yml (new file, 78 lines)
@@ -0,0 +1,78 @@
+version: '3.8'
+
+services:
+  backend-a:
+    build:
+      context: ../backend/
+      dockerfile: ../deploy/dev/Dockerfile.backend
+    volumes:
+      - ../backend:/code
+      - ../deploy/dev/instance_a/a.env:/code/.env
+      - ../deploy/dev/instance_a/a.sqlite3:/code/db.sqlite3
+    expose:
+      - 8000
+    command: bash -c "python configure.py; python configure.py testdata; python manage.py runserver 0.0.0.0:8000 --insecure"
+
+  backend-b:
+    build:
+      context: ../backend/
+      dockerfile: ../deploy/dev/Dockerfile.backend
+    volumes:
+      - ../backend:/code
+      - ../deploy/dev/instance_b/b.env:/code/.env
+      - ../deploy/dev/instance_b/b.sqlite3:/code/db.sqlite3
+    expose:
+      - 8000
+    command: bash -c "python configure.py; python configure.py testdata; python manage.py runserver 0.0.0.0:8000 --insecure"
+
+  frontend:
+    build:
+      context: ../frontend/
+      dockerfile: ../deploy/dev/Dockerfile.frontend
+    volumes:
+      - ../frontend:/app:ro
+      - /app/node_modules
+    expose:
+      - 5173
+    command: npm run dev -- --host
+
+  wiki:
+    build:
+      context: ../
+      dockerfile: deploy/dev/Dockerfile.wiki
+    volumes:
+      - ../mkdocs.yml:/wiki/mkdocs.yml
+      - ../docs:/wiki/docs
+    expose:
+      - 8001
+    command: mkdocs serve --dev-addr=0.0.0.0:8001
+
+  proxy-a:
+    build:
+      context: ./
+      dockerfile: dev/Dockerfile.proxy
+    volumes:
+      - ./dev/instance_a/nginx-a.dev.conf:/etc/nginx/nginx.conf:ro
+      - ./dev/instance_a/dns.json:/var/www/dns.json:ro
+      - ./dev/instance_a/domains.json:/var/www/domains.json:ro
+    ports:
+      - "127.0.0.1:8080:8080"
+      - "127.0.0.3:5353:5353"
+
+  proxy-b:
+    build:
+      context: ./
+      dockerfile: dev/Dockerfile.proxy
+    volumes:
+      - ./dev/instance_b/nginx-b.dev.conf:/etc/nginx/nginx.conf:ro
+    ports:
+      - "127.0.0.2:8080:8080"
+
+  dns:
+    build:
+      context: ./dev/
+      dockerfile: Dockerfile.dns
+    volumes:
+      - ./dev/zone.json:/dns/zone.json
+    expose:
+      - 8053
|
@ -1,43 +0,0 @@
|
||||||
version: '3.3'
|
|
||||||
|
|
||||||
services:
|
|
||||||
# db:
|
|
||||||
# image: postgres
|
|
||||||
# container_name: docker-django-vue-db
|
|
||||||
# environment:
|
|
||||||
# POSTGRES_USER: user
|
|
||||||
# POSTGRES_PASSWORD: pass
|
|
||||||
# POSTGRES_DB: db
|
|
||||||
# restart: unless-stopped
|
|
||||||
# ports:
|
|
||||||
# - "5432:5432"
|
|
||||||
django:
|
|
||||||
build:
|
|
||||||
context: ./backend
|
|
||||||
dockerfile: ./Dockerfile
|
|
||||||
command: python backend/manage.py runserver 0.0.0.0:8000
|
|
||||||
volumes:
|
|
||||||
- .:/app
|
|
||||||
ports:
|
|
||||||
- "8002:8000"
|
|
||||||
networks:
|
|
||||||
- internal
|
|
||||||
# depends_on:
|
|
||||||
# - db
|
|
||||||
vue:
|
|
||||||
build:
|
|
||||||
context: ./frontend
|
|
||||||
dockerfile: ./Dockerfile
|
|
||||||
command: nginx -g 'daemon off;'
|
|
||||||
volumes:
|
|
||||||
- .:/app
|
|
||||||
ports:
|
|
||||||
- "8001:80"
|
|
||||||
networks:
|
|
||||||
- internal
|
|
||||||
- external
|
|
||||||
depends_on:
|
|
||||||
- django
|
|
||||||
networks:
|
|
||||||
external:
|
|
||||||
internal:
|
|
docs/deployment.md (new file, 102 lines)
@@ -0,0 +1,102 @@
+# Deployment
+
+## Native
+
+### Requirements
+
+- python3
+- python3-pip
+- python3-venv
+- wget
+- unzip
+- nginx
+- uwsgi
+- certbot
+
+### Installation
+
+Get the latest release:
+
+``` bash
+cd /var/www # or wherever you want to install toolshed
+wget https://git.neulandlabor.de/j3d1/toolshed/releases/download/<version>/toolshed.zip
+```
+
+or from github:
+
+``` bash
+cd /var/www # or wherever you want to install toolshed
+wget https://github.com/gr4yj3d1/toolshed/archive/refs/tags/<version>.zip -O toolshed.zip
+```
+
+Extract and configure the backend:
+
+``` bash
+unzip toolshed.zip
+cd toolshed/backend
+python3 -m venv venv
+source venv/bin/activate
+pip install -r requirements.txt
+python configure.py
+```
+
+Configure uWSGI to serve the backend locally:
+
+``` bash
+cd /var/www/toolshed/backend
+cp toolshed.ini /etc/uwsgi/apps-available/
+ln -s /etc/uwsgi/apps-available/toolshed.ini /etc/uwsgi/apps-enabled/
+systemctl restart uwsgi
+```
+
+Configure nginx to serve the static files and proxy the requests to the backend:
+
+``` bash
+cd /var/www/toolshed/backend
+cp toolshed.nginx /etc/nginx/sites-available/toolshed
+ln -s /etc/nginx/sites-available/toolshed /etc/nginx/sites-enabled/
+systemctl restart nginx
+```
+
+Configure certbot to get a certificate for the domain:
+
+``` bash
+certbot --nginx -d <domain>
+```
+
+### Update
+
+``` bash
+cd /var/www
+wget https://git.neulandlabor.de/j3d1/toolshed/releases/download/<version>/toolshed.zip
+unzip toolshed.zip
+cd toolshed/backend
+source venv/bin/activate
+pip install -r requirements.txt
+python configure.py
+systemctl restart uwsgi
+```
+
+## Docker
+
+### Requirements
+
+- docker
+- docker-compose
+- git
+
+### Installation
+
+``` bash
+git clone https://git.neulandlabor.de/j3d1/toolshed.git
+# or
+git clone https://github.com/gr4yj3d1/toolshed.git
+cd toolshed
+docker-compose -f deploy/docker-compose.prod.yml up -d --build
+```
+
+### Update
+
+``` bash
+cd toolshed
+git pull
+docker-compose -f deploy/docker-compose.prod.yml up -d --build
+```
docs/development.md (new file, 105 lines)
@@ -0,0 +1,105 @@
+# Development
+
+``` bash
+git clone https://github.com/gr4yj3d1/toolshed.git
+```
+
+or
+
+``` bash
+git clone https://git.neulandlabor.de/j3d1/toolshed.git
+```
+
+## Native
+
+To a certain extent, the frontend and backend can be developed independently. The frontend is a Vue.js project and the
+backend is a DRF (Django REST Framework) project. If you want to develop the frontend, you can do so without the
+backend and vice versa. However, especially for the frontend, it is recommended to use the backend as well, as the
+frontend does not have a lot of 'offline' functionality.
+If you want to run the fullstack application, it is recommended to use the [docker-compose](#docker) method.
+
+### Frontend
+
+Install `node.js` and `npm`
+
+on Debian* for example: `sudo apt install npm`
+
+``` bash
+cd toolshed/frontend
+npm install
+npm run dev
+```
+
+### Backend
+
+Install `python3`, `pip` and `virtualenv`
+
+on Debian* for example: `sudo apt install python3 python3-pip python3-venv`
+
+Prepare the backend environment:
+
+``` bash
+cd toolshed/backend
+python -m venv venv
+source venv/bin/activate
+pip install -r requirements.txt
+```
+
+Run the test suite:
+
+``` bash
+python manage.py test
+```
+
+optionally with coverage:
+
+``` bash
+coverage run manage.py test
+coverage report
+```
+
+Start the backend in development mode:
+
+``` bash
+python manage.py migrate
+cp .env.dist .env
+echo "DEBUG = True" >> .env
+python manage.py runserver 0.0.0.0:8000
+```
+
+This provides the api docs at `http://localhost:8000/docs/`.
+
+### Docs (Wiki)
+
+Install `mkdocs`
+
+on Debian* for example: `sudo apt install mkdocs`
+
+Start the docs server:
+
+``` bash
+cd toolshed/docs
+mkdocs serve -a 0.0.0.0:8080
+```
+
+## Docker
+
+### Fullstack
+
+Install `docker` and `docker-compose`
+
+on Debian* for example: `sudo apt install docker.io docker-compose`
+
+Start the fullstack application:
+
+``` bash
+docker-compose -f deploy/docker-compose.override.yml up --build
+```
+
+This will start an instance of the frontend and wiki, a limited DoH (DNS over HTTPS) server and **two** instances of
+the backend. The two backend instances are set up to use the domains `a.localhost` and `b.localhost`; the local DoH
+server is used to direct the frontend to the correct backend instance.
+The frontend is configured to act as if it was served from the domain `a.localhost`.
+Access the frontend at `http://localhost:8080/`, the backend at `http://localhost:8080/api/`, the api docs
+at `http://localhost:8080/docs/` and the wiki at `http://localhost:8080/wiki/`.
docs/federation.md (new file, 23 lines)
@@ -0,0 +1,23 @@
+# Federation
+
+This section will cover how federation works in Toolshed.
+
+## What is Federation?
+
+Since users of Toolshed can search and interact with the inventory of all their 'friends', who are potentially on
+different servers, there needs to be a way for servers to communicate with each other. We don't want to rely on a
+central server that stores all the data, and we don't want a central server that handles all the communication between
+servers. This is where federation comes in. Toolshed uses a protocol that can exchange data not only with the server
+where the user is registered but also with the servers where their friends are registered.
+
+## How does it work?
+
+Any user can register on any server and creates a personal key pair. The public key is stored on the server and the
+private key is stored on the client. The private key is used to sign all requests to the server and the public key is
+used to verify the signature. Once a user has registered on a server they can send friend requests to other users
+containing their public key. If the other user accepts the friend request, the server stores the public key of the
+friend and uses it to verify access to the friend's inventory. While accepting a friend request the user also
+automatically sends their own public key to the friend's server. This way both users can access each other's inventory.
+
+The protocol is based on a simple HTTPS API exchanging JSON data that is signed with the user's private key. By default
+Toolshed servers provide documentation of the API at [/docs/api](/docs/api).
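A minimal sketch of the signing idea described above, assuming PyNaCl and Ed25519 keys purely for illustration — the actual key encoding, request canonicalization and header names are whatever the Toolshed API documentation specifies, not what is shown here:

``` python
import json

import nacl.encoding
import nacl.signing

# Client side: the private (signing) key is generated once and never leaves the client.
signing_key = nacl.signing.SigningKey.generate()
public_key_hex = signing_key.verify_key.encode(nacl.encoding.HexEncoder).decode()  # shared in the friend request

body = json.dumps({'name': 'cordless drill'}).encode()   # hypothetical request payload
signature_hex = signing_key.sign(body).signature.hex()   # sent alongside the HTTPS request

# Server side: verify the payload against the stored public key of the friend.
verify_key = nacl.signing.VerifyKey(public_key_hex, encoder=nacl.encoding.HexEncoder)
verify_key.verify(body, bytes.fromhex(signature_hex))    # raises BadSignatureError if tampered with
```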
@@ -6,47 +6,8 @@ This is the documentation for the Toolshed project. It is a work in progress.
 `#social` `#network` `#federation` `#decentralized` `#federated` `#socialnetwork` `#fediverse` `#community` `#hashtags`

 ## Getting Started
+
+- [Deploying Toolshed](deployment.md)
+- [Development Setup](development.md)
+- [About Federation](federation.md)
-
-## Installation
-
-``` bash
-# TODO add installation instructions
-# similar to development instructions just with more docker
-# TODO add docker-compose.yml
-```
-
-## Development
-
-``` bash
-git clone https://github.com/gr4yj3d1/toolshed.git
-```
-or
-``` bash
-git clone https://git.neulandlabor.de/j3d1/toolshed.git
-```
-
-### Frontend
-
-``` bash
-cd toolshed/frontend
-npm install
-npm run dev
-```
-
-### Backend
-
-``` bash
-cd toolshed/backend
-python3 -m venv venv
-source venv/bin/activate
-pip install -r requirements.txt
-python manage.py migrate
-python manage.py runserver 0.0.0.0:8000
-```
-
-### Docs
-
-``` bash
-cd toolshed/docs
-mkdocs serve -a 0.0.0.0:8080
-```
(deleted file, 14 lines)
@@ -1,14 +0,0 @@
-FROM node:alpine as builder
-WORKDIR /app
-COPY ./package.json /app/package.json
-COPY . /app
-RUN npm install
-RUN npm run build
-
-
-FROM nginx:alpine as runner
-RUN apk add --update npm
-WORKDIR /app
-COPY --from=builder /app/dist /usr/share/nginx/html
-COPY ./nginx.conf /etc/nginx/nginx.conf
-EXPOSE 80
@ -1,87 +0,0 @@
|
||||||
-----BEGIN CERTIFICATE-----
|
|
||||||
MIIEXTCCA0WgAwIBAgISBEHk0Sh8UrMfT+VPkRhr83mfMA0GCSqGSIb3DQEBCwUA
|
|
||||||
MDIxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1MZXQncyBFbmNyeXB0MQswCQYDVQQD
|
|
||||||
EwJSMzAeFw0yMzA2MDIxNzI1MDVaFw0yMzA4MzExNzI1MDRaMBsxGTAXBgNVBAMT
|
|
||||||
EHRvb2xzaGVkLmozZDEuZGUwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAARYeMpT
|
|
||||||
k1DaC8cigL3DivanGrLQahYBEDm5B26VaS3gUmq9T0RNkEUxJIPnZBwdF8p7xAEB
|
|
||||||
hlTXwgy3eBLAp8lAo4ICTTCCAkkwDgYDVR0PAQH/BAQDAgeAMB0GA1UdJQQWMBQG
|
|
||||||
CCsGAQUFBwMBBggrBgEFBQcDAjAMBgNVHRMBAf8EAjAAMB0GA1UdDgQWBBTRtqOn
|
|
||||||
Qht7rD+2IEGzuYt8frTgVjAfBgNVHSMEGDAWgBQULrMXt1hWy65QCUDmH6+dixTC
|
|
||||||
xjBVBggrBgEFBQcBAQRJMEcwIQYIKwYBBQUHMAGGFWh0dHA6Ly9yMy5vLmxlbmNy
|
|
||||||
Lm9yZzAiBggrBgEFBQcwAoYWaHR0cDovL3IzLmkubGVuY3Iub3JnLzAbBgNVHREE
|
|
||||||
FDASghB0b29sc2hlZC5qM2QxLmRlMEwGA1UdIARFMEMwCAYGZ4EMAQIBMDcGCysG
|
|
||||||
AQQBgt8TAQEBMCgwJgYIKwYBBQUHAgEWGmh0dHA6Ly9jcHMubGV0c2VuY3J5cHQu
|
|
||||||
b3JnMIIBBgYKKwYBBAHWeQIEAgSB9wSB9ADyAHcAtz77JN+cTbp18jnFulj0bF38
|
|
||||||
Qs96nzXEnh0JgSXttJkAAAGIfVskxgAABAMASDBGAiEA+D8rCaCpttJm7w0M4N5N
|
|
||||||
3cmJSfPNmh/t2ojaDB0iSe0CIQCS2XkwJzUrDZ35fIJ9evwJduk/K2I/tmWs4Uk5
|
|
||||||
vnPSNQB3AK33vvp8/xDIi509nB4+GGq0Zyldz7EMJMqFhjTr3IKKAAABiH1bJOoA
|
|
||||||
AAQDAEgwRgIhAPHQQwLf5xSi1VH6BeOpiUKyTMawd36FFU8eCIdB43q6AiEA0KDD
|
|
||||||
yRssPcmGnyWDGq9Of3mpKjCChFrnxzpeDXCTlsswDQYJKoZIhvcNAQELBQADggEB
|
|
||||||
AKo8APReSKNTydks9yqASKhUjuLfXS+mpFQSl2tbU8ER6eIiYHx8o+n2QCdT7h91
|
|
||||||
ZLkGx8ZAmWBvVwXC3QPH5W08ilogi4EU/+HGffkditG5K6/Qn2bzjqnmFIyYqgdT
|
|
||||||
RVaRcxqS9byAGEw3oU5FSCIOuFSBeOHeTwaj+lSVMZTv6LmoovOpCo8sA5xZ6K6H
|
|
XVwNXIwunssaR4MrnWupB/5N+T7zkhanky4GgiLRuTm+mDbK+OIDx47Hv9jTe+tm
s4aixD0eWhzAaiA7HuHJI3Xi64YjK7eNlrwE0ZKdgy8KveDwUBiVcVtz7LR+0v1l
P27Z/OkZlA+42LvIJdISMl4=
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIIFFjCCAv6gAwIBAgIRAJErCErPDBinU/bWLiWnX1owDQYJKoZIhvcNAQELBQAw
TzELMAkGA1UEBhMCVVMxKTAnBgNVBAoTIEludGVybmV0IFNlY3VyaXR5IFJlc2Vh
cmNoIEdyb3VwMRUwEwYDVQQDEwxJU1JHIFJvb3QgWDEwHhcNMjAwOTA0MDAwMDAw
WhcNMjUwOTE1MTYwMDAwWjAyMQswCQYDVQQGEwJVUzEWMBQGA1UEChMNTGV0J3Mg
RW5jcnlwdDELMAkGA1UEAxMCUjMwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEK
AoIBAQC7AhUozPaglNMPEuyNVZLD+ILxmaZ6QoinXSaqtSu5xUyxr45r+XXIo9cP
R5QUVTVXjJ6oojkZ9YI8QqlObvU7wy7bjcCwXPNZOOftz2nwWgsbvsCUJCWH+jdx
sxPnHKzhm+/b5DtFUkWWqcFTzjTIUu61ru2P3mBw4qVUq7ZtDpelQDRrK9O8Zutm
NHz6a4uPVymZ+DAXXbpyb/uBxa3Shlg9F8fnCbvxK/eG3MHacV3URuPMrSXBiLxg
Z3Vms/EY96Jc5lP/Ooi2R6X/ExjqmAl3P51T+c8B5fWmcBcUr2Ok/5mzk53cU6cG
/kiFHaFpriV1uxPMUgP17VGhi9sVAgMBAAGjggEIMIIBBDAOBgNVHQ8BAf8EBAMC
AYYwHQYDVR0lBBYwFAYIKwYBBQUHAwIGCCsGAQUFBwMBMBIGA1UdEwEB/wQIMAYB
Af8CAQAwHQYDVR0OBBYEFBQusxe3WFbLrlAJQOYfr52LFMLGMB8GA1UdIwQYMBaA
FHm0WeZ7tuXkAXOACIjIGlj26ZtuMDIGCCsGAQUFBwEBBCYwJDAiBggrBgEFBQcw
AoYWaHR0cDovL3gxLmkubGVuY3Iub3JnLzAnBgNVHR8EIDAeMBygGqAYhhZodHRw
Oi8veDEuYy5sZW5jci5vcmcvMCIGA1UdIAQbMBkwCAYGZ4EMAQIBMA0GCysGAQQB
gt8TAQEBMA0GCSqGSIb3DQEBCwUAA4ICAQCFyk5HPqP3hUSFvNVneLKYY611TR6W
PTNlclQtgaDqw+34IL9fzLdwALduO/ZelN7kIJ+m74uyA+eitRY8kc607TkC53wl
ikfmZW4/RvTZ8M6UK+5UzhK8jCdLuMGYL6KvzXGRSgi3yLgjewQtCPkIVz6D2QQz
CkcheAmCJ8MqyJu5zlzyZMjAvnnAT45tRAxekrsu94sQ4egdRCnbWSDtY7kh+BIm
lJNXoB1lBMEKIq4QDUOXoRgffuDghje1WrG9ML+Hbisq/yFOGwXD9RiX8F6sw6W4
avAuvDszue5L3sz85K+EC4Y/wFVDNvZo4TYXao6Z0f+lQKc0t8DQYzk1OXVu8rp2
yJMC6alLbBfODALZvYH7n7do1AZls4I9d1P4jnkDrQoxB3UqQ9hVl3LEKQ73xF1O
yK5GhDDX8oVfGKF5u+decIsH4YaTw7mP3GFxJSqv3+0lUFJoi5Lc5da149p90Ids
hCExroL1+7mryIkXPeFM5TgO9r0rvZaBFOvV2z0gp35Z0+L4WPlbuEjN/lxPFin+
HlUjr8gRsI3qfJOQFy/9rKIJR0Y/8Omwt/8oTWgy1mdeHmmjk7j1nYsvC9JSQ6Zv
MldlTTKB3zhThV1+XWYp6rjd5JW1zbVWEkLNxE7GJThEUG3szgBVGP7pSWTUTsqX
nLRbwHOoq7hHwg==
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIIFYDCCBEigAwIBAgIQQAF3ITfU6UK47naqPGQKtzANBgkqhkiG9w0BAQsFADA/
MSQwIgYDVQQKExtEaWdpdGFsIFNpZ25hdHVyZSBUcnVzdCBDby4xFzAVBgNVBAMT
DkRTVCBSb290IENBIFgzMB4XDTIxMDEyMDE5MTQwM1oXDTI0MDkzMDE4MTQwM1ow
TzELMAkGA1UEBhMCVVMxKTAnBgNVBAoTIEludGVybmV0IFNlY3VyaXR5IFJlc2Vh
cmNoIEdyb3VwMRUwEwYDVQQDEwxJU1JHIFJvb3QgWDEwggIiMA0GCSqGSIb3DQEB
AQUAA4ICDwAwggIKAoICAQCt6CRz9BQ385ueK1coHIe+3LffOJCMbjzmV6B493XC
ov71am72AE8o295ohmxEk7axY/0UEmu/H9LqMZshftEzPLpI9d1537O4/xLxIZpL
wYqGcWlKZmZsj348cL+tKSIG8+TA5oCu4kuPt5l+lAOf00eXfJlII1PoOK5PCm+D
LtFJV4yAdLbaL9A4jXsDcCEbdfIwPPqPrt3aY6vrFk/CjhFLfs8L6P+1dy70sntK
4EwSJQxwjQMpoOFTJOwT2e4ZvxCzSow/iaNhUd6shweU9GNx7C7ib1uYgeGJXDR5
bHbvO5BieebbpJovJsXQEOEO3tkQjhb7t/eo98flAgeYjzYIlefiN5YNNnWe+w5y
sR2bvAP5SQXYgd0FtCrWQemsAXaVCg/Y39W9Eh81LygXbNKYwagJZHduRze6zqxZ
Xmidf3LWicUGQSk+WT7dJvUkyRGnWqNMQB9GoZm1pzpRboY7nn1ypxIFeFntPlF4
FQsDj43QLwWyPntKHEtzBRL8xurgUBN8Q5N0s8p0544fAQjQMNRbcTa0B7rBMDBc
SLeCO5imfWCKoqMpgsy6vYMEG6KDA0Gh1gXxG8K28Kh8hjtGqEgqiNx2mna/H2ql
PRmP6zjzZN7IKw0KKP/32+IVQtQi0Cdd4Xn+GOdwiK1O5tmLOsbdJ1Fu/7xk9TND
TwIDAQABo4IBRjCCAUIwDwYDVR0TAQH/BAUwAwEB/zAOBgNVHQ8BAf8EBAMCAQYw
SwYIKwYBBQUHAQEEPzA9MDsGCCsGAQUFBzAChi9odHRwOi8vYXBwcy5pZGVudHJ1
c3QuY29tL3Jvb3RzL2RzdHJvb3RjYXgzLnA3YzAfBgNVHSMEGDAWgBTEp7Gkeyxx
+tvhS5B1/8QVYIWJEDBUBgNVHSAETTBLMAgGBmeBDAECATA/BgsrBgEEAYLfEwEB
ATAwMC4GCCsGAQUFBwIBFiJodHRwOi8vY3BzLnJvb3QteDEubGV0c2VuY3J5cHQu
b3JnMDwGA1UdHwQ1MDMwMaAvoC2GK2h0dHA6Ly9jcmwuaWRlbnRydXN0LmNvbS9E
U1RST09UQ0FYM0NSTC5jcmwwHQYDVR0OBBYEFHm0WeZ7tuXkAXOACIjIGlj26Ztu
MA0GCSqGSIb3DQEBCwUAA4IBAQAKcwBslm7/DlLQrt2M51oGrS+o44+/yQoDFVDC
5WxCu2+b9LRPwkSICHXM6webFGJueN7sJ7o5XPWioW5WlHAQU7G75K/QosMrAdSW
9MUgNTP52GE24HGNtLi1qoJFlcDyqSMo59ahy2cI2qBDLKobkx/J3vWraV0T9VuG
WCLKTVXkcGdtwlfFRjlBz4pYg1htmf5X6DYO8A4jqv2Il9DjXA6USbW1FzXSLr9O
he8Y4IWS6wY7bCkjCWDcRQJMEhg76fsO3txE+FiYruq9RUWhiF1myv4Q6W+CyBFC
Dfvp7OOGAN6dEOM4+qR9sdjoSYKEBpsr6GtPAQw4dy753ec5
-----END CERTIFICATE-----
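The removed `frontend/fullchain.pem` above appears to be a Let's Encrypt chain: the R3 intermediate plus the ISRG Root X1 cross-sign issued by DST Root CA X3, whose validity ends in September 2024. Before deleting or replacing a bundle like this it can help to confirm what it actually contains; the command below is a generic sketch, assuming `openssl` is available and the path matches the repository layout shown in this diff.

``` bash
# Print subject and issuer for every certificate in the bundle,
# not just the first one.
openssl crl2pkcs7 -nocrl -certfile frontend/fullchain.pem \
    | openssl pkcs7 -print_certs -noout
```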
@ -1,14 +0,0 @@
server {
    listen 80;
    server_name localhost;

    location / {
        root /usr/share/nginx/html;
        index index.html index.htm;
    }

    location /api {
        proxy_pass http://django:8000;
    }

}
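The deleted `frontend/nginx.conf` above served the built frontend from `/usr/share/nginx/html` and forwarded `/api` to the Django container. If equivalent routing is set up elsewhere, a quick smoke test from the host is sketched below; the host name and `/api` prefix come from the removed file, while the use of port 80 on `localhost` is an assumption about the local setup.

``` bash
# The first request should return the frontend's index.html,
# the second should be proxied through to the Django backend.
curl -i http://localhost/
curl -i http://localhost/api/
```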
0 frontend/src/assets/css/toolshed.scss → frontend/node_modules/.forgit (generated, vendored)
1430 frontend/package-lock.json (generated)
File diff suppressed because it is too large.
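The `package-lock.json` diff is suppressed in this view because of its size (1430 changed lines). It can still be inspected in a local checkout; `<base>` and `<head>` below are placeholders for the two compared refs.

``` bash
git diff <base>..<head> -- frontend/package-lock.json | less
```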
@ -8,6 +8,7 @@
     "preview": "vite preview"
   },
   "dependencies": {
+    "bootstrap": "^4.6.2",
     "bootstrap-icons-vue": "^1.10.3",
     "dns-query": "^0.11.2",
     "js-nacl": "^1.4.0",
@ -21,7 +22,7 @@
     "@vitejs/plugin-vue": "^4.0.0",
     "@vue/test-utils": "^2.3.2",
     "jsdom": "^22.0.0",
-    "sass": "^1.62.1",
+    "sass": "^1.72.0",
     "vite": "^4.1.4",
     "vitest": "^0.31.1"
   }
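The hunk above adds `bootstrap` as a runtime dependency and bumps the `sass` dev dependency from ^1.62.1 to ^1.72.0. One way to reproduce the same `package.json` changes in a working copy is sketched below, assuming npm is the package manager in use.

``` bash
cd frontend
npm install "bootstrap@^4.6.2"
npm install --save-dev "sass@^1.72.0"
```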
@ -1,5 +0,0 @@
-----BEGIN PRIVATE KEY-----
MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQgD6EmCAUWob1zUw4Q
F+Pf9cSOmSCTODe6u+Gst177IoihRANCAARYeMpTk1DaC8cigL3DivanGrLQahYB
EDm5B26VaS3gUmq9T0RNkEUxJIPnZBwdF8p7xAEBhlTXwgy3eBLAp8lA
-----END PRIVATE KEY-----
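The file removed above, `frontend/privkey.pem`, appears to be a P-256 private key that had been committed to the repository, presumably paired with the deleted `fullchain.pem` for local HTTPS. If a throwaway key/certificate pair is still needed for development, it can be regenerated locally instead of being tracked in git; the commands below are a sketch that reuses the removed file names and is not part of the project's tooling.

``` bash
# Generate a fresh P-256 key and a self-signed certificate for local testing.
openssl ecparam -name prime256v1 -genkey -noout -out privkey.pem
openssl req -x509 -new -key privkey.pem -sha256 -days 365 \
    -subj "/CN=localhost" -out fullchain.pem
```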
Binary file not shown. (before: 1.3 MiB)
Binary file not shown. (before: 1.1 MiB)
Binary file not shown. (before: 1.3 MiB)
Binary file not shown. (before: 1 MiB)
Binary file not shown. (before: 1.1 MiB)
Binary file not shown. (before: 1.2 MiB)
Binary file not shown. (before: 1.1 MiB)
Binary file not shown. (before: 888 KiB)
Binary file not shown. (before: 970 KiB)
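The nine entries above appear to be the image assets under `frontend/public/assets/img` that were dropped in this range; the rendered diff hides binary content, so only the old sizes are visible. A local stat diff shows exactly which files went away; `<base>` and `<head>` are again placeholders for the compared refs.

``` bash
git diff --stat <base>..<head> -- frontend/public/assets/img
```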
@ -3,23 +3,13 @@

 <template>
   <router-view></router-view>
-  <!-- TODO UI für Freunde liste, add, remove -->
 </template>

 <script>


-import {mapMutations} from 'vuex';
-import store from '@/store';
-
 export default {
-  name: 'App',
-  methods: {
-    ...mapMutations(['init']),
-  },
-  beforeCreate () {
-    store.commit('load_local')
-  }
+  name: 'App'
 }
 </script>

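The `App.vue` hunk above strips the Vuex bootstrapping from the root component: the `mapMutations` import and the `store.commit('load_local')` call in `beforeCreate` are gone. After a change like this it is worth confirming that nothing else still expects that initialization to happen in `App.vue`; a crude but effective check is a repository-wide search, sketched below.

``` bash
# Any remaining references to the removed mutation show up here.
grep -rn "load_local" frontend/src
```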
File diff suppressed because it is too large
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Some files were not shown because too many files have changed in this diff.