# HG changeset patch
# User Brian Neal
# Date 1336255848 18000
# Node ID ee87ea74d46bf8cc83c7705ace03fa1e5dc04fd1
# Parent c525f3e0b5d05c433471c6b0463cb82071109dfa
For Django 1.4, rearranged project structure for new manage.py.

diff -r c525f3e0b5d0 -r ee87ea74d46b .hgignore
--- a/.hgignore Sat May 05 15:08:07 2012 -0500
+++ b/.hgignore Sat May 05 17:10:48 2012 -0500
@@ -13,6 +13,5 @@
 media/podcast
 media/potd
 media/smiley
-gpp/celerybeat-schedule
-gpp/xapian_index
-
+celerybeat-schedule
+sg101/xapian_index
diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/accounts/__init__.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,37 @@
+import datetime
+import logging
+
+from django.contrib.auth.models import User
+
+
+def create_new_user(pending_user, ip=None, admin_activation=False):
+    """
+    This function contains the code to create a new user from a
+    pending user. The pending user is deleted and the new user
+    is saved. A log message is produced. If admin_activation is false,
+    then ip should be the user's IP they confirmed from, if available.
+
+    """
+    new_user = User()
+
+    new_user.username = pending_user.username
+    new_user.first_name = ''
+    new_user.last_name = ''
+    new_user.email = pending_user.email
+    new_user.password = pending_user.password  # already been hashed
+    new_user.is_staff = False
+    new_user.is_active = True
+    new_user.is_superuser = False
+    new_user.last_login = datetime.datetime.now()
+    new_user.date_joined = new_user.last_login
+
+    new_user.save()
+    pending_user.delete()
+
+    if admin_activation:
+        msg = 'Accounts registration confirmed by ADMIN for %s' % new_user.username
+    else:
+        msg = 'Accounts registration confirmed by USER for %s from %s' % (
+            new_user.username, ip)
+
+    logging.info(msg)
diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/admin.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/accounts/admin.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,29 @@
+"""This file contains the automatic admin site definitions for the accounts Models"""
+
+from django.contrib import admin
+from accounts.models import IllegalUsername
+from accounts.models import IllegalEmail
+from accounts.models import PendingUser
+from accounts import create_new_user
+
+
+class PendingUserAdmin(admin.ModelAdmin):
+    list_display = ('username', 'email', 'date_joined')
+    actions = ('activate_account', )
+
+    def activate_account(self, request, qs):
+        """
+        Activate the accounts of the selected pending users.
+
+        """
+        for pending_user in qs:
+            create_new_user(pending_user, admin_activation=True)
+
+        self.message_user(request, "%s accounts activated" % qs.count())
+
+    activate_account.short_description = "Activate accounts for selected users"
+
+
+admin.site.register(IllegalUsername)
+admin.site.register(IllegalEmail)
+admin.site.register(PendingUser, PendingUserAdmin)
diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/fixtures/accounts.json
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/accounts/fixtures/accounts.json Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,30 @@
+[
+    {
+        "pk": 1,
+        "model": "accounts.illegalusername",
+        "fields": {
+            "username": "root"
+        }
+    },
+    {
+        "pk": 2,
+        "model": "accounts.illegalusername",
+        "fields": {
+            "username": "sg101"
+        }
+    },
+    {
+        "pk": 3,
+        "model": "accounts.illegalusername",
+        "fields": {
+            "username": "surfguitar101"
+        }
+    },
+    {
+        "pk": 4,
+        "model": "accounts.illegalusername",
+        "fields": {
+            "username": "webmaster"
+        }
+    }
+]
\ No newline at end of file
diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/forms.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/accounts/forms.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,152 @@
+"""forms for the accounts application"""
+
+import logging
+
+from django import forms
+from django.contrib.auth.models import User
+from django.core.urlresolvers import reverse
+from django.template.loader import render_to_string
+from django.contrib.sites.models import Site
+from django.conf import settings
+
+from core.functions import send_mail
+from accounts.models import PendingUser
+from accounts.models import IllegalUsername
+from accounts.models import IllegalEmail
+from antispam.rate_limit import block_ip
+
+
+class RegisterForm(forms.Form):
+    """Form used to register with the website"""
+    username = forms.RegexField(
+        max_length=30,
+        regex=r'^\w+$',
+        error_messages={'invalid': ('Your username must be 30 characters or'
+            ' less and contain only letters, numbers and underscores.')},
+        widget=forms.TextInput(attrs={'class': 'text'}),
+        )
+    email = forms.EmailField(widget=forms.TextInput(attrs={'class': 'text'}))
+    password1 = forms.CharField(label="Password",
+        widget=forms.PasswordInput(attrs={'class': 'text'}))
+    password2 = forms.CharField(label="Password confirmation",
+        widget=forms.PasswordInput(attrs={'class': 'text'}))
+    agree_age = forms.BooleanField(required=True,
+        label='I certify that I am over the age of 13',
+        error_messages={
+            'required': 'Sorry, but you must be over the age of 13 to '
+                'register at our site.',
+            })
+    agree_tos = forms.BooleanField(required=True,
+        label='I agree to the Terms of Service',
+        error_messages={
+            'required': 'You have not agreed to our Terms of Service.',
+            })
+    agree_privacy = forms.BooleanField(required=True,
+        label='I agree to the Privacy Policy',
+        error_messages={
+            'required': 'You have not agreed to our Privacy Policy.',
+            })
+    question1 = forms.CharField(label="What number appears in the site name?",
+        widget=forms.TextInput(attrs={'class': 'text'}))
+    question2 = forms.CharField(label='', required=False,
+        widget=forms.TextInput(attrs={'style': 'display: none;'}))
+
+    def __init__(self, *args, **kwargs):
+        self.ip = kwargs.pop('ip', '?')
+        super(RegisterForm, self).__init__(*args, **kwargs)
+
+    def clean_username(self):
+        username = self.cleaned_data['username']
+        try:
+            User.objects.get(username=username)
+        except User.DoesNotExist:
+            try:
+                PendingUser.objects.get(username=username)
+            except PendingUser.DoesNotExist:
+                try:
IllegalUsername.objects.get(username=username) + except IllegalUsername.DoesNotExist: + return username + self._validation_error("That username is not allowed.", username) + self._validation_error("A pending user with that username already exists.", username) + self._validation_error("A user with that username already exists.", username) + + def clean_email(self): + email = self.cleaned_data['email'] + + if User.objects.filter(email=email).count(): + self._validation_error("A user with that email address already exists.", email) + elif PendingUser.objects.filter(email=email).count(): + self._validation_error("A pending user with that email address already exists.", email) + elif IllegalEmail.objects.filter(email=email).count(): + self._validation_error("That email address is not allowed.", email) + + # email is ok + return email + + def clean_password2(self): + password1 = self.cleaned_data.get("password1", "") + password2 = self.cleaned_data["password2"] + if password1 != password2: + self._validation_error("The two password fields didn't match.") + if len(password1) < 6: + self._validation_error("Please choose a password of 6 characters or more.") + return password2 + + def clean_question1(self): + answer = self.cleaned_data.get('question1') + success = False + if answer: + try: + val = int(answer) + except ValueError: + pass + else: + success = val == 101 + if not success: + self._validation_error("Incorrect answer to our anti-spam question.", answer) + return answer + + def clean_question2(self): + """ + Honeypot field should be empty. + """ + answer = self.cleaned_data.get('question2') + if answer: + block_ip(self.ip) + self._validation_error('Wrong answer #2: %s' % answer) + return answer + + def save(self): + pending_user = PendingUser.objects.create_pending_user(self.cleaned_data['username'], + self.cleaned_data['email'], + self.cleaned_data['password1']) + + # Send the confirmation email + + site = Site.objects.get_current() + admin_email = settings.ADMINS[0][1] + + activation_link = 'http://%s%s' % (site.domain, reverse('accounts.views.register_confirm', + kwargs = {'username' : pending_user.username, 'key' : pending_user.key})) + + msg = render_to_string('accounts/registration_email.txt', + { + 'site_name' : site.name, + 'site_domain' : site.domain, + 'user_email' : pending_user.email, + 'activation_link' : activation_link, + 'username' : pending_user.username, + 'admin_email' : admin_email, + }) + + subject = 'Registration Confirmation for ' + site.name + send_mail(subject, msg, admin_email, [self.cleaned_data['email']]) + logging.info('Accounts/registration conf. email sent to %s for user %s; IP = %s', + self.cleaned_data['email'], pending_user.username, self.ip) + + return pending_user + + def _validation_error(self, msg, param=None): + logging.error('Accounts/registration [%s]: %s (%s)', self.ip, msg, param) + raise forms.ValidationError(msg) diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/management/commands/rate_limit_clear.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/management/commands/rate_limit_clear.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,54 @@ +""" +The rate_limit_clear command is used to clear IP addresses out from our rate +limit protection database. 
+ +""" +from optparse import make_option +import re + +from django.core.management.base import BaseCommand +import redis + +from core.services import get_redis_connection + + +IP_RE = re.compile(r'^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$') + + +class Command(BaseCommand): + help = """Remove IP addresses from the rate limit protection datastore.""" + option_list = list(BaseCommand.option_list) + [ + make_option("--purge", action="store_true", + help="Purge all IP addresses"), + ] + + def handle(self, *args, **kwargs): + try: + con = get_redis_connection() + + # get all rate-limit keys + keys = con.keys('rate-limit-*') + + # if purging, delete them all... + if kwargs['purge']: + if keys: + con.delete(*keys) + return + + # otherwise delete the ones the user asked for + ips = [] + for ip in args: + if IP_RE.match(ip): + key = 'rate-limit-%s' % ip + if key in keys: + ips.append(key) + else: + self.stdout.write('%s not found\n' % ip) + else: + self.stderr.write('invalid IP address %s\n' % ip) + + if ips: + con.delete(*ips) + + except redis.RedisError, e: + self.stderr.write('%s\n' % e) diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,90 @@ +"""Contains models for the accounts application""" + +import datetime +import random +import string +import hashlib +import base64 + +from django.db import models +from django.contrib.auth.models import User +from django.conf import settings + + +class IllegalUsername(models.Model): + """model to represent the list of illegal usernames""" + username = models.CharField(max_length=30, db_index=True) + + def __unicode__(self): + return self.username + + class Meta: + ordering = ('username', ) + + +class IllegalEmail(models.Model): + """model to represent the list of illegal/restricted email addresses""" + email = models.EmailField(db_index=True) + + def __unicode__(self): + return self.email + + class Meta: + ordering = ('email', ) + + +class PendingUserManager(models.Manager): + """user manager for PendingUser model""" + + create_count = 0 + + def create_pending_user(self, username, email, password): + '''creates a new pending user and saves it to the database''' + + temp_user = User() + temp_user.set_password(password) + + now = datetime.datetime.now() + pending_user = self.model(None, + username, + email, + temp_user.password, + now, + self._make_key()) + + pending_user.save() + self.create_count += 1 + return pending_user + + + def purge_expired(self): + expire_time = datetime.datetime.now() - datetime.timedelta(days=1) + expired_pending_users = self.filter(date_joined__lt=expire_time) + expired_pending_users.delete() + + + def _make_key(self): + s = ''.join(random.sample(string.printable, 8)) + delta = datetime.date.today() - datetime.date(1846, 12, 28) + days = base64.urlsafe_b64encode(str(delta * 10)) + key = hashlib.sha1(settings.SECRET_KEY + + unicode(self.create_count) + + unicode(s) + + unicode(days)).hexdigest()[::2] + return key + + +class PendingUser(models.Model): + """model for holding users while they go through the email registration cycle""" + + username = models.CharField(max_length=30, db_index=True) + email = models.EmailField() + password = models.CharField(max_length=128) + date_joined = models.DateTimeField(default=datetime.datetime.now, db_index=True) + key = models.CharField(max_length=20, editable=True) + + objects = PendingUserManager() + + def __unicode__(self): + return self.username + diff -r c525f3e0b5d0 -r 
ee87ea74d46b accounts/static/js/ajax_login.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/static/js/ajax_login.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,68 @@ +$(function() { + var loginError = $('#login-error'); + var userBox = $('#ajax-login-username'); + var passBox = $('#ajax-login-password'); + var loginDialog = $('#login-dialog').dialog({ + autoOpen: false, + height: 375, + width: 380, + modal: true, + buttons: { + "Login": function() { + loginError.text('').hide(); + $.ajax({ + url: '/accounts/login/ajax/', + type: 'POST', + data: { + username: userBox.val(), + password: passBox.val(), + csrfmiddlewaretoken: csrf_token + }, + dataType: 'json', + success: function(data, textStatus) { + if (data.success) { + loginDialog.dialog("close"); + if (window.location.pathname == "/accounts/logout/") { + window.location.replace("/"); + } + else { + $('#header-nav').html(data.navbar_html); + } + } + else { + loginError.text(data.error).show(); + userBox.val(''); + passBox.val(''); + userBox.focus(); + } + }, + error: function (xhr, textStatus, ex) { + if (xhr.status == 403) { + loginDialog.dialog("close"); + alert("Oops, we are detecting some strange behavior and are blocking this action. If you feel this is an error, please feel free to contact us. Thank you."); + window.location.href = "/"; + } + else { + loginError.text('Oops, an error occurred. If this problem persists, please contact us.').show(); + } + } + }); + }, + "Cancel": function() { + loginDialog.dialog("close"); + } + }, + focus: function() { + $(':input', this).keyup(function(event) { + if (event.keyCode == 13) { + $('.ui-dialog-buttonpane button:first').click(); + } + }); + } + }); + $('#login-link').click(function() { + loginError.text('').hide(); + loginDialog.dialog("open"); + return false; + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/stats.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/stats.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,97 @@ +""" +This module performs user account related statistics. + +""" +import logging + +from django.db.models.signals import post_save +from django.contrib.auth.models import User + +from core.services import get_redis_connection + + +# Redis key names +USER_COUNT_KEY = "accounts:user_count" +NEW_USERS_KEY = "accounts:new_users" + + +logger = logging.getLogger(__name__) + + +def on_user_save(sender, **kwargs): + """ + This function is our signal handler for when a User object is saved. + + """ + from accounts.tasks import user_stats_task + + if kwargs['created']: + user = kwargs['instance'] + + # kick off a task to update user stats + + user_stats_task.delay(user.id) + + +def update_user_stats(user_id): + """ + This function is given a new user id and is responsible for updating various + user account statistics. + + """ + try: + user = User.objects.get(pk=user_id) + except User.DoesNotExist: + logger.warning("update_user_stats: user id %d doesn't exist", user_id) + return + + redis = get_redis_connection() + + # update the count of registered users + + count = redis.incr(USER_COUNT_KEY) + if count == 1: + # it is likely redis got wiped out; update it now + count = User.objects.all().count() + redis.set(USER_COUNT_KEY, count) + + # update the list of new users + + pipeline = redis.pipeline() + pipeline.lpush(NEW_USERS_KEY, user.username) + pipeline.ltrim(NEW_USERS_KEY, 0, 9) + pipeline.execute() + + +def get_user_count(redis=None): + """ + This function returns the current count of users. 
+ + """ + if redis is None: + redis = get_redis_connection() + return redis.get(USER_COUNT_KEY) + + +def get_new_users(redis=None): + """ + This function returns a list of new usernames. + + """ + if redis is None: + redis = get_redis_connection() + return redis.lrange(NEW_USERS_KEY, 0, -1) + + +def get_user_stats(redis=None): + """ + This function returns a tuple of the user stats. Element 0 is the user count + and element 1 is the list of new users. + + """ + if redis is None: + redis = get_redis_connection() + return get_user_count(redis), get_new_users(redis) + + +post_save.connect(on_user_save, sender=User, dispatch_uid='accounts.stats') diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/tasks.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/tasks.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,16 @@ +""" +Celery tasks for the accounts application. + +""" +from celery.task import task + +from accounts.stats import update_user_stats + + +@task +def user_stats_task(user_id): + """ + Run the update_user_stats() function on a new task. + + """ + update_user_stats(user_id) diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/templatetags/accounts_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/templatetags/accounts_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,20 @@ +""" +Template tags for the accounts applications. + +""" +from django import template + +from accounts.stats import get_user_stats + + +register = template.Library() + + +@register.inclusion_tag('accounts/user_stats_tag.html') +def user_stats(): + """ + This tag renders the total number of site users and a list of new users. + + """ + num_users, new_users = get_user_stats() + return {'num_users': num_users, 'new_users': new_users} diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/tests/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/tests/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1 @@ +from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/tests/view_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/tests/view_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,254 @@ +""" +View tests for the accounts application. + +""" +import datetime + +from django.test import TestCase +from django.core.urlresolvers import reverse +from django.contrib.auth.models import User +from django.contrib.auth.hashers import check_password + +from antispam.rate_limit import unblock_ip +from accounts.models import PendingUser +from accounts.models import IllegalUsername +from accounts.models import IllegalEmail + + +class RegistrationTest(TestCase): + + def setUp(self): + u = User.objects.create_user('existing_user', 'existing_user@example.com', 'pw') + u.save() + + # a 2nd user has the same email as another + u = User.objects.create_user('existing_user2', 'existing_user@example.com', 'pw') + u.save() + + PendingUser.objects.create(username='pending_user', + email='pending_user@example.com', + password='pw', + date_joined=datetime.datetime.now(), + key='key') + + IllegalUsername.objects.create(username='illegalusername') + IllegalEmail.objects.create(email='illegal@example.com') + + def tearDown(self): + unblock_ip('127.0.0.1') + + def test_get_view(self): + """ + Test a simple get of the registration view + + """ + response = self.client.get(reverse('accounts-register')) + self.assertEqual(response.status_code, 200) + + def test_existing_user(self): + """ + Ensure we can't register with an existing username. 
+ + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'existing_user', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, 'A user with that username already exists') + + def test_pending_user(self): + """ + Ensure we can't register with a pending username. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'pending_user', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, 'A pending user with that username already exists') + + def test_illegal_username(self): + """ + Ensure we can't register with a banned username. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'illegalusername', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, 'That username is not allowed') + + def test_duplicate_existing_email(self): + """ + Ensure we can't register with a duplicate email address. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'existing_user@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, 'A user with that email address already exists') + + def test_duplicate_pending_email(self): + """ + Ensure we can't register with a duplicate email address. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'pending_user@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, 'A pending user with that email address already exists') + + def test_illegal_email(self): + """ + Ensure we can't register with a banned email address. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'illegal@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, 'That email address is not allowed') + + def test_password_match(self): + """ + Ensure the passwords match. 
+ + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password_doesnt match', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, "The two password fields didn't match") + + def test_question1(self): + """ + Ensure our anti-spam question is answered. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password_doesnt match', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': 'huh', + 'question2': '', + }) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, "Incorrect answer to our anti-spam question") + + def test_question2(self): + """ + Ensure our honeypot question check works. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password_doesnt match', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': 'non blank', + }) + + self.assertEqual(response.status_code, 403) + + def test_success(self): + """ + Ensure we can successfully register. + + """ + response = self.client.post(reverse('accounts-register'), { + 'username': 'a_new_user', + 'email': 'test@example.com', + 'password1': 'my_password', + 'password2': 'my_password', + 'agree_age': 'on', + 'agree_tos': 'on', + 'agree_privacy': 'on', + 'question1': '101', + 'question2': '', + }) + + self.assertEqual(response.status_code, 302) + + try: + pending = PendingUser.objects.get(username='a_new_user') + except PendingUser.DoesNotExist: + self.fail("PendingUser was not created") + + self.assertEqual(pending.email, 'test@example.com') + self.assertTrue(datetime.datetime.now() - pending.date_joined < + datetime.timedelta(minutes=1)) + self.assertTrue(check_password('my_password', pending.password)) diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,47 @@ +"""urls for the accounts application""" +from django.conf.urls import patterns, url +from django.conf import settings + +urlpatterns = patterns('accounts.views', + url(r'^login/ajax/$', 'login_ajax', name='accounts-login_ajax'), + url(r'^register/$', 'register', name='accounts-register'), + (r'^register/thanks/$', 'register_thanks'), + (r'^register/confirm/(?P[\w.@+-]{1,30})/(?P[a-zA-Z0-9]{20})/$', 'register_confirm'), +) + +urlpatterns += patterns('', + url(r'^login/$', + 'django.contrib.auth.views.login', + kwargs={'template_name': 'accounts/login.html'}, + name='accounts-login'), + url(r'^logout/$', + 'django.contrib.auth.views.logout', + kwargs={'template_name': 'accounts/logout.html'}, + name='accounts-logout'), + (r'^password/$', + 'django.contrib.auth.views.password_change', + {'template_name': 'accounts/password_change.html', + 'post_change_redirect': settings.LOGIN_REDIRECT_URL}), + url(r'^password/reset/$', + 'django.contrib.auth.views.password_reset', + kwargs={'template_name': 'accounts/password_reset.html', + 'email_template_name': 'accounts/password_reset_email.txt', + 'post_reset_redirect': '/accounts/password/reset/sent/'}, + 
name='accounts-password_reset'), + url(r'^password/reset/sent/$', + 'django.contrib.auth.views.password_reset_done', + kwargs={'template_name': 'accounts/password_reset_sent.html'}, + name='accounts-password_reset_sent'), + url(r'^password/reset/confirm/(?P[0-9a-z]+)/(?P[0-9a-z]+-\w+)/$', + 'django.contrib.auth.views.password_reset_confirm', + kwargs={ + 'template_name': 'accounts/password_reset_confirm.html', + 'post_reset_redirect': '/accounts/password/reset/success/', + }, + name='accounts-password_reset_confirm'), + url(r'^password/reset/success/$', + 'django.contrib.auth.views.password_reset_complete', + kwargs={'template_name': 'accounts/password_reset_complete.html'}, + name='accounts-password_reset_success'), +) + diff -r c525f3e0b5d0 -r ee87ea74d46b accounts/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/accounts/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,117 @@ +""" +Views for the accounts application. + +""" +import datetime +import logging + +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.template.loader import render_to_string +from django.contrib.auth.models import User +from django.http import HttpResponse, HttpResponseRedirect +from django.core.urlresolvers import reverse +from django.conf import settings +from django.contrib.auth.forms import AuthenticationForm +from django.contrib.auth import login +from django.utils import simplejson + +from accounts.models import PendingUser +from accounts.forms import RegisterForm +from accounts import create_new_user +from antispam.decorators import rate_limit + + +####################################################################### + +@rate_limit(count=10, interval=datetime.timedelta(minutes=1)) +def register(request): + if request.user.is_authenticated(): + return HttpResponseRedirect(settings.LOGIN_REDIRECT_URL) + + if request.method == 'POST': + form = RegisterForm(request.POST, ip=request.META.get('REMOTE_ADDR', '?')) + if form.is_valid(): + form.save() + return HttpResponseRedirect(reverse('accounts.views.register_thanks')) + else: + form = RegisterForm() + + return render_to_response('accounts/register.html', { + 'form': form, + }, + context_instance = RequestContext(request)) + +####################################################################### + +def register_thanks(request): + if request.user.is_authenticated(): + return HttpResponseRedirect(settings.LOGIN_REDIRECT_URL) + + return render_to_response('accounts/register_thanks.html', + context_instance = RequestContext(request)) + +####################################################################### + +def register_confirm(request, username, key): + if request.user.is_authenticated(): + return HttpResponseRedirect(settings.LOGIN_REDIRECT_URL) + + # purge expired users + + PendingUser.objects.purge_expired() + + ip = request.META.get('REMOTE_ADDR', '?') + try: + pending_user = PendingUser.objects.get(username = username) + except PendingUser.DoesNotExist: + logging.error('Accounts register_confirm [%s]: user does not exist: %s', ip, username) + return render_to_response('accounts/register_failure.html', { + 'username': username, + }, + context_instance = RequestContext(request)) + + if pending_user.key != key: + logging.error('Accounts register_confirm [%s]: key error: %s', ip, username) + return render_to_response('accounts/register_failure.html', { + 'username': username, + }, + context_instance = RequestContext(request)) + + create_new_user(pending_user, ip) + + return 
render_to_response('accounts/register_success.html', { + 'username': username, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@rate_limit(count=10, interval=datetime.timedelta(minutes=1), + lockout=datetime.timedelta(minutes=2)) +def login_ajax(request): + """ + This view function handles a login via AJAX. + + """ + if not request.is_ajax(): + return HttpResponseRedirect(reverse('accounts-login')) + + response = { + 'success': False, + 'error': '', + 'navbar_html': '' + } + + if request.method == "POST": + form = AuthenticationForm(data=request.POST) + if form.is_valid(): + login(request, form.get_user()) + response['success'] = True + response['navbar_html'] = render_to_string('navbar.html', + {'user': request.user}, RequestContext(request)) + else: + response['error'] = 'Invalid username or password' + + return HttpResponse(simplejson.dumps(response), + content_type='application/json') diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,13 @@ +import datetime + +from django.contrib.auth import views as auth_views + +from antispam.decorators import rate_limit + +SPAM_PHRASE_KEY = "antispam.spam_phrases" +BUSTED_MESSAGE = ("Your post has tripped our spam filter. Your account has " + "been suspended pending a review of your post. If this was a mistake " + "then we apologize; your account will be restored shortly.") + +# Install rate limiting on auth login +auth_views.login = rate_limit(lockout=datetime.timedelta(minutes=2))(auth_views.login) diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,12 @@ +"""Admin definitions for the antispam application.""" + +from django.contrib import admin + +from antispam.models import SpamPhrase + + +class SpamPhraseAdmin(admin.ModelAdmin): + search_fields = ('phrase', ) + + +admin.site.register(SpamPhrase, SpamPhraseAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/decorators.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/decorators.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,59 @@ +""" +This module contains decorators for the antispam application. + +""" +from datetime import timedelta +from functools import wraps + +from django.shortcuts import render +from django.utils import simplejson + +from antispam.rate_limit import RateLimiter, RateLimiterUnavailable + + +def rate_limit(count=10, interval=timedelta(minutes=1), + lockout=timedelta(hours=8)): + + def decorator(fn): + + @wraps(fn) + def wrapped(request, *args, **kwargs): + + ip = request.META.get('REMOTE_ADDR') + try: + rate_limiter = RateLimiter(ip, count, interval, lockout) + if rate_limiter.is_blocked(): + return render(request, 'antispam/blocked.html', status=403) + + except RateLimiterUnavailable: + # just call the function and return the result + return fn(request, *args, **kwargs) + + response = fn(request, *args, **kwargs) + + if request.method == 'POST': + + # Figure out if the view succeeded; if it is a non-ajax view, + # then success means a redirect is about to occur. If it is + # an ajax view, we have to decode the json response. 
+ success = False + if not request.is_ajax(): + success = (response and response.has_header('location') and + response.status_code == 302) + elif response: + json_resp = simplejson.loads(response.content) + success = json_resp['success'] + + if not success: + try: + blocked = rate_limiter.incr() + except RateLimiterUnavailable: + blocked = False + + if blocked: + return render(request, 'antispam/blocked.html', status=403) + + return response + + return wrapped + return decorator diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,23 @@ +"""Models for the antispam application.""" +from django.db import models +from django.core.cache import cache + +from antispam import SPAM_PHRASE_KEY + + +class SpamPhrase(models.Model): + """A SpamPhrase is a string that is checked for in user input. User input + containing a SpamPhrase should be blocked and flagged. + """ + phrase = models.CharField(max_length=64) + + class Meta: + ordering = ('phrase', ) + + def __unicode__(self): + return self.phrase + + def save(self, *args, **kwargs): + cache.delete(SPAM_PHRASE_KEY) + self.phrase = self.phrase.lower() + super(SpamPhrase, self).save(*args, **kwargs) diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/rate_limit.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/rate_limit.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,152 @@ +""" +This module contains the rate limiting functionality. + +""" +import datetime +import logging + +import redis + +from core.services import get_redis_connection + + +logger = logging.getLogger(__name__) + + +# This exception is thrown upon any Redis error. This insulates client code from +# knowing that we are using Redis and will allow us to use something else in the +# future. +class RateLimiterUnavailable(Exception): + pass + + +def _make_key(ip): + """ + Creates and returns a key string from a given IP address. + + """ + return 'rate-limit-' + ip + + +def _get_connection(): + """ + Create and return a Redis connection. Returns None on failure. + """ + try: + conn = get_redis_connection() + except redis.RedisError, e: + logger.error("rate limit: %s" % e) + raise RateLimiterUnavailable + + return conn + + +def _to_seconds(interval): + """ + Converts the timedelta interval object into a count of seconds. + + """ + return interval.days * 24 * 3600 + interval.seconds + + +def block_ip(ip, count=1000000, interval=datetime.timedelta(weeks=2)): + """ + This function jams the rate limit record for the given IP so that the IP is + blocked for the given interval. If the record doesn't exist, it is created. + This is useful for manually blocking an IP after detecting suspicious + behavior. + This function may throw RateLimiterUnavailable. + + """ + key = _make_key(ip) + conn = _get_connection() + + try: + conn.setex(key, time=_to_seconds(interval), value=count) + except redis.RedisError, e: + logger.error("rate limit (block_ip): %s" % e) + raise RateLimiterUnavailable + + logger.info("Rate limiter blocked IP %s; %d / %s", ip, count, interval) + + +def unblock_ip(ip): + """ + This function removes the block for the given IP address. 
+ + """ + key = _make_key(ip) + conn = _get_connection() + try: + conn.delete(key) + except redis.RedisError, e: + logger.error("rate limit (unblock_ip): %s" % e) + raise RateLimiterUnavailable + + logger.info("Rate limiter unblocked IP %s", ip) + + +class RateLimiter(object): + """ + This class encapsulates the rate limiting logic for a given IP address. + + """ + def __init__(self, ip, set_point, interval, lockout): + self.ip = ip + self.set_point = set_point + self.interval = interval + self.lockout = lockout + self.key = _make_key(ip) + self.conn = _get_connection() + + def is_blocked(self): + """ + Return True if the IP is blocked, and false otherwise. + + """ + try: + val = self.conn.get(self.key) + except redis.RedisError, e: + logger.error("RateLimiter (is_blocked): %s" % e) + raise RateLimiterUnavailable + + try: + val = int(val) if val else 0 + except ValueError: + return False + + blocked = val >= self.set_point + if blocked: + logger.info("Rate limiter blocking %s", self.ip) + + return blocked + + def incr(self): + """ + One is added to a counter associated with the IP address. If the + counter exceeds set_point per interval, True is returned, and False + otherwise. If the set_point is exceeded for the first time, the counter + associated with the IP is set to expire according to the lockout + parameter. + + """ + try: + val = self.conn.incr(self.key) + + # Set expire time, if necessary. + # If this is the first time, set it according to interval. + # If the set_point has just been exceeded, set it according to lockout. + if val == 1: + self.conn.expire(self.key, _to_seconds(self.interval)) + elif val == self.set_point: + self.conn.expire(self.key, _to_seconds(self.lockout)) + + tripped = val >= self.set_point + + if tripped: + logger.info("Rate limiter tripped for %s; counter = %d", self.ip, val) + return tripped + + except redis.RedisError, e: + logger.error("RateLimiter (incr): %s" % e) + raise RateLimiterUnavailable diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/tests/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/tests/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,2 @@ +from rate_limit_tests import * +from utils_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/tests/rate_limit_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/tests/rate_limit_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,77 @@ +""" +Tests for the rate limiting function in the antispam application. 
+ +""" +from django.test import TestCase +from django.core.urlresolvers import reverse + +from antispam.rate_limit import _make_key +from core.services import get_redis_connection + + +class RateLimitTestCase(TestCase): + KEY = _make_key('127.0.0.1') + + def setUp(self): + self.conn = get_redis_connection() + self.conn.delete(self.KEY) + + def tearDown(self): + self.conn.delete(self.KEY) + + def testRegistrationLockout(self): + + for i in range(1, 11): + response = self.client.post( + reverse('accounts-register'), + {}, + follow=True) + + if i < 10: + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, 'accounts/register.html') + elif i >= 10: + self.assertEqual(response.status_code, 403) + self.assertTemplateUsed(response, 'antispam/blocked.html') + + def testLoginLockout(self): + + for i in range(1, 11): + response = self.client.post( + reverse('accounts-login'), + {}, + follow=True) + + if i < 10: + self.assertEqual(response.status_code, 200) + self.assertTemplateUsed(response, 'accounts/login.html') + elif i >= 10: + self.assertEqual(response.status_code, 403) + self.assertTemplateUsed(response, 'antispam/blocked.html') + + def testHoneypotLockout(self): + + response = self.client.post( + reverse('accounts-register'), { + 'username': u'test_user', + 'email': u'test_user@example.com', + 'password1': u'password', + 'password2': u'password', + 'agree_age': u'on', + 'agree_tos': u'on', + 'agree_privacy': u'on', + 'question1': u'101', + 'question2': u'DsjkdE$', + }, + follow=True) + + val = self.conn.get(self.KEY) + self.assertEqual(val, '1000001') + + response = self.client.post( + reverse('accounts-login'), + {}, + follow=True) + + self.assertEqual(response.status_code, 403) + self.assertTemplateUsed(response, 'antispam/blocked.html') diff -r c525f3e0b5d0 -r ee87ea74d46b antispam/tests/utils_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/antispam/tests/utils_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,37 @@ +""" +Tests for the antispam application. +""" +from django.test import TestCase +from django.core.cache import cache + +from antispam import SPAM_PHRASE_KEY +from antispam.models import SpamPhrase +from antispam.utils import contains_spam + + +class AntispamCase(TestCase): + + def test_no_phrases(self): + """ + Tests that an empty spam phrase table works. + """ + cache.delete(SPAM_PHRASE_KEY) + self.assertFalse(contains_spam("Here is some random text.")) + + def test_phrases(self): + """ + Simple test of some phrases. + """ + SpamPhrase.objects.create(phrase="grytner") + SpamPhrase.objects.create(phrase="allday.ru") + SpamPhrase.objects.create(phrase="stefa.pl") + + self.assert_(contains_spam("grytner")) + self.assert_(contains_spam("11grytner")) + self.assert_(contains_spam("11grytner>")) + self.assert_(contains_spam("1djkl jsd stefa.pl")) + self.assert_(contains_spam("1djkl jsd ' % (obj.image.url, obj.description) + image_tag.allow_tags = True + +admin.site.register(Campaign, CampaignAdmin) +admin.site.register(Banner, BannerAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b banners/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/banners/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,60 @@ +""" +Models for the banners application. + +""" +import datetime + +from django.db import models + + +class Campaign(models.Model): + """ + A model to represent an ad or banner campaign. 
+ + """ + name = models.CharField(max_length=128) + slug = models.SlugField() + creation_date = models.DateTimeField(blank=True) + + def __unicode__(self): + return self.name + + class Meta: + ordering = ['name'] + + def save(self, *args, **kwargs): + if not self.pk and not self.creation_date: + self.creation_date = datetime.datetime.now() + + super(Campaign, self).save(*args, **kwargs) + + +def banner_upload_to(instance, filename): + """ + An "upload_to" function for the Banner model. + + """ + return "banners/%s/%s" % (instance.campaign.slug, filename) + + +class Banner(models.Model): + """ + A model to represent a banner. + + """ + campaign = models.ForeignKey(Campaign) + image = models.ImageField(upload_to=banner_upload_to) + description = models.CharField(max_length=128) + creation_date = models.DateTimeField(blank=True) + + def __unicode__(self): + return self.description + + class Meta: + ordering = ['-creation_date'] + + def save(self, *args, **kwargs): + if not self.pk and not self.creation_date: + self.creation_date = datetime.datetime.now() + + super(Banner, self).save(*args, **kwargs) diff -r c525f3e0b5d0 -r ee87ea74d46b banners/templatetags/banner_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/banners/templatetags/banner_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,60 @@ +""" +Template tags for the banners application. + +""" +import logging + +from django import template +import redis + +from core.services import get_redis_connection +from banners.models import Banner + + +register = template.Library() +logger = logging.getLogger(__name__) + +BANNER_URL_KEY = 'banners:url:%s' + + +@register.simple_tag +def banner_url(slug): + """ + Returns the URL for the next banner in the campaign whose slug is 'slug'. + + For each campaign, a list of banner URLs are kept in Redis. Each time this + tag is called, the front banner is popped off the list. When the list is + empty, we refresh the list from the database. In this way the banners for a + campaign are cycled through. + """ + key = BANNER_URL_KEY % slug + + try: + conn = get_redis_connection() + url = conn.lpop(key) + except redis.RedisError, e: + logger.error("banner_url: '%s' on lpop", e) + return u'' + + if url: + return url + + # list not found or empty, rebuild it from the database + + qs = Banner.objects.filter(campaign__slug=slug) + urls = [banner.image.url for banner in qs] + if not urls: + logger.warning("banner_url: no banners for campaign '%s'", slug) + return u'' + + url = urls[0] + urls = urls[1:] + + if urls: + try: + conn.rpush(key, *urls) + except redis.RedisError, e: + logger.error("banner_url: '%s' on rpush", e) + pass + + return url diff -r c525f3e0b5d0 -r ee87ea74d46b bio/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1 @@ +import signals diff -r c525f3e0b5d0 -r ee87ea74d46b bio/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,151 @@ +""" +This file contains the admin definitions for the bio application. 
+""" +import datetime + +from django.contrib import admin + +import django.contrib.auth.models +import django.contrib.auth.admin + +import bio.models +import bio.badges +from antispam.utils import deactivate_spammer + + +class BadgeOwnerInline(admin.TabularInline): + model = bio.models.BadgeOwnership + extra = 1 + + +class UserProfileAdmin(admin.ModelAdmin): + search_fields = ('user__username', 'user__first_name', 'user__last_name', + 'user__email') + exclude = ('profile_html', 'signature_html') + list_display = ('__unicode__', 'user_is_active', 'get_status_display', 'status_date') + readonly_fields = ('status', 'status_date', 'update_date') + list_filter = ('status', ) + date_hierarchy = 'status_date' + inlines = (BadgeOwnerInline, ) + actions = ( + 'mark_active', + 'mark_resigned', + 'mark_removed', + 'mark_suspended', + 'mark_spammer', + 'mark_stranger', + ) + + def get_status_display(self, obj): + return obj.get_status_display() + get_status_display.short_description = 'Status' + + def mark_user_status(self, request, qs, status): + """ + Common code for the admin actions. Updates the status field in the + profiles to 'status'. Updates the status_date. Sets the is_active + field to True if the status is STA_ACTIVE and False otherwise. + """ + now = datetime.datetime.now() + for profile in qs: + profile.user.is_active = (status == bio.models.STA_ACTIVE or + status == bio.models.STA_STRANGER) + profile.user.save() + profile.status = status + profile.status_date = now + profile.save(content_update=False) + + count = len(qs) + msg = "1 user" if count == 1 else "%d users" % count + self.message_user(request, "%s successfully marked as %s." % (msg, + bio.models.USER_STATUS_CHOICES[status][1])) + + def mark_active(self, request, qs): + """ + Marks users as active. Updates their profile status to STA_ACTIVE. + """ + self.mark_user_status(request, qs, bio.models.STA_ACTIVE) + mark_active.short_description = "Mark selected users as active" + + def mark_resigned(self, request, qs): + """ + Marks users as inactive. Updates their profile status to STA_RESIGNED. + """ + self.mark_user_status(request, qs, bio.models.STA_RESIGNED) + mark_resigned.short_description = "Mark selected users as resigned" + + def mark_removed(self, request, qs): + """ + Marks users as inactive. Updates their profile status to STA_REMOVED. + """ + self.mark_user_status(request, qs, bio.models.STA_REMOVED) + mark_removed.short_description = "Mark selected users as removed" + + def mark_suspended(self, request, qs): + """ + Marks users as inactive. Updates their profile status to STA_SUSPENDED. + """ + self.mark_user_status(request, qs, bio.models.STA_SUSPENDED) + mark_suspended.short_description = "Mark selected users as suspended" + + def mark_spammer(self, request, qs): + """ + Calls deactivate_spammer() on each user in the profile queryset. + + """ + count = qs.count() + for profile in qs: + deactivate_spammer(profile.user) + + self.message_user(request, + "%s profile(s) successfully marked as spammers." % count) + + mark_spammer.short_description = "Mark selected users as spammers" + + def mark_stranger(self, request, qs): + """ + Marks users as strangers. Updates their profile status to STA_STRANGER. 
+ """ + self.mark_user_status(request, qs, bio.models.STA_STRANGER) + mark_stranger.short_description = "Mark selected users as strangers" + + +class UserProfileFlagAdmin(admin.ModelAdmin): + list_display = ['__unicode__', 'flag_date', 'get_profile_url'] + actions = ['accept_flags'] + raw_id_fields = ['user', 'profile'] + + def accept_flags(self, request, qs): + """ + This action awards a security pin to the user that reported the + profile, deletes the flags, then deactivates the spammers. + """ + count = qs.count() + for flag in qs: + deactivate_spammer(flag.profile.user) + bio.badges.award_badge(bio.badges.SECURITY_PIN, flag.user) + flag.delete() + + self.message_user(request, + "%s profile(s) successfully marked as spammers." % count) + + accept_flags.short_description = "Mark selected profiles as spammers" + + +class BadgeAdmin(admin.ModelAdmin): + list_display = ('name', 'html', 'order', 'numeric_id', 'description') + list_editable = ('order', 'numeric_id') + + +# We like the User admin but would like a date hierarcy on date_joined. +class UserAdmin(django.contrib.auth.admin.UserAdmin): + date_hierarchy = 'date_joined' + + +admin.site.register(bio.models.UserProfile, UserProfileAdmin) +admin.site.register(bio.models.UserProfileFlag, UserProfileFlagAdmin) +admin.site.register(bio.models.Badge, BadgeAdmin) + +# Unregister existing ModelAdmin for User, then register ours +admin.site.unregister(django.contrib.auth.models.User) +admin.site.register(django.contrib.auth.models.User, UserAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b bio/badges.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/badges.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,37 @@ +"""This module contains user profile badge-related functionality.""" +import logging + +from bio.models import Badge +from bio.models import BadgeOwnership + + +# Numeric ID's for badges that are awarded for user actions: +(CONTRIBUTOR_PIN, CALENDAR_PIN, NEWS_PIN, LINK_PIN, DOWNLOAD_PIN, + SECURITY_PIN, POTD_PIN) = range(7) + + +def award_badge(badge_id, user): + """This function awards the badge specified by badge_id + to the given user. If the user already has the badge, + the badge count is incremented by one. + """ + try: + badge = Badge.objects.get(numeric_id=badge_id) + except Badge.DoesNotExist: + logging.error("Can't award badge with numeric_id = %d", badge_id) + return + + profile = user.get_profile() + + # Does the user already have badges of this type? + try: + bo = BadgeOwnership.objects.get(profile=profile, badge=badge) + except BadgeOwnership.DoesNotExist: + # No badge of this type, yet + bo = BadgeOwnership(profile=profile, badge=badge, count=1) + else: + # Already have this badge + bo.count += 1 + bo.save() + + logging.info('Awarded %s with the badge: %s', user.username, badge.name) diff -r c525f3e0b5d0 -r ee87ea74d46b bio/fixtures/badges.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/fixtures/badges.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,112 @@ +[ + { + "pk": 7, + "model": "bio.badge", + "fields": { + "numeric_id": 2, + "image": "badges/newspaper.png", + "order": 0, + "name": "News Pin", + "description": "For submitting a news article to the site news." + } + }, + { + "pk": 4, + "model": "bio.badge", + "fields": { + "numeric_id": 1, + "image": "badges/date.png", + "order": 1, + "name": "Calendar Pin", + "description": "For adding an event to the site calendar." 
+ } + }, + { + "pk": 9, + "model": "bio.badge", + "fields": { + "numeric_id": 3, + "image": "badges/world_link.png", + "order": 2, + "name": "Link Pin", + "description": "For submitting a link to the site web links database." + } + }, + { + "pk": 5, + "model": "bio.badge", + "fields": { + "numeric_id": 4, + "image": "badges/disk.png", + "order": 3, + "name": "Download Pin", + "description": "For uploading a file to the site downloads library." + } + }, + { + "pk": 6, + "model": "bio.badge", + "fields": { + "numeric_id": 0, + "image": "badges/money_dollar.png", + "order": 4, + "name": "Contributor Pin", + "description": "For making a donation to the site." + } + }, + { + "pk": 8, + "model": "bio.badge", + "fields": { + "numeric_id": 5, + "image": "badges/shield.png", + "order": 5, + "name": "Security Pin", + "description": "For reporting spam or abuse." + } + }, + { + "pk": 10, + "model": "bio.badge", + "fields": { + "numeric_id": 6, + "image": "badges/camera.png", + "order": 6, + "name": "POTD Pin", + "description": "For submitting a photo of the day." + } + }, + { + "pk": 1, + "model": "bio.badge", + "fields": { + "numeric_id": 100, + "image": "badges/award_star_bronze_1.png", + "order": 7, + "name": "Bronze Star", + "description": "For service to the site and community." + } + }, + { + "pk": 2, + "model": "bio.badge", + "fields": { + "numeric_id": 101, + "image": "badges/award_star_silver_2.png", + "order": 8, + "name": "Silver Star", + "description": "For distinguished and dedicated service to the site and community." + } + }, + { + "pk": 3, + "model": "bio.badge", + "fields": { + "numeric_id": 102, + "image": "badges/award_star_gold_3.png", + "order": 9, + "name": "Gold Star", + "description": "For exceptional and meritorious service to the site and community." + } + } +] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b bio/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,124 @@ +""" +This file contains the forms used by the bio application. 
+""" +try: + from cStringIO import StringIO +except: + from StringIO import StringIO + +from django import forms +from django.conf import settings +from django.core.files.base import ContentFile +from django.contrib.auth.models import User + +from bio.models import UserProfile +from core.widgets import AutoCompleteUserInput +from core.image import parse_image, downscale_image_square + + +class EditUserForm(forms.ModelForm): + """Form for editing the fields of the User model.""" + email = forms.EmailField(label='Email', required=True) + class Meta: + model = User + fields = ('first_name', 'last_name', 'email') + + +class EditUserProfileForm(forms.ModelForm): + """Form for editing the fields of the UserProfile model.""" + location = forms.CharField(required=False, widget=forms.TextInput(attrs={'size' : 64 })) + occupation = forms.CharField(required=False, widget=forms.TextInput(attrs={'size' : 64 })) + interests = forms.CharField(required=False, widget=forms.TextInput(attrs={'size' : 64 })) + time_zone = forms.CharField(required=False, widget=forms.HiddenInput()) + use_24_time = forms.BooleanField(label='Show times in 24-hour mode', required=False) + profile_text = forms.CharField(required=False, + widget=forms.Textarea(attrs={'class': 'markItUp'})) + signature = forms.CharField(required=False, + widget=forms.Textarea(attrs={'class': 'markItUp'})) + auto_favorite = forms.BooleanField( + label='Automatically favorite every forum topic I create or reply to', required=False) + auto_subscribe = forms.BooleanField( + label='Automatically subscribe to every forum topic I create or reply to', required=False) + + class Meta: + model = UserProfile + fields = ('location', 'birthday', 'occupation', 'interests', + 'profile_text', 'hide_email', 'signature', 'time_zone', + 'use_24_time', 'auto_favorite', 'auto_subscribe') + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']) + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + ['js/bio.js', 'js/timezone.js']) + + +class UploadAvatarForm(forms.Form): + """Form used to change a user's avatar""" + avatar_file = forms.ImageField(required=False) + image = None + + def clean_avatar_file(self): + f = self.cleaned_data['avatar_file'] + if f is not None: + if f.size > settings.MAX_AVATAR_SIZE_BYTES: + raise forms.ValidationError("Please upload a file smaller than " + "%s bytes." % settings.MAX_AVATAR_SIZE_BYTES) + try: + self.image = parse_image(f) + except IOError: + raise forms.ValidationError("Please upload a valid image. " + "The file you uploaded was either not an image or a " + "corrupted image.") + self.file_type = self.image.format + return f + + def save(self): + """ + Perform any down-scaling needed on the new file, then return a tuple of + (filename, file object). Note that the file object returned may not + have a name; use the returned filename instead. + + """ + if not self.cleaned_data['avatar_file']: + return None, None + + name = self.cleaned_data['avatar_file'].name + dim = settings.MAX_AVATAR_SIZE_PIXELS + max_size = (dim, dim) + if self.image and self.image.size > max_size: + self.image = downscale_image_square(self.image, dim) + + # We need to return a Django File now. To get that from here, + # write the image data info a StringIO and then construct a + # Django ContentFile from that. The ContentFile has no name, + # that is why we return one ourselves explicitly. 
+ s = StringIO() + self.image.save(s, self.file_type) + return name, ContentFile(s.getvalue()) + + return name, self.cleaned_data['avatar_file'] + + +class SearchUsersForm(forms.Form): + """ + A form to search for users. + """ + username = forms.CharField(max_length=30, widget=AutoCompleteUserInput()) + + class Media: + css = { + 'all': settings.GPP_THIRD_PARTY_CSS['jquery-ui'] + } + js = settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + def clean_username(self): + username = self.cleaned_data['username'].strip() + try: + User.objects.get(username=username, is_active=True) + except User.DoesNotExist: + raise forms.ValidationError("That username does not exist.") + return username diff -r c525f3e0b5d0 -r ee87ea74d46b bio/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,216 @@ +""" +Contains models for the bio application. +I would have picked profile for this application, but that is already taken, apparently. +""" +import datetime +import os.path + +from django.db import models +from django.contrib.auth.models import User +from django.conf import settings +from django.core.cache import cache +from django.template.loader import render_to_string + +from core.markup import SiteMarkup + + +# These are the secondary user status enumeration values. +(STA_ACTIVE, # User is a full member in good standing. + STA_RESIGNED, # User has voluntarily asked to be removed. + STA_REMOVED, # User was removed for bad behavior. + STA_SUSPENDED, # User is temporarily suspended; e.g. a stranger tripped + # the spam filter. + STA_SPAMMER, # User has been removed for spamming. + STA_STRANGER, # New member, isn't fully trusted yet. Their comments and + # forum posts are scanned for spam. They can have their + # accounts deactivated by moderators for spamming. 
+ ) = range(6) + +USER_STATUS_CHOICES = ( + (STA_ACTIVE, "Active"), + (STA_RESIGNED, "Resigned"), + (STA_REMOVED, "Removed"), + (STA_SUSPENDED, "Suspended"), + (STA_SPAMMER, "Spammer"), + (STA_STRANGER, "Stranger") +) + + +class Badge(models.Model): + """This model represents badges that users can earn.""" + image = models.ImageField(upload_to='badges') + name = models.CharField(max_length=64) + description = models.TextField(blank=True) + order = models.IntegerField() + numeric_id = models.IntegerField(db_index=True) + + class Meta: + ordering = ('order', ) + + def __unicode__(self): + return self.name + + def get_absolute_url(self): + return self.image.url + + def html(self): + """Returns a HTML img tag representation of the badge.""" + if self.image: + return u'%s' % ( + self.get_absolute_url(), self.name, self.name) + return u'' + html.allow_tags = True + + +def avatar_file_path(instance, filename): + ext = os.path.splitext(filename)[1] + if not ext: + ext = '.jpg' + avatar_name = instance.user.username + ext + return os.path.join(settings.AVATAR_DIR, 'users', avatar_name) + + +class UserProfile(models.Model): + """model to represent additional information about users""" + + user = models.ForeignKey(User, unique=True) + location = models.CharField(max_length=128, blank=True) + birthday = models.DateField(blank=True, null=True, + help_text='Optional; the year is not shown to others') + occupation = models.CharField(max_length=128, blank=True) + interests = models.CharField(max_length=255, blank=True) + profile_text = models.TextField(blank=True) + profile_html = models.TextField(blank=True) + hide_email = models.BooleanField(default=True) + signature = models.TextField(blank=True) + signature_html = models.TextField(blank=True) + avatar = models.ImageField(upload_to=avatar_file_path, blank=True) + time_zone = models.CharField(max_length=64, blank=True, + default='US/Pacific') + use_24_time = models.BooleanField(default=False) + forum_post_count = models.IntegerField(default=0) + status = models.IntegerField(default=STA_STRANGER, + choices=USER_STATUS_CHOICES) + status_date = models.DateTimeField(auto_now_add=True) + badges = models.ManyToManyField(Badge, through="BadgeOwnership") + update_date = models.DateTimeField(db_index=True, blank=True) + auto_favorite = models.BooleanField(default=False) + auto_subscribe = models.BooleanField(default=False) + + def __unicode__(self): + return self.user.username + + class Meta: + ordering = ('user__username', ) + + def save(self, *args, **kwargs): + """ + Custom profile save() function. + If content_update is True (default), then it is assumed that major + fields are being updated and that the profile_content_update signal + should be signalled. When content_update is False, the update_date is + not updated, expensive markup conversions are not performed, and the + signal is not signalled. This is useful for updating the + forum_post_count, for example. 
+ + """ + content_update = kwargs.pop('content_update', True) + + if content_update: + self.update_date = datetime.datetime.now() + sm = SiteMarkup() + self.profile_html = sm.convert(self.profile_text) + self.signature_html = sm.convert(self.signature) + cache.delete('avatar_' + self.user.username) + + super(UserProfile, self).save(*args, **kwargs) + + if content_update: + notify_profile_content_update(self) + + @models.permalink + def get_absolute_url(self): + return ('bio-view_profile', (), {'username': self.user.username}) + + def badge_ownership(self): + return BadgeOwnership.objects.filter(profile=self).select_related('badge') + + def is_stranger(self): + """Returns True if this user profile status is STA_STRANGER.""" + return self.status == STA_STRANGER + + def user_is_active(self): + """Returns the profile's user is_active status. This function exists + for the admin. + """ + return self.user.is_active + user_is_active.boolean = True + user_is_active.short_description = "Is Active" + + def reset_text_fields(self): + """ + Reset profile text fields to empty defaults. + This function is useful when a spammer is identified. + + """ + self.location = '' + self.occupation = '' + self.interests = '' + self.profile_text = '' + self.signature = '' + + def search_title(self): + full_name = self.user.get_full_name() + if full_name: + return u"%s (%s)" % (self.user.username, full_name) + return self.user.username + + def search_summary(self): + text = render_to_string('search/indexes/bio/userprofile_text.txt', + {'object': self}); + return text + + +class UserProfileFlag(models.Model): + """This model represents a user flagging a profile as inappropriate.""" + user = models.ForeignKey(User) + profile = models.ForeignKey(UserProfile) + flag_date = models.DateTimeField(auto_now_add=True) + + def __unicode__(self): + return u"%s's profile flagged by %s" % (self.profile.user.username, + self.user.username) + + class Meta: + ordering = ('flag_date', ) + + def get_profile_url(self): + return 'Profile' % self.profile.get_absolute_url() + get_profile_url.allow_tags = True + + +class BadgeOwnership(models.Model): + """This model represents the ownership of badges by users.""" + profile = models.ForeignKey(UserProfile) + badge = models.ForeignKey(Badge) + count = models.IntegerField(default=1) + + class Meta: + verbose_name_plural = "badge ownership" + ordering = ('badge__order', ) + + def __unicode__(self): + if self.count == 1: + return u"%s owns 1 %s" % (self.profile.user.username, + self.badge.name) + else: + return u"%s owns %d %s badges" % (self.profile.user.username, + self.count, self.badge.name) + + def badge_count_str(self): + if self.count == 1: + return u"1 %s" % self.badge.name + return u"%d %ss" % (self.count, self.badge.name) + +# Put down here to avoid a circular import +from bio.signals import notify_profile_content_update diff -r c525f3e0b5d0 -r ee87ea74d46b bio/search_indexes.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/search_indexes.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,30 @@ +"""Haystack search index for the bio application.""" +from haystack.indexes import * +from haystack import site +from custom_search.indexes import CondQueuedSearchIndex + +from bio.models import UserProfile +from bio.signals import profile_content_update + + +class UserProfileIndex(CondQueuedSearchIndex): + text = CharField(document=True, use_template=True) + author = CharField(model_attr='user') + + def index_queryset(self): + return UserProfile.objects.filter(user__is_active=True) + + def 
get_updated_field(self): + return 'update_date' + + def _setup_save(self, model): + profile_content_update.connect(self.enqueue_save) + + def _teardown_save(self, model): + profile_content_update.disconnect(self.enqueue_save) + + def enqueue_save(self, sender, **kwargs): + return self.enqueue('update', sender) + + +site.register(UserProfile, UserProfileIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b bio/signals.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/signals.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,114 @@ +""" +Signal handlers & signals for the bio application. + +""" +from django.db.models.signals import post_save +from django.contrib.auth.models import User +import django.dispatch + +from donations.models import Donation +from weblinks.models import Link +from downloads.models import Download +from news.models import Story +from potd.models import Photo + + +def on_user_save(sender, **kwargs): + """ + This signal handler ensures that every User has a corresonding + UserProfile. It is called after User instance is saved. It creates + a UserProfile for the User if the created argument is True. + + """ + created = kwargs['created'] + if created: + user = kwargs['instance'] + profile = UserProfile() + profile.user = user + profile.save() + + +def on_donation_save(sender, **kwargs): + """ + This function is called after a Donation is saved. + If the Donation was newly created and not anonymous, + award the user a contributor pin. + + """ + if kwargs['created']: + donation = kwargs['instance'] + if not donation.is_anonymous and donation.user: + bio.badges.award_badge(bio.badges.CONTRIBUTOR_PIN, donation.user) + + +def on_link_save(sender, **kwargs): + """ + This function is called after a Link is saved. If the Link was newly + created, award the user a link pin. + + """ + if kwargs['created']: + link = kwargs['instance'] + bio.badges.award_badge(bio.badges.LINK_PIN, link.user) + + +def on_download_save(sender, **kwargs): + """ + This function is called after a Download is saved. If the Download was + newly created, award the user a download pin. + + """ + if kwargs['created']: + download = kwargs['instance'] + bio.badges.award_badge(bio.badges.DOWNLOAD_PIN, download.user) + + +def on_story_save(sender, **kwargs): + """ + This function is called after a Story is saved. If the Story was + newly created, award the user a news pin. + + """ + if kwargs['created']: + story = kwargs['instance'] + bio.badges.award_badge(bio.badges.NEWS_PIN, story.submitter) + + +def on_photo_save(sender, **kwargs): + """ + This function is called after a Photo is saved. If the Photo was + newly created, award the user a POTD pin. + + """ + if kwargs['created']: + photo = kwargs['instance'] + bio.badges.award_badge(bio.badges.POTD_PIN, photo.user) + + +post_save.connect(on_user_save, sender=User, dispatch_uid='bio.signals') +post_save.connect(on_donation_save, sender=Donation, dispatch_uid='bio.signals') +post_save.connect(on_link_save, sender=Link, dispatch_uid='bio.signals') +post_save.connect(on_download_save, sender=Download, dispatch_uid='bio.signals') +post_save.connect(on_story_save, sender=Story, dispatch_uid='bio.signals') +post_save.connect(on_photo_save, sender=Photo, dispatch_uid='bio.signals') + +# Signals for the bio application +# +# This signal is sent whenever a profile has had its textual content updated. 
+# The provided arguments to the receiver function are: +# - sender - the profile model instance + +profile_content_update = django.dispatch.Signal(providing_args=[]) + + +def notify_profile_content_update(profile): + """ + Convenience function to send the profile content update signal. + + """ + profile_content_update.send_robust(profile) + + +# To avoid circular imports +import bio.badges +from bio.models import UserProfile diff -r c525f3e0b5d0 -r ee87ea74d46b bio/static/css/bio.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/static/css/bio.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +div.user_profile th { + font-weight: bold; + text-align: left; + padding: 5px 5px; +} +div.user_profile td { + font-weight: normal; + text-align: left; + padding: 5px 5px; +} + +div.members-list table { + border-collapse: collapse; + width: 95%; + border: 1px solid black; + margin: 1em auto 1em auto; +} + +div.members-list th { + font-weight: bold; + text-align: center; + padding: 5px 5px; +} + +div.members-list tr { + border-top: 1px solid black; + border-bottom: 1px solid black; + text-align: center; +} + +div.members-list td { + padding: 5px 5px; + text-align: center; +} + +div.members-list tr.odd { + background-color: #ddd; +} diff -r c525f3e0b5d0 -r ee87ea74d46b bio/static/js/bio.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/static/js/bio.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,36 @@ +$(document).ready(function() { + var bday = $('#id_birthday'); + // jquery ui may not always be loaded + if (bday.length) { + bday.datepicker({changeMonth: true, + changeYear: true, + dateFormat: 'yy-mm-dd', + defaultDate: '-30y', + minDate: new Date(1909, 0, 1), + maxDate: new Date(), + yearRange: '-100:+0'}); + } + $('a.profile-flag').click(function() { + var id = this.id; + if (id.match(/fp-(\d+)/)) { + id = RegExp.$1; + if (confirm('Only report a profile if you feel it is spam, abuse, ' + + 'violates site rules, or is not appropriate. ' + + 'A moderator will be notified and will review the profile. ' + + 'Are you sure you want to report this profile?')) { + $.ajax({ + url: '/profile/flag/' + id + '/', + type: 'POST', + dataType: 'text', + success: function (response, textStatus) { + alert(response); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); + } + }); + } + } + return false; + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b bio/templatetags/bio_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/templatetags/bio_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,82 @@ +""" +Template tags for the bio application. +""" +from django import template +from django.conf import settings +from django.core.cache import cache + +from bio.models import UserProfile + + +register = template.Library() + + +def get_img_url(profile=None): + """ + This function returns a URL for a user profile avatar. + If the profile is None or the profile doesn't contain a valid + avatar, the URL for the default avatar is returned. + + """ + if profile is None or profile.avatar.name == '': + return settings.AVATAR_DEFAULT_URL + else: + return profile.avatar.url + + +@register.inclusion_tag('bio/avatar_tag.html') +def avatar(user, profile_link=True, align='bottom'): + """ + Returns the HTML for a user's avatar image. + + If the user object has an attribute 'user_profile', this is assumed to be + the user's profile that has been pre-fetched. Otherwise, the cache is + consulted to retrieve the avatar info for the user. 
If there is a cache + miss, only then will a get_profile() call be made. + """ + # img_info is a tuple that contains info about the avatar: + # (url, width, height) + + if hasattr(user, 'user_profile'): + img_url = get_img_url(user.user_profile) + else: + # try the cache + cache_key = 'avatar_' + user.username + img_url = cache.get(cache_key) + if img_url is None: + try: + profile = user.get_profile() + except UserProfile.DoesNotExist: + profile = None + + img_url = get_img_url(profile) + cache.set(cache_key, img_url) + + title = user.username + style = '' + if align == 'left': + style = 'style="float:left;margin-right:3px;"' + # other styles not supported + + return { + 'url': img_url, + 'title': title, + 'style': style, + 'username': user.username, + 'profile_link': profile_link, + } + + +@register.inclusion_tag('bio/profile_link_tag.html') +def profile_link(username, trailing_text=''): + """ + Renders a link to a given user's profile page. + Trailing text is any text that you want displayed after the final tag. + Because of the way the Django template system works, a newline will + automatically be inserted after this tag is expanded. If you want a period + to follow immediately after the link, then set trailing_text to '.'. + Otherwise a space will appear between the linked text and any text that + follows the tag. + + """ + return {'username': username, 'trailing_text': trailing_text } diff -r c525f3e0b5d0 -r ee87ea74d46b bio/templatetags/elsewhere_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/templatetags/elsewhere_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,17 @@ +""" +Template tags for the elsewhere application. +""" +from django import template +from django.conf import settings + +register = template.Library() + + +@register.inclusion_tag('bio/elsewhere_links.html') +def elsewhere_links(user): + return { + 'social_nets': user.social_network_profiles.all(), + 'ims': user.instant_messenger_profiles.all(), + 'websites': user.website_profiles.all(), + 'STATIC_URL': settings.STATIC_URL, + } diff -r c525f3e0b5d0 -r ee87ea74d46b bio/tests/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/tests/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1 @@ +from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b bio/tests/view_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/tests/view_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,78 @@ +""" +View tests for the bio application. + +""" +import datetime + +from django.contrib.auth.models import User +from django.test import TestCase +from django.core.urlresolvers import reverse, NoReverseMatch + + +class MemberSearchTest(TestCase): + + USERNAME = u'John' + + def setUp(self): + user = User.objects.create_user(self.USERNAME, '', 'password') + user.save() + + self.username = 'test_user' + self.pw = 'password' + self.user = User.objects.create_user(self.username, '', self.pw) + self.user.save() + self.assertTrue(self.client.login(username=self.username, + password=self.pw)) + + def tearDown(self): + self.client.logout() + + def testValidName(self): + """ + Test a valid username. 
+ """ + + response = self.client.post(reverse('bio-member_search'), + {'username': self.USERNAME}, + follow=True) + + self.assertEqual(len(response.redirect_chain), 1) + if response.redirect_chain: + self.assertEqual(response.redirect_chain[0][0], + 'http://testserver' + reverse('bio-view_profile', + kwargs={'username': self.USERNAME})) + self.assertEqual(response.redirect_chain[0][1], 302) + + self.assertEqual(response.status_code, 200) + + def testInvalidName(self): + """ + Test a invalid username. + """ + + response = self.client.post(reverse('bio-member_search'), + {'username': self.USERNAME + '!'}) + + self.assertEqual(response.status_code, 200) + self.assertContains(response, "That username does not exist.") + + def testTrailingSpace(self): + """ + Test a username with a trailing space. + """ + + try: + response = self.client.post(reverse('bio-member_search'), + {'username': self.USERNAME + ' '}, + follow=True) + except NoReverseMatch: + self.fail('bit by a MySQL bug?') + + self.assertEqual(len(response.redirect_chain), 1) + if response.redirect_chain: + self.assertEqual(response.redirect_chain[0][0], + 'http://testserver' + reverse('bio-view_profile', + kwargs={'username': self.USERNAME})) + self.assertEqual(response.redirect_chain[0][1], 302) + + self.assertEqual(response.status_code, 200) diff -r c525f3e0b5d0 -r ee87ea74d46b bio/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +"""urls for the bio application""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('bio.views', + url(r'^members/(?Puser|date)/$', + 'member_list', + name='bio-member_list'), + url(r'^members/search/$', 'member_search', name='bio-member_search'), + url(r'^me/$', 'my_profile', name='bio-me'), + url(r'^view/(?P[\w.@+-]{1,30})/$', 'view_profile', name='bio-view_profile'), + url(r'^edit/$', 'edit_profile', name='bio-edit_profile'), + url(r'^edit/elsewhere/$', 'edit_elsewhere', name='bio-edit_elsewhere'), + url(r'^avatar/$', 'change_avatar', name='bio-change_avatar'), + url(r'^flag/(\d+)/$', 'flag_profile', name='bio-flag_profile'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b bio/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bio/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,288 @@ +""" +Views for the bio application. 
+ +""" +from django.shortcuts import render_to_response +from django.shortcuts import get_object_or_404 +from django.template import RequestContext +from django.contrib import messages +from django.contrib.auth.models import User +from django.http import HttpResponse +from django.http import HttpResponseBadRequest +from django.http import HttpResponseRedirect +from django.http import HttpResponseServerError +from django.http import Http404 +from django.core.paginator import InvalidPage +from django.core.urlresolvers import reverse +from django.contrib.auth.decorators import login_required +from django.views.decorators.http import require_POST + +from elsewhere.models import SocialNetworkForm +from elsewhere.models import InstantMessengerForm +from elsewhere.models import WebsiteForm + +from bio.models import UserProfile +from bio.models import UserProfileFlag +from bio.models import BadgeOwnership +from bio.forms import UploadAvatarForm +from bio.forms import EditUserForm +from bio.forms import EditUserProfileForm +from bio.forms import SearchUsersForm +from bio.signals import notify_profile_content_update +from core.paginator import DiggPaginator +from core.functions import email_admins +from core.functions import get_page + +####################################################################### + +@login_required +def member_list(request, type='user'): + """ + This view displays the member list. Only active members are displayed. + """ + qs = User.objects.filter(is_active=True) + if type == 'user': + qs = qs.order_by('username') + else: + qs = qs.order_by('date_joined') + num_members = qs.count() + + paginator = DiggPaginator(qs, 20, body=5, tail=3, margin=3, padding=2) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + # Attach user profiles to each user to avoid using get_user_profile() in + # the template. 
+ users = set(user.id for user in the_page.object_list) + + profiles = UserProfile.objects.filter(user__id__in=users).select_related() + user_profiles = dict((profile.user.id, profile) for profile in profiles) + + for user in the_page.object_list: + user.user_profile = user_profiles[user.id] + + return render_to_response('bio/members.html', { + 'page': the_page, + 'type': type, + 'num_members': num_members, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def my_profile(request): + profile = request.user.get_profile() + badge_collection = BadgeOwnership.objects.filter( + profile=profile).select_related("badge") + + return render_to_response('bio/view_profile.html', { + 'subject': request.user, + 'profile': profile, + 'hide_email': False, + 'this_is_me': True, + 'badge_collection': badge_collection, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def view_profile(request, username): + + user = get_object_or_404(User, username=username) + if user == request.user: + return HttpResponseRedirect(reverse('bio.views.my_profile')) + + profile = user.get_profile() + hide_email = profile.hide_email + + badge_collection = BadgeOwnership.objects.filter( + profile=profile).select_related("badge") + + return render_to_response('bio/view_profile.html', { + 'subject': user, + 'profile': profile, + 'hide_email': hide_email, + 'this_is_me': False, + 'badge_collection': badge_collection, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def edit_profile(request): + if request.method == 'POST': + if request.POST.get('submit_button', 'Cancel') == 'Cancel': + return HttpResponseRedirect(reverse('bio.views.my_profile')) + profile = request.user.get_profile() + user_form = EditUserForm(request.POST, instance=request.user) + profile_form = EditUserProfileForm(request.POST, instance=profile) + if user_form.is_valid() and profile_form.is_valid(): + user_form.save() + profile = profile_form.save(commit=False) + profile.user = request.user + profile.save() + return HttpResponseRedirect(reverse('bio.views.my_profile')) + else: + profile = request.user.get_profile() + user_form = EditUserForm(instance=request.user) + profile_form = EditUserProfileForm(instance=profile) + + return render_to_response('bio/edit_profile.html', { + 'user_form': user_form, + 'profile_form': profile_form, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def change_avatar(request): + if request.method == 'POST': + form = UploadAvatarForm(request.POST, request.FILES) + if form.is_valid(): + # Update the profile with the new avatar + profile = request.user.get_profile() + + # First delete any old avatar file + if profile.avatar.name != '': + profile.avatar.delete(save=False) + + try: + name, avatar = form.save() + except IOError: + messages.error(request, 'A file error occurred.') + return HttpResponseRedirect(reverse('bio-me')) + + if avatar is not None: + profile.avatar.save(name, avatar, save=False) + profile.save() + + messages.success(request, 'Avatar updated') + return HttpResponseRedirect(reverse('bio-me')) + else: + form = UploadAvatarForm() + + return render_to_response('bio/avatar.html', { + 'form': form, + }, + context_instance = RequestContext(request)) 
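To make the avatar caching round trip behind change_avatar concrete: UserProfile.save() (a content update) deletes the 'avatar_' + username cache key, and the avatar template tag lazily repopulates it the next time it renders. The sketch below is illustrative only and not part of this changeset; it assumes a Django shell with a real (non-dummy) cache backend and a hypothetical existing user named 'alice'.

    from django.core.cache import cache
    from django.contrib.auth.models import User

    from bio.templatetags.bio_tags import avatar

    user = User.objects.get(username='alice')   # hypothetical user
    profile = user.get_profile()

    # A content update invalidates the cached avatar URL for this user...
    profile.save()
    assert cache.get('avatar_' + user.username) is None

    # ...and the avatar inclusion tag repopulates the cache on its next use.
    context = avatar(user)
    assert cache.get('avatar_' + user.username) == context['url']
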
+ +####################################################################### + +@require_POST +def flag_profile(request, profile_id): + """ + This function handles the flagging of profiles by users. This function should + be the target of an AJAX post. + """ + if not request.user.is_authenticated(): + return HttpResponse('Please login or register to flag a profile.') + + try: + profile = UserProfile.objects.get(pk=profile_id) + except UserProfile.DoesNotExist: + return HttpResponseBadRequest("That profile doesn't exist.") + + flag = UserProfileFlag(user=request.user, profile=profile) + flag.save() + email_admins('A Profile Has Been Flagged', """Hello, + +A user has flagged a profile for review. +""") + return HttpResponse('The profile was flagged. A moderator will review the' \ + ' profile shortly. Thanks for helping to improve the content on this ' \ + 'site.') + +####################################################################### + +@login_required +def edit_elsewhere(request): + im_id = 'id_im_%s' # to prevent duplicate ID in HTML output + if request.method == 'POST': + new_data = request.POST.copy() + + # Add forms + if new_data.get('sn-form') or new_data.get('im-form') or new_data.get('w-form'): + + if new_data.get('sn-form'): + sn_form = SocialNetworkForm(new_data) + im_form = InstantMessengerForm(auto_id=im_id) + w_form = WebsiteForm() + form = sn_form + elif new_data.get('im-form'): + sn_form = SocialNetworkForm() + im_form = InstantMessengerForm(new_data, auto_id=im_id) + w_form = WebsiteForm() + form = im_form + elif new_data.get('w-form'): + sn_form = SocialNetworkForm() + im_form = InstantMessengerForm(auto_id=im_id) + w_form = WebsiteForm(new_data) + form = w_form + + if form.is_valid(): + profile = form.save(commit=False) + profile.user = request.user + profile.save() + return HttpResponseRedirect(request.path) + + # Delete forms + elif new_data.get('delete-sn-form') or new_data.get('delete-im-form') or new_data.get('delete-w-form'): + delete_id = request.POST['delete_id'] + + update_occurred = True + if new_data.get('delete-sn-form'): + request.user.social_network_profiles.get(id=delete_id).delete() + elif new_data.get('delete-im-form'): + request.user.instant_messenger_profiles.get(id=delete_id).delete() + elif new_data.get('delete-w-form'): + request.user.website_profiles.get(id=delete_id).delete() + else: + update_occurred = False + + if update_occurred: + notify_profile_content_update(request.user.get_profile()) + + return HttpResponseRedirect(request.path) + + # WTF? 
+ else: + return HttpResponseServerError + + else: + # Create blank forms + sn_form = SocialNetworkForm() + im_form = InstantMessengerForm(auto_id=im_id) + w_form = WebsiteForm() + + return render_to_response('bio/edit_elsewhere.html', { + 'sn_form': sn_form, + 'im_form': im_form, + 'w_form': w_form, + }, + context_instance=RequestContext(request)) + +####################################################################### + +@login_required +def member_search(request): + if request.method == "POST": + form = SearchUsersForm(request.POST) + if form.is_valid(): + username = form.cleaned_data['username'] + return HttpResponseRedirect(reverse("bio-view_profile", + kwargs={'username': username})) + else: + form = SearchUsersForm() + + return render_to_response('bio/member_search.html', { + 'form': form, + }, + context_instance=RequestContext(request)) + diff -r c525f3e0b5d0 -r ee87ea74d46b bulletins/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bulletins/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,20 @@ +''' +This file contains the automatic admin site definitions for the Bulletins models. +''' + +from django.contrib import admin +from django.conf import settings + +from bulletins.models import Bulletin + +class BulletinAdmin(admin.ModelAdmin): + list_display = ('title', 'start_date', 'end_date', 'is_enabled') + list_filter = ('start_date', 'end_date', 'is_enabled') + search_fields = ('title', 'text') + date_hierarchy = 'start_date' + + class Media: + js = settings.GPP_THIRD_PARTY_JS['tiny_mce'] + + +admin.site.register(Bulletin, BulletinAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b bulletins/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bulletins/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +"""Models for the bulletins app. +Bulletins allow the sited admins to display and manage important notices for the website. +""" + +import datetime +from django.db import models +from django.db.models import Q + + +class BulletinManager(models.Manager): + """Manager for the Bulletin model.""" + + def get_current(self): + now = datetime.datetime.now() + return self.filter( + Q(is_enabled=True), + Q(start_date__lte=now), + Q(end_date__isnull=True) | Q(end_date__gte=now)) + + +class Bulletin(models.Model): + """Model to represent site bulletins.""" + title = models.CharField(max_length=200) + text = models.TextField() + start_date = models.DateTimeField(db_index=True, + help_text='Start date for when the bulletin will be active.',) + end_date = models.DateTimeField(blank=True, null=True, db_index=True, + help_text='End date for the bulletin. Leave blank to keep it open-ended.') + is_enabled = models.BooleanField(default=True, db_index=True, + help_text='Check to allow the bulletin to be viewed on the site.') + + objects = BulletinManager() + + class Meta: + ordering = ('-start_date', ) + + def __unicode__(self): + return self.title diff -r c525f3e0b5d0 -r ee87ea74d46b bulletins/templatetags/bulletin_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/bulletins/templatetags/bulletin_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,17 @@ +""" +Template tags for the bulletins application. 
+""" +from django import template + +from bulletins.models import Bulletin + + +register = template.Library() + + +@register.inclusion_tag('bulletins/bulletins.html') +def current_bulletins(): + bulletins = Bulletin.objects.get_current() + return { + 'bulletins': bulletins, + } diff -r c525f3e0b5d0 -r ee87ea74d46b comments/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,48 @@ +""" +This file contains the automatic admin site definitions for the comment models. +""" +from django.contrib import admin +from comments.models import Comment +from comments.models import CommentFlag +import bio.badges + + +class CommentAdmin(admin.ModelAdmin): + fieldsets = ( + (None, + {'fields': ('content_type', 'object_id', )} + ), + ('Content', + {'fields': ('user', 'comment')} + ), + ('Metadata', + {'fields': ('ip_address', 'is_public', 'is_removed')} + ), + ) + list_display = ('__unicode__', 'content_type', 'object_id', 'ip_address', + 'creation_date', 'is_public', 'not_removed') + list_filter = ('creation_date', 'is_public', 'is_removed') + date_hierarchy = 'creation_date' + ordering = ('-creation_date', ) + search_fields = ('comment', 'user__username', 'ip_address') + raw_id_fields = ('user', 'content_type') + + +class CommentFlagAdmin(admin.ModelAdmin): + list_display = ('__unicode__', 'flag_date', 'get_comment_url') + actions = ('accept_flags', ) + raw_id_fields = ('user', 'comment') + + def accept_flags(self, request, qs): + """This admin action awards a security pin to the user who reported + the comment and then deletes the flagged comment object. + """ + for flag in qs: + bio.badges.award_badge(bio.badges.SECURITY_PIN, flag.user) + flag.delete() + + accept_flags.short_description = "Accept selected comment flags" + + +admin.site.register(Comment, CommentAdmin) +admin.site.register(CommentFlag, CommentFlagAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b comments/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,74 @@ +""" +Forms for the comments application. +""" +import datetime +from django import forms +from django.conf import settings +from django.contrib.contenttypes.models import ContentType + +from comments.models import Comment + +COMMENT_MAX_LENGTH = getattr(settings, 'COMMENT_MAX_LENGTH', 3000) + +class CommentForm(forms.Form): + comment = forms.CharField(label='', + min_length=1, + max_length=COMMENT_MAX_LENGTH, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + content_type = forms.CharField(widget=forms.HiddenInput) + object_pk = forms.CharField(widget=forms.HiddenInput) + + def __init__(self, target_object, data=None, initial=None): + self.target_object = target_object + if initial is None: + initial = {} + initial.update({ + 'content_type': str(self.target_object._meta), + 'object_pk': str(self.target_object.pk), + }) + super(CommentForm, self).__init__(data=data, initial=initial) + + def get_comment_object(self, user, ip_address): + """ + Return a new (unsaved) comment object based on the information in this + form. Assumes that the form is already validated and will throw a + ValueError if not. 
+ """ + if not self.is_valid(): + raise ValueError("get_comment_object may only be called on valid forms") + + new = Comment( + content_type = ContentType.objects.get_for_model(self.target_object), + object_id = self.target_object.pk, + user = user, + comment = self.cleaned_data["comment"], + ip_address = ip_address, + is_public = True, + is_removed = False, + ) + + # Check that this comment isn't duplicate. (Sometimes people post comments + # twice by mistake.) If it is, fail silently by returning the old comment. + today = datetime.date.today() + possible_duplicates = Comment.objects.filter( + content_type = new.content_type, + object_id = new.object_id, + user = new.user, + creation_date__year = today.year, + creation_date__month = today.month, + creation_date__day = today.day, + ) + for old in possible_duplicates: + if old.comment == new.comment: + return old + + return new + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']), + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + ['js/comments.js']) diff -r c525f3e0b5d0 -r ee87ea74d46b comments/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,99 @@ +""" +Models for the comments application. +""" +import datetime + +from django.db import models +from django.conf import settings +from django.contrib.contenttypes.models import ContentType +from django.contrib.contenttypes import generic +from django.contrib.auth.models import User +from django.core import urlresolvers + +from core.markup import site_markup + + +COMMENT_MAX_LENGTH = getattr(settings, 'COMMENT_MAX_LENGTH', 3000) + +class CommentManager(models.Manager): + """Manager for the Comment model class.""" + + def for_object(self, obj, filter_public=True): + """QuerySet for all comments for a particular model instance.""" + ct = ContentType.objects.get_for_model(obj) + qs = self.get_query_set().filter(content_type__pk=ct.id, + object_id=obj.id) + if filter_public: + qs = qs.filter(is_public=True) + return qs + + +class Comment(models.Model): + """My own version of a Comment class that can attach comments to any other model.""" + content_type = models.ForeignKey(ContentType) + object_id = models.PositiveIntegerField(db_index=True) + content_object = generic.GenericForeignKey('content_type', 'object_id') + user = models.ForeignKey(User) + comment = models.TextField(max_length=COMMENT_MAX_LENGTH) + html = models.TextField(blank=True) + creation_date = models.DateTimeField() + ip_address = models.IPAddressField('IP Address') + is_public = models.BooleanField(default=True, + help_text='Uncheck this field to make the comment invisible.') + is_removed = models.BooleanField(default=False, + help_text='Check this field to replace the comment with a ' \ + '"This comment has been removed" message') + + # Attach manager + objects = CommentManager() + + class Meta: + ordering = ('creation_date', ) + + def __unicode__(self): + return u'%s: %s...' % (self.user.username, self.comment[:50]) + + def save(self, *args, **kwargs): + if not self.id: + self.creation_date = datetime.datetime.now() + + self.html = site_markup(self.comment) + super(Comment, self).save(*args, **kwargs) + + def get_absolute_url(self): + return self.get_content_object_url() + ('#c%s' % self.id) + + def get_content_object_url(self): + """ + Get a URL suitable for redirecting to the content object. 
+ """ + return urlresolvers.reverse( + "comments-url-redirect", + args=(self.content_type_id, self.object_id) + ) + + def not_removed(self): + """ + Returns not self.is_removed. Used on the admin display for + "green board" display purposes. + """ + return not self.is_removed + not_removed.boolean = True + + +class CommentFlag(models.Model): + """This model represents a user flagging a comment as inappropriate.""" + user = models.ForeignKey(User) + comment = models.ForeignKey(Comment) + flag_date = models.DateTimeField(auto_now_add=True) + + def __unicode__(self): + return u'Comment ID %s flagged by %s' % (self.comment.id, self.user.username) + + class Meta: + ordering = ('flag_date', ) + + def get_comment_url(self): + return 'Comment' % self.comment.id + get_comment_url.allow_tags = True + diff -r c525f3e0b5d0 -r ee87ea74d46b comments/static/css/comments.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/static/css/comments.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,28 @@ +div.comment-list { + float: left; + font-size: 18px; + font-weight: bold; + color: #999; + padding-right: .5em; +} +div.comment { + padding: 0.5em; + border-bottom: 1px dashed black; + font: 12px/18px "Lucida Grande", Verdana, sans-serif; + color: #333; +} +div.comment-avatar { + float: left; + padding-right: 1.5em; +} +div.comment-text { +} +div.comment-text-removed { + font-style: italic; +} +div.comment-details { + clear: both; + font-size: smaller; + font-style: italic; + padding-top: 0.5em; +} diff -r c525f3e0b5d0 -r ee87ea74d46b comments/static/js/comments.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/static/js/comments.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,73 @@ +$(document).ready(function() { + var postText = $('#id_comment'); + var postButton = $('#comment-form-post'); + postButton.click(function () { + var text = $.trim(postText.val()); + if (text.length == 0) { + alert('Please enter some text.'); + return false; + } + postButton.attr('disabled', 'disabled').val('Posting Comment...'); + $.ajax({ + url: '/comments/post/', + type: 'POST', + data: { + comment : text, + content_type : $('#id_content_type').val(), + object_pk : $('#id_object_pk').val() + }, + dataType: 'html', + success: function (data, textStatus) { + postText.val(''); + $('#comment-container').append(data); + var newDiv = $('#comment-container > div:last'); + newDiv.hide(); + var num = $('.comment-list', newDiv); + num.html($('#comment-container > div').size() + "."); + newDiv.fadeIn(3000); + postButton.removeAttr('disabled').val('Post Comment'); + var count = $('#comment-count'); + if (count.length) { + count.html(parseInt(count.html()) + 1); + } + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + postButton.removeAttr('disabled').val('Post Comment'); + } + }); + return false; + }); + $('a.comment-flag').click(function () { + var id = this.id; + if (id.match(/fc-(\d+)/)) { + id = RegExp.$1; + if (confirm('Only flag a comment if you feel it is spam, abuse, violates site rules, ' + + 'or is not appropriate. ' + + 'A moderator will be notified and will review the comment. 
' + + 'Are you sure you want to flag this comment?')) { + $.ajax({ + url: '/comments/flag/', + type: 'POST', + data: {id: id}, + dataType: 'text', + success: function (response, textStatus) { + alert(response); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); + } + }); + } + } + return false; + }); + + $('.comment-text img').fadeIn('fast', function() { + var pic = $(this); + if (pic.width() > 720) { + pic.css('width', '720px'); + } + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b comments/templatetags/comment_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/templatetags/comment_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,169 @@ +""" +Template tags for our Comments application. +We support the following template tags: + {% get_comment_count for [object] as [var] %} + {% get_comment_list for [object] as [var] %}` + {% get_comment_form for [object] as [var] %}` + {% render_comment_form for [object] %} + {% render_comment_list [object] %} +""" +from django import template +from django.conf import settings +from django.template.loader import render_to_string +from django.contrib.contenttypes.models import ContentType + +from comments.models import Comment +from comments.forms import CommentForm + + +register = template.Library() + + +class GetCommentCountNode(template.Node): + def __init__(self, obj, var): + self.object = template.Variable(obj) + self.as_var = var + + def render(self, context): + object = self.object.resolve(context) + qs = Comment.objects.for_object(object) + context[self.as_var] = qs.count() + return '' + +@register.tag +def get_comment_count(parser, token): + """ + Gets the comment count for the specified object and makes it available in the + template context under the variable name specified. + Syntax: + {% get_comment_count for [object] as [varname] %} + """ + try: + (tag, for_word, obj, as_word, var) = token.split_contents() + except ValueError: + raise template.TemplateSyntaxError, "%r tag requires exactly 4 arguments" % token.contents.split()[0] + + if for_word != 'for': + raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) + + if as_word != 'as': + raise template.TemplateSyntaxError("Third argument in %r tag must be 'as'" % tag) + + return GetCommentCountNode(obj, var) + + +class GetCommentListNode(template.Node): + def __init__(self, obj, var): + self.object = template.Variable(obj) + self.as_var = var + + def render(self, context): + object = self.object.resolve(context) + qs = Comment.objects.for_object(object) + context[self.as_var] = list(qs) + return '' + + +@register.tag +def get_comment_list(parser, token): + """ + Gets a list of comments for the specified object and makes it available in the + template context under the variable name specified. 
+ Syntax: + {% get_comment_list for [object] as [varname] %} + """ + try: + (tag, for_word, obj, as_word, var) = token.split_contents() + except ValueError: + raise template.TemplateSyntaxError, "%r tag requires exactly 4 arguments" % token.contents.split()[0] + + if for_word != 'for': + raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) + + if as_word != 'as': + raise template.TemplateSyntaxError("Third argument in %r tag must be 'as'" % tag) + + return GetCommentListNode(obj, var) + + +class GetCommentFormNode(template.Node): + def __init__(self, obj, var): + self.object = template.Variable(obj) + self.as_var = var + + def render(self, context): + object = self.object.resolve(context) + context[self.as_var] = CommentForm(object) + return '' + + +@register.tag +def get_comment_form(parser, token): + """ + Gets the comment form for an object and makes it available in the + template context under the variable name specified. + Syntax: + {% get_comment_form for [object] as [varname] %} + """ + try: + (tag, for_word, obj, as_word, var) = token.split_contents() + except ValueError: + raise template.TemplateSyntaxError, "%r tag requires exactly 4 arguments" % token.contents.split()[0] + + if for_word != 'for': + raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) + + if as_word != 'as': + raise template.TemplateSyntaxError("Third argument in %r tag must be 'as'" % tag) + + return GetCommentFormNode(obj, var) + + +class RenderCommentFormNode(template.Node): + def __init__(self, obj): + self.object = template.Variable(obj) + + def render(self, context): + object = self.object.resolve(context) + context.push() + form_str = render_to_string('comments/comment_form.html', { + 'form': CommentForm(object), + }, + context) + context.pop() + return form_str + + +@register.tag +def render_comment_form(parser, token): + """ + Renders a comment form for the specified object using the template + comments/comment_form.html. + Syntax: + {% render_comment_form for [object] %} + """ + try: + (tag, for_word, obj) = token.split_contents() + except ValueError: + raise template.TemplateSyntaxError, "%r tag requires exactly 2 arguments" % token.contents.split()[0] + + if for_word != 'for': + raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) + + return RenderCommentFormNode(obj) + + +@register.inclusion_tag('comments/comment_list.html') +def render_comment_list(object): + """ + Renders the comments for the specified object using the template + comments/comment_list.html. + Syntax: + {% render_comment_list [object] %} + """ + qs = Comment.objects.for_object(object).select_related('user') + return { + 'comments': qs, + 'STATIC_URL': settings.STATIC_URL, + } + diff -r c525f3e0b5d0 -r ee87ea74d46b comments/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,16 @@ +""" +URLs for the comments application. 
+""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('comments.views', + url(r'^flag/$', 'flag_comment', name='comments-flag'), + url(r'^markdown/$', 'markdown_preview', name='comments-markdown_preview'), + url(r'^post/$', 'post_comment', name='comments-post'), +) + +urlpatterns += patterns('', + url(r'^cr/(\d+)/(\d+)/$', + 'django.contrib.contenttypes.views.shortcut', + name='comments-url-redirect'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b comments/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/comments/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,134 @@ +""" +Views for the comments application. +""" +from django.contrib.auth.decorators import login_required +from django.core.exceptions import ObjectDoesNotExist +from django.http import HttpResponse +from django.http import HttpResponseRedirect +from django.http import HttpResponseBadRequest +from django.http import HttpResponseForbidden +from django.db.models import get_model +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.utils.html import escape +from django.views.decorators.http import require_POST + +from core.functions import email_admins +from core.markup import site_markup +from comments.forms import CommentForm +from comments.models import Comment +from comments.models import CommentFlag +import antispam +import antispam.utils + + +@login_required +@require_POST +def post_comment(request): + """ + This function handles the posting of comments. If successful, returns + the comment text as the response. This function is meant to be the target + of an AJAX post. + """ + # Look up the object we're trying to comment about + ctype = request.POST.get('content_type', None) + object_pk = request.POST.get('object_pk', None) + if ctype is None or object_pk is None: + return HttpResponseBadRequest('Missing content_type or object_pk field.') + + try: + model = get_model(*ctype.split('.', 1)) + target = model.objects.get(pk=object_pk) + except TypeError: + return HttpResponseBadRequest( + "Invalid content_type value: %r" % escape(ctype)) + except AttributeError: + return HttpResponseBadRequest( + "The given content-type %r does not resolve to a valid model." % \ + escape(ctype)) + except ObjectDoesNotExist: + return HttpResponseBadRequest( + "No object matching content-type %r and object PK %r exists." % \ + (escape(ctype), escape(object_pk))) + + # Can we comment on the target object? + if hasattr(target, 'can_comment_on'): + if callable(target.can_comment_on): + can_comment_on = target.can_comment_on() + else: + can_comment_on = target.can_comment_on + else: + can_comment_on = True + + if not can_comment_on: + return HttpResponseForbidden('Cannot comment on this item.') + + # Check form validity + + form = CommentForm(target, request.POST) + if not form.is_valid(): + return HttpResponseBadRequest('Invalid comment; missing parameters?') + + comment = form.get_comment_object(request.user, request.META.get("REMOTE_ADDR", None)) + + # Check for spam + + if antispam.utils.spam_check(request, comment.comment): + return HttpResponseForbidden(antispam.BUSTED_MESSAGE) + + comment.save() + + # return the rendered comment + return render_to_response('comments/comment.html', { + 'comment': comment, + }, + context_instance = RequestContext(request)) + + +@require_POST +def flag_comment(request): + """ + This function handles the flagging of comments by users. This function should + be the target of an AJAX post. 
+ """ + if not request.user.is_authenticated(): + return HttpResponse('Please login or register to flag a comment.') + + id = request.POST.get('id', None) + if id is None: + return HttpResponseBadRequest('No id') + + try: + comment = Comment.objects.get(pk=id) + except Comment.DoesNotExist: + return HttpResponseBadRequest('No comment with id %s' % id) + + flag = CommentFlag(user=request.user, comment=comment) + flag.save() + email_admins('A Comment Has Been Flagged', """Hello, + +A user has flagged a comment for review. +""") + return HttpResponse('The comment was flagged. A moderator will review the comment shortly. ' \ + 'Thanks for helping to improve the discussions on this site.') + + +@require_POST +def markdown_preview(request): + """ + This function should be the target of an AJAX POST. It takes the 'data' parameter + from the POST parameters and returns a rendered HTML page from the data, which + is assumed to be in markdown format. The HTML page is suitable for the preview + function for a javascript editor such as markItUp. + """ + if not request.user.is_authenticated(): + return HttpResponseForbidden('This service is only available to logged in users.') + + data = request.POST.get('data', None) + if data is None: + return HttpResponseBadRequest('No data') + + return render_to_response('comments/markdown_preview.html', { + 'data': site_markup(data), + }, + context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b contact/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contact/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,47 @@ +"""forms for the contact application""" + +from django import forms +from django.conf import settings +from django.template.loader import render_to_string +from django.contrib.sites.models import Site +from core.functions import send_mail + + +class ContactForm(forms.Form): + """Form used to contact the website admins""" + name = forms.CharField(label = "Your Name", max_length = 61, + widget = forms.TextInput(attrs = {'size' : 50 })) + email = forms.EmailField(label = "Your Email", + widget = forms.TextInput(attrs = {'size' : 50 })) + subject = forms.CharField(max_length = 64, + widget = forms.TextInput(attrs = {'size' : 50 })) + honeypot = forms.CharField(max_length = 64, required = False, + label = 'If you enter anything in this field your message will be treated as spam') + message = forms.CharField(label = "Your Message", + widget = forms.Textarea(attrs = {'rows' : 16, 'cols' : 50}), + max_length = 3000) + + recipient_list = [mail_tuple[1] for mail_tuple in settings.MANAGERS] + + def clean_honeypot(self): + value = self.cleaned_data['honeypot'] + if value: + raise forms.ValidationError(self.fields['honeypot'].label) + return value + + def save(self): + # Send the feedback message email + + site = Site.objects.get_current() + + msg = render_to_string('contact/contact_email.txt', + { + 'site_name' : site.name, + 'user_name' : self.cleaned_data['name'], + 'user_email' : self.cleaned_data['email'], + 'message' : self.cleaned_data['message'], + }) + + subject = site.name + ' Feedback: ' + self.cleaned_data['subject'] + send_mail(subject, msg, self.cleaned_data['email'], self.recipient_list) + diff -r c525f3e0b5d0 -r ee87ea74d46b contact/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contact/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,7 @@ +"""urls for the contact application""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('contact.views', + url(r'^$', 'contact_form', 
name='contact-form'), + (r'^thanks/$', 'contact_thanks'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b contact/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contact/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,33 @@ +# Create your views here. + +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.http import HttpResponseRedirect +from django.core.urlresolvers import reverse + +from contact.forms import ContactForm +from core.functions import get_full_name + + +def contact_form(request): + if request.method == 'POST': + form = ContactForm(request.POST) + if form.is_valid(): + form.save() + return HttpResponseRedirect(reverse('contact.views.contact_thanks')) + else: + initial_data = {} + if request.user.is_authenticated(): + name = get_full_name(request.user) + initial_data = {'name' : name, 'email' : request.user.email} + + form = ContactForm(initial = initial_data) + + return render_to_response('contact/contact_form.html', + {'form' : form}, + context_instance = RequestContext(request)) + + +def contact_thanks(request): + return render_to_response('contact/contact_thanks.html', + context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b contests/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,50 @@ +""" +Admin definitions for the contest application. + +""" +from django.contrib import admin +from django.conf import settings + +from contests.models import Contest + + +class ContestAdmin(admin.ModelAdmin): + list_display = ['title', 'is_public', 'creation_date', 'end_date', + 'contestant_count', 'winner'] + list_editable = ['is_public'] + date_hierarchy = 'creation_date' + search_fields = ['title', 'description'] + prepopulated_fields = {'slug': ['title']} + raw_id_fields = ['winner', 'contestants'] + actions = ['pick_winner'] + + class Media: + js = (['js/contests/contests_admin.js'] + + settings.GPP_THIRD_PARTY_JS['tiny_mce']) + + def contestant_count(self, obj): + return obj.contestants.count() + contestant_count.short_description = '# Entries' + + def pick_winner(self, request, qs): + """ + Picks a winner on the contests selected by the admin. Note that for + safety reasons, we only update those contests that don't have winners + already. + + """ + count = 0 + for contest in qs: + if not contest.winner: + contest.pick_winner() + contest.save() + count += 1 + + self.message_user(request, "%d of %d winners picked" % (count, + qs.count())) + + pick_winner.short_description = "Pick winners for selected contests" + + + +admin.site.register(Contest, ContestAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b contests/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,93 @@ +""" +Models for the contest application. + +""" +import random +import datetime + +from django.db import models +from django.contrib.auth.models import User + + +class PublicContestManager(models.Manager): + """ + The manager for all public contests. + + """ + def get_query_set(self): + return super(PublicContestManager, self).get_query_set().filter(is_public=True) + + +class Contest(models.Model): + """ + A model to represent contests where users sign up to win something. 
+ + """ + title = models.CharField(max_length=64) + slug = models.SlugField(max_length=64) + description = models.TextField() + is_public = models.BooleanField(db_index=True) + creation_date = models.DateTimeField(blank=True) + end_date = models.DateTimeField() + contestants = models.ManyToManyField(User, related_name='contests', + null=True, blank=True) + winner = models.ForeignKey(User, null=True, blank=True, + related_name='winning_contests') + win_date = models.DateTimeField(null=True, blank=True) + meta_description = models.TextField() + + objects = models.Manager() + public_objects = PublicContestManager() + + class Meta: + ordering = ['-creation_date'] + + def __unicode__(self): + return self.title + + @models.permalink + def get_absolute_url(self): + return ('contests-contest', [], {'slug': self.slug}) + + def save(self, *args, **kwargs): + if not self.pk and not self.creation_date: + self.creation_date = datetime.datetime.now() + + super(Contest, self).save(*args, **kwargs) + + def is_active(self): + """ + Returns True if the contest is still active. + + """ + now = datetime.datetime.now() + return self.creation_date <= now < self.end_date + + def can_enter(self): + """ + Returns True if the contest is still active and does not have a winner. + + """ + return not self.winner and self.is_active() + + def pick_winner(self): + """ + This function randomly picks a winner from all the contestants. + + """ + user_ids = self.contestants.values_list('id', flat=True) + winner_id = random.choice(user_ids) + self.winner = User.objects.get(id=winner_id) + self.win_date = datetime.datetime.now() + + def ogp_tags(self): + """ + Returns a dict of Open Graph Protocol meta tags. + + """ + return { + 'og:title': self.title, + 'og:type': 'article', + 'og:url': self.get_absolute_url(), + 'og:description': self.meta_description, + } diff -r c525f3e0b5d0 -r ee87ea74d46b contests/static/js/contests/contests.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/static/js/contests/contests.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,30 @@ +$(function() { + var $button = $('#contest-button'); + $button.click(function() { + var buttonLabel = $button.text(); + $button.attr('disabled', 'disabled').val('Please wait...'); + + $.ajax({ + url: '/contests/enter/', + type: 'POST', + data: { + contest_id : contest_id + }, + dataType: 'json', + success: function (data, textStatus) { + var classname = data.entered ? 'success' : 'info'; + var $p = $('#contest-entry'); + $p.hide(); + $p.addClass(classname); + $p.html(data.msg); + $p.fadeIn(3000); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. 
' + xhr.statusText + ' - ' + + xhr.responseText); + $button.removeAttr('disabled').text(buttonLabel); + } + }); + return false; + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b contests/static/js/contests/contests_admin.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/static/js/contests/contests_admin.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,3 @@ +django.jQuery(document).ready(function() { + django.jQuery('#id_meta_description').addClass('mceNoEditor'); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b contests/tests/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/tests/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,2 @@ +from model_tests import * +from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b contests/tests/model_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/tests/model_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,166 @@ +""" +Model tests for the contests application. + +""" +import datetime + +from django.test import TestCase +from django.contrib.auth.models import User + +from contests.models import Contest + + +class ContestTestCase(TestCase): + + def test_creation_date(self): + + c = Contest(title='test', + slug='test', + description='test', + is_public=False, + end_date=datetime.datetime.now() + datetime.timedelta(days=30)) + + c.save() + + self.assertTrue(c.creation_date) + self.assertTrue(datetime.datetime.now() - c.creation_date < + datetime.timedelta(seconds=1)) + + def test_is_active(self): + + now = datetime.datetime.now() + start = now + datetime.timedelta(days=7) + end = start + datetime.timedelta(days=30) + + c = Contest(title='test', + slug='test', + description='test', + is_public=False, + creation_date=start, + end_date=end) + + self.failIf(c.is_active()) + + start = now - datetime.timedelta(days=7) + end = start + datetime.timedelta(days=30) + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end) + + self.assertTrue(c.is_active()) + + start = now - datetime.timedelta(days=7) + end = start - datetime.timedelta(days=3) + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end) + + self.failIf(c.is_active()) + + def test_can_enter(self): + + now = datetime.datetime.now() + start = now + datetime.timedelta(days=7) + end = start + datetime.timedelta(days=30) + + c = Contest(title='test', + slug='test', + description='test', + is_public=False, + creation_date=start, + end_date=end) + + self.failIf(c.can_enter()) + + start = now - datetime.timedelta(days=7) + end = start + datetime.timedelta(days=30) + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end) + + self.assertTrue(c.can_enter()) + + start = now - datetime.timedelta(days=7) + end = start - datetime.timedelta(days=3) + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end) + + self.failIf(c.can_enter()) + + start = now - datetime.timedelta(days=7) + end = start + datetime.timedelta(days=30) + + user = User.objects.create_user('test_user', '', 'password') + user.save() + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end, + winner=user, + win_date=now) + + self.failIf(c.can_enter()) + + start = now - datetime.timedelta(days=7) + end = start - datetime.timedelta(days=3) + + c = Contest(title='test', + 
slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end, + winner=user, + win_date=end + datetime.timedelta(days=1)) + + self.failIf(c.can_enter()) + + def test_pick_winner(self): + + now = datetime.datetime.now() + start = now - datetime.timedelta(days=7) + end = start - datetime.timedelta(days=3) + + c = Contest(title='test', + slug='test', + description='test', + is_public=False, + creation_date=start, + end_date=end) + c.save() + + user1 = User.objects.create_user('test_user1', '', 'password') + user1.save() + user2 = User.objects.create_user('test_user2', '', 'password') + user2.save() + user3 = User.objects.create_user('test_user3', '', 'password') + user3.save() + + c.contestants.add(user1, user2, user3) + + c.pick_winner() + + self.assertTrue(datetime.datetime.now() - c.win_date < + datetime.timedelta(seconds=1)) + self.assertTrue(c.winner.id in [user1.id, user2.id, user3.id]) + diff -r c525f3e0b5d0 -r ee87ea74d46b contests/tests/view_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/contests/tests/view_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,123 @@ +""" +View tests for the contests application. + +""" +import datetime +from django.test import TestCase +from django.contrib.auth.models import User +from django.core.urlresolvers import reverse +from django.utils import simplejson + +from contests.models import Contest + + +class NoConstestsTestCase(TestCase): + + def test_no_contests(self): + response = self.client.get(reverse('contests-index')) + self.assertEqual(response.status_code, 200) + + url = reverse('contests-contest', kwargs={'slug': 'test'}) + response = self.client.get(url) + self.assertEqual(response.status_code, 404) + + +class ConstestsTestCase(TestCase): + + def setUp(self): + now = datetime.datetime.now() + start = now - datetime.timedelta(days=7) + end = start - datetime.timedelta(days=3) + + user = User.objects.create_user('test_user', '', 'password') + user.save() + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end, + winner=user, + win_date=end + datetime.timedelta(days=1)) + c.save() + self.contest_id = c.id + + def test_contests(self): + response = self.client.get(reverse('contests-index')) + self.assertEqual(response.status_code, 200) + + url = reverse('contests-contest', kwargs={'slug': 'test'}) + response = self.client.get(url) + self.assertEqual(response.status_code, 200) + + +class ContestEntryTestCase(TestCase): + + def setUp(self): + self.username = 'test_user' + self.pw = 'password' + self.user = User.objects.create_user(self.username, '', self.pw) + self.user.save() + self.assertTrue(self.client.login(username=self.username, + password=self.pw)) + + now = datetime.datetime.now() + start = now - datetime.timedelta(days=7) + end = now + datetime.timedelta(days=3) + + c = Contest(title='test', + slug='test', + description='test', + is_public=True, + creation_date=start, + end_date=end) + c.save() + self.contest_id = c.id + + def test_entry_toggle(self): + response = self.client.post(reverse('contests-enter'), + {'contest_id': self.contest_id}, + HTTP_X_REQUESTED_WITH='XMLHttpRequest') + self.assertEqual(response.status_code, 200) + + json = simplejson.loads(response.content) + self.assertTrue(json['entered']) + + contest = Contest.objects.get(pk=self.contest_id) + self.assertTrue(self.user in contest.contestants.all()) + + response = self.client.post(reverse('contests-enter'), + {'contest_id': self.contest_id}, + 
HTTP_X_REQUESTED_WITH='XMLHttpRequest')
+        self.assertEqual(response.status_code, 200)
+
+        json = simplejson.loads(response.content)
+        self.failIf(json['entered'])
+
+        contest = Contest.objects.get(pk=self.contest_id)
+        self.failIf(self.user in contest.contestants.all())
+
+
+class NoPublicConstestsTestCase(TestCase):
+
+    def setUp(self):
+        now = datetime.datetime.now()
+        start = now - datetime.timedelta(days=7)
+        end = start - datetime.timedelta(days=3)
+
+        c = Contest(title='test',
+            slug='test',
+            description='test',
+            is_public=False,
+            creation_date=start,
+            end_date=end)
+        c.save()
+
+    def test_contests(self):
+        response = self.client.get(reverse('contests-index'))
+        self.assertEqual(response.status_code, 200)
+
+        url = reverse('contests-contest', kwargs={'slug': 'test'})
+        response = self.client.get(url)
+        self.assertEqual(response.status_code, 404)
diff -r c525f3e0b5d0 -r ee87ea74d46b contests/urls.py
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/contests/urls.py	Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,27 @@
+"""
+Url patterns for the contests application.
+
+"""
+from django.conf.urls import patterns, url
+from django.views.generic import DetailView, ListView
+
+from contests.models import Contest
+
+
+urlpatterns = patterns('',
+    url(r'^$',
+        ListView.as_view(
+            context_object_name='contests',
+            queryset=Contest.public_objects.select_related('winner')),
+        name='contests-index'),
+
+    url(r'^enter/$',
+        'contests.views.enter',
+        name='contests-enter'),
+
+    url(r'^c/(?P<slug>[\w-]+)/$',
+        DetailView.as_view(
+            context_object_name='contest',
+            queryset=Contest.public_objects.all().select_related('winner')),
+        name='contests-contest'),
+)
diff -r c525f3e0b5d0 -r ee87ea74d46b contests/views.py
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/contests/views.py	Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,46 @@
+"""
+Views for the contests application.
+
+"""
+from django.http import (HttpResponse, HttpResponseForbidden,
+    HttpResponseBadRequest)
+from django.shortcuts import get_object_or_404
+from django.utils import simplejson
+from django.views.decorators.http import require_POST
+
+from contests.models import Contest
+
+
+@require_POST
+def enter(request):
+    """
+    This view is an AJAX view that is used to enter or withdraw a user from a
+    given contest. This function toggles the user's entered state in the
+    contest.
+
+    """
+    if not request.user.is_authenticated():
+        return HttpResponseForbidden("Please login first")
+
+    contest_id = request.POST.get('contest_id')
+    if not contest_id:
+        return HttpResponseBadRequest("Missing contest_id")
+
+    contest = get_object_or_404(Contest, pk=contest_id)
+    if not contest.can_enter():
+        return HttpResponseForbidden("Contest is over")
+
+    # Toggle the user's state in the contest
+
+    result = {}
+    if request.user in contest.contestants.all():
+        contest.contestants.remove(request.user)
+        result['entered'] = False
+        result['msg'] = 'You have been withdrawn from this contest.'
+    else:
+        contest.contestants.add(request.user)
+        result['entered'] = True
+        result['msg'] = 'You have been entered into this contest!'
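+    # contests.js reads 'entered' to choose the success/info styling and
+    # displays 'msg' to the user.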
+ + json = simplejson.dumps(result) + return HttpResponse(json, content_type='application/json') diff -r c525f3e0b5d0 -r ee87ea74d46b core/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,12 @@ +from django.contrib import admin +from django.contrib.flatpages.models import FlatPage +from django.contrib.flatpages.admin import FlatPageAdmin as FlatPageAdminOld +from django.conf import settings + +class FlatPageAdmin(FlatPageAdminOld): + class Media: + js = settings.GPP_THIRD_PARTY_JS['tiny_mce'] + +# We have to unregister it, and then reregister +admin.site.unregister(FlatPage) +admin.site.register(FlatPage, FlatPageAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b core/fixtures/flatpages.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/fixtures/flatpages.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,62 @@ +[ + { + "pk": 1, + "model": "flatpages.flatpage", + "fields": { + "registration_required": false, + "title": "About SurfGuitar101.com", + "url": "/about/", + "template_name": "", + "sites": [ + 1 + ], + "content": "

SurfGuitar101.com is the premier place on the web for friends and fans of the world-wide phenomenon known as surf music. Surf music was created in the early 1960's in Southern California by such bands as The Belairs, Dick Dale & His Deltones, and The Chantays, and popularized further by bands like The Ventures, The Astronauts, The Pyramids, & The Lively Ones. Surf music was all but forgotten when The Beatles and the British Invasion landed in America in the mid to late 1960's. In the late 70's and early 1980's a revival began when bands like Jon & The Nightriders, The Surf Raiders, and The Halibuts heard the call of the surf and reintroduced it to hungry audiences. This revival continues today and has spread world-wide. Today you can find surf bands not only in California, but all across America, Europe, Australia, Central and South America, and Japan.

\r\n

Join us in our forums to discuss this great form of popular music. Discover great bands old and new. Check out our podcasts as we highlight the classic surf bands and the bands of today. Meet new friends and learn about the next surf show in your town. Exchange tips on playing and performing surf music and even starting your own band!

\r\n

Thanks for being part of the greatest online community dedicated to surf music!

\r\n

A Short History of SurfGuitar101.com

\r\n

This site started as a Yahoo Group in late October, 2001. There were several other surf music Yahoo groups at the time, so we started our focus on the musician aspect of playing surf music (hence the \"guitar 101\"). After a short time we dropped that angle and fully embraced all aspects of surf music.

\r\n

After seeing The Surf Coasters (Japan) on their first US tour in the summer of 2004, we needed a place to host our many photos and videos. The domain name surfguitar101.com was registered, and a simple static website was created to host media files as a supplement to the Yahoo group. 

\r\n

Cramped by the confines of the Yahoo Group, in February of 2006 we launched an interactive version of the website, complete with our now famous forums. This format was kept until February, 2011 when the website software was rewritten and a new look was designed.

\r\n

The SG101 community held its first annual convention weekend in 2008 in Southern California, a tradition that continues today. Every year our members get together for a surf music packed weekend, and each year has been bigger and better than the last. In 2010, Germany's The Space Rangers and Italy's (via Antigua) Wadadli Riders were the first non-US bands to play at the convention. Fans of surf music get to see, hear, and mingle with musicians from the original 60's bands as well as the up and coming bands of today.

\r\n

Surf's Up!

", + "enable_comments": false + } + }, + { + "pk": 4, + "model": "flatpages.flatpage", + "fields": { + "registration_required": false, + "title": "Colophon", + "url": "/colophon/", + "template_name": "", + "sites": [ + 1 + ], + "content": "

SurfGuitar101.com was created by Brian Neal. The server-side code is written in the Python programming language using the awesome Django Web framework. Client-side coding was done in Javascript, making heavy use of the jQuery and jQuery UI libraries.

\r\n

The site design was created by Ken Dushane of the band The Crashmatics. Various icons and graphics were contributed by Ariel (DreadInBabylon), Ferenc Dobronyi, and Joseph Koch. Additional icons courtesy of FamFamFam.

\r\n

The following 3rd party libraries were leveraged in the construction of this site: MySQLdb, python-markdown, PIL, pytz, django-tagging, django-elsewhere, gdata-python-client, python-memcached, html5lib, tinymce, markItUp!, Haystack, xapian-haystack, Blueprint, jQuery Cycle, JEditable, & repoze.timeago.

\r\n

The site runs on an infrastructure powered by many open-source tools: the Apache server with mod_wsgi, a MySQL database, the Xapian search engine library, and memcached. The server is running Ubuntu, an operating system based upon the Debian GNU / Linux distribution.

\r\n

Special thanks to Abraham Aguilar and Brian Fady for providing useful feedback and testing.

", + "enable_comments": false + } + }, + { + "pk": 3, + "model": "flatpages.flatpage", + "fields": { + "registration_required": false, + "title": "SurfGuitar101.com Privacy Policy", + "url": "/policy/privacy/", + "template_name": "", + "sites": [ + 1 + ], + "content": "

SurfGuitar101.com is committed to ensuring the privacy of its readers and registered members and wants you to fully understand our terms and conditions. This privacy statement describes how personal and anonymous information is collected and managed and how you can request changes to any sharing of this information that may occur.

\r\n

Statistical Reports

\r\n

SurfGuitar101.com's servers automatically recognize a visitor's IP address and domain name. These items do not reveal any personal information about the visitor. The information is used solely to compile statistics that enable us to examine page impression levels and numbers of unique users visiting our Web sites. This information helps us to understand the areas of our sites that people visit in order to deliver more effective content.

\r\n

Cookies

\r\n

Like most other Web sites, SurfGuitar101.com uses cookies. Cookies are small data files that some Web sites write to your hard drive when you visit them. A cookie file can contain information such as a user ID that the site uses to track the pages you've visited. Cookies do not tell us who you are unless you've specifically given us personally identifiable information. A cookie can't read data off your hard drive or read cookie files created by other sites.

SurfGuitar101.com uses cookies to allow automatic logins to improve your experience with our sites. For example, we may use a cookie to identify our site members so they don't have to re-enter a user id and password when they sign-in.  Cookies can also be used to help us to better understand how visitors interact with our sites leading to the delivery of more relevant content. Cookies may be created directly by our sites for these purposes, or by third-party companies operating on our behalf. If you choose to become a member of SurfGuitar101.com, you must have cookies enabled to access the member related pages (i.e. Discussion Boards and Member Profile pages).

Most web browsers automatically accept cookies but allow you to modify security settings so you can approve or reject cookies on a case-by-case basis.

\r\n

Pixel Tags

\r\n

SurfGuitar101.com does not currently use pixel tags, also known as beacons, web bugs or clear gifs.

\r\n

Online Ad Serving

\r\n

SurfGuitar101.com does not currently use third-party advertising companies to serve advertisements.

\r\n

Newsletters / Mailing Lists

\r\n

Through the registration process for SurfGuitar101.com, we request some personal information such as your e-mail address, company information, your name, job title, etc. We will never give your personal information to any third party vendor without your prior consent. We currently do not make our email and postal lists available to any third-party.

\r\n

SurfGuitar101.com Email Announcements

\r\n

At this time we do not send mass e-mails to make site-wide announcements.

\r\n

Necessary Disclosure

\r\n

The necessary disclosure of any of the above information to third parties will be governed by the following principles:

\r\n
  1. Where SurfGuitar101.com is required to do so by law and any order of the court.
  2. Where it is necessary to identify anyone who may be violating the rights of others or the law in general.
  3. Where SurfGuitar101.com intends to co-operate with the investigation of any alleged unlawful activities without being required to by virtue of any court order or other legal requirement.
  4. Where it is necessary to protect the rights of SurfGuitar101.com.
\r\n

Security

\r\n

We use all reasonable precautions to securely maintain all information given to us by our registered members and we are not responsible for any breach of the reasonable security measures installed to protect the said information. We are not responsible for the privacy policies of any site linked to, or from, SurfGuitar101.com.

\r\n

Opt Out Policy

\r\n

SurfGuitar101.com gives users options whenever necessary, and practical. Such choices include: Opting not to receive our electronic messages, opting not to provide certain optional personal information when registering for an account.

\r\n

Transfer of Information

\r\n

SurfGuitar101.com reserves the right to transfer any information accumulated as described above in the event of the sale of part or all of SurfGuitar101.com assets and/or stock. By visiting our Web sites and by registering you consent to the collection and use of information in the manner herein described.

\r\n

Privacy Policy Changes

\r\n

This Privacy Policy may be modified from time to time. Any modifications to our Privacy Policy will be reflected on this page. If there is a significant change, we will indicate it on our sites and provide a link to the new policy.

", + "enable_comments": false + } + }, + { + "pk": 2, + "model": "flatpages.flatpage", + "fields": { + "registration_required": false, + "title": "SurfGuitar101.com Terms of Service", + "url": "/policy/tos/", + "template_name": "", + "sites": [ + 1 + ], + "content": "
\r\n

Your use of our Internet sites is subject to these Terms of Service (\"Terms\"). We may modify these Terms at any time without notice to you by posting revised Terms on our sites. Your use of our sites constitutes your binding acceptance of these Terms, including any modifications that we make.

\r\n

Content on Our Sites

\r\n

Our sites include a combination of content that we create and that our users create. You are solely responsible for all materials, whether publicly posted or privately transmitted, that you upload, post, email, transmit or otherwise make available on our sites (\"Your Content\"). You certify that you own all intellectual property rights in Your Content. You hereby grant us, our affiliates and our partners a worldwide, irrevocable, royalty-free, nonexclusive, sublicensable license to use, reproduce, create derivative works of, distribute, publicly perform, publicly display, transfer, transmit, distribute and publish Your Content and subsequent versions of Your Content for the purposes of (i) displaying Your Content on our sites, (ii) distributing Your Content, either electronically or via other media, to users seeking to download or otherwise acquire it, and/or (iii) storing Your Content in a remote database accessible by end users. This license shall apply to the distribution and the storage of Your Content in any form, medium, or technology now known or later developed.

\r\n

Your Conduct on Our Sites

\r\n

You agree not to post or transmit material that is knowingly false and/or defamatory, misleading, inaccurate, abusive, vulgar, hateful, harassing, obscene, profane, sexually oriented, threatening or invasive of a person's privacy; that otherwise violates any law; or that encourages conduct constituting a criminal offense.

\r\n

User Agreement for SurfGuitar101.com Forums

\r\n

This message forum, and other user contributed/comment areas (\"Forums\") are provided as a service to members of our community. By using or participating on the Forums, you agree to this User Agreement including but not limited to the Rules of Conduct and the Terms of Service stated below. For purposes of this agreement, \"User\" refers to any individual posting on or otherwise using the Forums and SG101 refers to the owners and staff of SurfGuitar101.com and their authorized representatives.

\r\n

SG101 reserves the right to change the Rules of Conduct, Terms of Service and all other parts of this User Agreement at its sole discretion and without notice.

\r\n

As a standard operating procedure, SG101 does not enter into correspondence, discussions or other communication, either public or private, about SG101 policies, individual moderators, enforcement or application of the User Agreement, bans or other sanctions, etc.

\r\n

RULES OF CONDUCT

\r\n

User agrees not to post material that is knowingly false and/or defamatory, misleading, inaccurate, abusive, vulgar, hateful, harassing, obscene, profane, sexually oriented, threatening, invasive of a person's privacy, that otherwise violates any law, or that encourages conduct constituting a criminal offense.

\r\n

User agrees not to post any material that is protected by copyright, trademark or other proprietary right without the express permission of the owner(s) of said copyright, trademark or other proprietary right.

\r\n

User agrees not to use nicknames that might be deemed abusive, vulgar, hateful, harassing, obscene, profane, sexually oriented, threatening, invasive of a person's privacy, or otherwise inappropriate. User agrees not to use nicknames that might mislead other Users. This includes but is not limited to using nicknames that impersonate developers, staff, or other Users, or other individuals outside of SG101.

\r\n

TERMS OF SERVICE

\r\n

User acknowledges and agrees that use of the SG101 is a privilege, not a right, and that SG101 has the right, at its sole discretion, to revoke this privilege at any time without notice or reason. User agrees that this Agreement in its entirety applies to both public and private messages.

\r\n

The goal of the Forums is to foster communication and the interchange of ideas within the User community. User agrees and acknowledges that any posts, nicknames or other material deemed offensive, harassing, baiting or otherwise inappropriate may be removed at the sole discretion of SG101.

\r\n

User authorizes SG101 to make use of any original stories, concepts, ideas, drawings, photographs, opinions and other creative materials posted on the Forums without compensation or other recourse. User also agrees to indemnify and hold harmless SG101 and our agents with respect to any claims based upon or arising from the transmission and/or content of your message(s).

\r\n

SG101 has the right but not the obligation to monitor and/or moderate the Forums, and offers no assurances in this regard.

\r\n

SG101 is not responsible for messages posted on the Forums or the content therein. We do not vouch for or warrant the accuracy, completeness or usefulness of any message. Each message expresses the views of its originating User, not necessarily those of SG101. Unless expressly stated otherwise by a senior SG101 representative, this includes messages posted by SG101 personnel, agents, delegates, representatives et al.

\r\n

Any User who feels that a posted message is objectionable is encouraged to contact us. We have the ability to remove messages and we will make every effort to do so within a reasonable time if we determine that removal is necessary. This is a manual process, however, so please realize that we may not be able to act immediately. Removal of messages is at the sole discretion of SG101.

\r\n

The appropriate individual to contact is usually the editor of the site associated with the board where the message in question is to be found. As a standard operating procedure, SG101 does not enter into discussions, either public or private, about Forum policies, individual moderators, bans or other sanctions, etc.

\r\n

SG101 reserves the right to reveal the identity of and/or whatever information we know about any User in the event of a complaint or legal action arising from any message posted by said User.

\r\n

Advertisements, chain letters, pyramid schemes and other commercial solicitations are inappropriate on the Forums.

\r\n

SG101 does not permit children under the age of 13 to become members, post home pages or web sites on our service.

\r\n

SG101 is not responsible for the content posted by SG101 members or visitors on any area of our site including without limitation. The opinions and views expressed by SG101's members or visitors do not necessarily represent those of SG101 and SG101 does not verify, endorse, or vouch for the content of such opinions or views. Further, SG101 is not responsible for the delivery or quality of any goods or services sold or advertised through or on SG101 members' page(s). If you believe that any of the content posted by our members or visitors violates your proprietary rights, including copyrights, please contact us.

\r\n

You are solely and fully responsible for any content that you post on any area of our site. We do not regularly review the contents of materials posted by our members or other visitors to our site. We strictly prohibit the posting of the following types of content on all areas of our sites:

\r\n
  • nudity, pornography, and sexual material of a lewd, lecherous or obscene nature and intent or that violates local, state and national laws.
  • any material that violates or infringes in any way upon the proprietary rights of others, including, without limitation, copyright or trademark rights; this includes \"WAREZ\" (copyrighted software that is distributed illegally), \"mp3\" files of copyrighted music, copyrighted photographs, text, video or artwork. If you don't own the copyright or have express authorization and documented permission to use it, don't put it on SG101 (if you do have express permission you must say so clearly). SG101 will terminate the memberships of, and remove the pages of, repeat infringers.
  • any material that is threatening, abusive, harassing, defamatory, invasive of privacy or publicity rights, vulgar, obscene, profane, indecent, or otherwise objectionable; including posting other peoples' private information.
  • content that promotes, encourages, or provides instructional information about illegal activities - specifically hacking, cracking, or phreaking.
  • any software, information, or other material that contains a virus, \"Trojan Horse\", \"worm\" corrupted data, or any other harmful or damaging component;
  • hate propaganda or hate mongering, swearing, or fraudulent material or activity;
\r\n

By submitting your data to SG101, you represent that the data complies with SG101's Terms of Service. If any third party brings a claim, lawsuit or other proceeding against SG101 based on your conduct or use of SG101 services, you agree to compensate SG101 (including its officers, directors, employees and agents) for any and all losses, liabilities, damages or expenses, including attorney's fees, incurred by SG101 in connection with any such claim, lawsuit or proceeding.

\r\n

SG101 is the final arbiter of what IS and IS NOT allowed on our site. Further, SG101 reserves the right to modify or remove anything submitted to SG101, and to cancel any membership, at any time for any reason without prior notice. SG101 is not obliged to maintain backup copies of any material submitted or posted on our site. Actions or activities that may cause termination of your membership and/or removal of your page(s) include, but are not limited to:

\r\n
  • posting or providing links to any content which violates our Terms of Service;
  • conducting or providing links to any raffle, contest, or game which violates any local, state or national laws;
  • using in the registration of your SG101 membership an email account that is not your own or that is or becomes inactive;
  • violating the SG101 Terms of Service. Please read and familiarize yourself with the SG101 Terms of Service.
  • sending unsolicited email using an SG101 address;
  • reproducing, distributing, republishing or retransmitting material posted by other SG101 members without the prior permission of such members.
\r\n

We reserve the right to monitor, and to investigate any complaints regarding, any content of SG101 members' pages and message-board postings, and to take appropriate action if SG101 finds violations of these Terms of Service. In the case of any such complaint, SG101 reserves the right to remove the content complained of while the SG101 member and the complaining party attempt to resolve their dispute. This could result in your post(s) being removed from SG101 for as long as it takes to resolve the dispute.

\r\n

You grant to SG101 and its affiliates a royalty-free, perpetual, irrevocable, nonexclusive, worldwide, unrestricted license to use, copy, modify, transmit, distribute, and publicly perform or display the submitted pages or other content for the purposes of displaying such information on SG101's sites and for the promotion and marketing of SG101's services.

\r\n

MISC.

\r\n

SG101 makes no guarantee of availability of service and reserves the right to change, withdraw, suspend, or discontinue any functionality or feature of the SG101 service. IN NO EVENT WILL SG101 BE LIABLE FOR ANY DAMAGES, INCLUDING, WITHOUT LIMITATION, DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL, OR PUNITIVE DAMAGES ARISING OUT OF THE USE OF OR INABILITY TO USE SG101'S SERVICES OR ANY CONTENT THEREON FOR ANY REASON INCLUDING, WITHOUT LIMITATION, SG101'S REMOVAL OR DELETION OF ANY MATERIALS OR RECORDS SUBMITTED OR POSTED ON SG101'S SITE FOR ANY REASON. THIS DISCLAIMER APPLIES, WITHOUT LIMITATION, TO ANY DAMAGES OR INJURY, WHETHER FOR BREACH OF CONTRACT, TORT, OR OTHERWISE, CAUSED; ANY FAILURE OF PERFORMANCE; ERROR; OMISSION; INTERRUPTION; DELETION; DEFECT; DELAY IN OPERATION OR TRANSMISSION; COMPUTER VIRUS; FILE CORRUPTION; COMMUNICATION-LINE FAILURE; NETWORK OR SYSTEM OUTAGE; OR THEFT, DESTRUCTION, UNAUTHORIZED ACCESS TO, ALTERATION OF, OR USE OF ANY RECORD.

\r\n

SG101 reserves the right to change or amend these Terms of Service at any time without prior notice. By registering and/or submitting any content, including without limitation, message-board postings, you signify your agreement to these Terms of Service.

\r\n
", + "enable_comments": false + } + } +] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b core/functions.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/functions.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,117 @@ +"""This file houses various core utility functions for GPP""" +import datetime +import re +import logging + +from django.contrib.sites.models import Site +from django.conf import settings +import django.core.mail + +import core.tasks + + +def send_mail(subject, message, from_email, recipient_list, defer=True, **kwargs): + """ + The main send email function. Use this function to send email from the + site. All applications should use this function instead of calling + Django's directly. + If defer is True, the email will be sent to a Celery task to actually send + the email. Otherwise it is sent on the caller's thread. In any event, the + email will be logged at the DEBUG level. + + """ + # Guard against empty email addresses + recipient_list = [dest for dest in recipient_list if dest] + if not recipient_list: + logging.warning("Empty recipient_list in send_mail") + return + + logging.debug('EMAIL:\nFrom: %s\nTo: %s\nSubject: %s\nMessage:\n%s', + from_email, str(recipient_list), subject, message) + + if defer: + core.tasks.send_mail.delay(subject, message, from_email, recipient_list, + **kwargs) + else: + django.core.mail.send_mail(subject, message, from_email, recipient_list, + **kwargs) + + +def email_admins(subject, message): + """Emails the site admins. Goes through the site send_mail function.""" + site = Site.objects.get_current() + subject = '[%s] %s' % (site.name, subject) + send_mail(subject, + message, + '%s@%s' % (settings.GPP_NO_REPLY_EMAIL, site.domain), + [mail_tuple[1] for mail_tuple in settings.ADMINS]) + + +def email_managers(subject, message): + """Emails the site managers. Goes through the site send_mail function.""" + site = Site.objects.get_current() + subject = '[%s] %s' % (site.name, subject) + send_mail(subject, + msg, + '%s@%s' % (settings.GPP_NO_REPLY_EMAIL, site.domain), + [mail_tuple[1] for mail_tuple in settings.MANAGERS]) + + +def get_full_name(user): + """Returns the user's full name if available, otherwise falls back + to the username.""" + full_name = user.get_full_name() + if full_name: + return full_name + return user.username + + +BASE_YEAR = 2010 + +def copyright_str(): + curr_year = datetime.datetime.now().year + if curr_year == BASE_YEAR: + year_range = str(BASE_YEAR) + else: + year_range = "%d - %d" % (BASE_YEAR, curr_year) + + return 'Copyright (C) %s, SurfGuitar101.com' % year_range + + +IP_PAT = re.compile('(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})') + +def get_ip(request): + """Returns the IP from the request or None if it cannot be retrieved.""" + ip = request.META.get('HTTP_X_FORWARDED_FOR', + request.META.get('REMOTE_ADDR')) + + if ip: + match = IP_PAT.match(ip) + ip = match.group(1) if match else None + + return ip + + +def get_page(qdict): + """Attempts to retrieve the value for "page" from the given query dict and + return it as an integer. If the key cannot be found or converted to an + integer, 1 is returned. + """ + n = qdict.get('page', 1) + try: + n = int(n) + except ValueError: + n = 1 + return n + + +def quote_message(who, message): + """ + Builds a message reply by quoting the existing message in a + typical email-like fashion. The quoting is compatible with Markdown. 
+ """ + msg = "> %s" % message.replace('\n', '\n> ') + if msg.endswith('\n> '): + msg = msg[:-2] + + return "*%s wrote:*\n\n%s\n\n" % (who, msg) diff -r c525f3e0b5d0 -r ee87ea74d46b core/html.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/html.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,28 @@ +import html5lib +from html5lib import sanitizer, treebuilders, treewalkers, serializer + +def sanitizer_factory(*args, **kwargs): + san = sanitizer.HTMLSanitizer(*args, **kwargs) + # This isn't available yet + # san.strip_tokens = True + return san + +def clean_html(buf): + """Cleans HTML of dangerous tags and content.""" + buf = buf.strip() + if not buf: + return buf + + p = html5lib.HTMLParser(tree=treebuilders.getTreeBuilder("dom"), + tokenizer=sanitizer_factory) + dom_tree = p.parseFragment(buf) + + walker = treewalkers.getTreeWalker("dom") + stream = walker(dom_tree) + + s = serializer.htmlserializer.HTMLSerializer( + omit_optional_tags=False, + quote_attr_values=True) + return s.render(stream) + +# vim: ts=4 sw=4 diff -r c525f3e0b5d0 -r ee87ea74d46b core/image.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/image.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,43 @@ +""" +This file contains common utility functions for manipulating images for +the rest of the applications in the project. +""" +from PIL import ImageFile +from PIL import Image + + +def parse_image(file): + """ + Returns a PIL Image from the supplied Django file object. + Throws IOError if the file does not parse as an image file or some other + I/O error occurred. + + """ + parser = ImageFile.Parser() + for chunk in file.chunks(): + parser.feed(chunk) + image = parser.close() + return image + + +def downscale_image_square(image, size): + """ + Scale an image to the square dimensions given by size (in pixels). + The new image is returned. + If the image is already smaller than (size, size) then no scaling + is performed and the image is returned unchanged. + + """ + # don't upscale + if (size, size) >= image.size: + return image + + (w, h) = image.size + if w > h: + diff = (w - h) / 2 + image = image.crop((diff, 0, w - diff, h)) + elif h > w: + diff = (h - w) / 2 + image = image.crop((0, diff, w, h - diff)) + image = image.resize((size, size), Image.ANTIALIAS) + return image diff -r c525f3e0b5d0 -r ee87ea74d46b core/management/commands/max_users.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/management/commands/max_users.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,17 @@ +""" +max_users is a custom manage.py command. +It is intended to be called from a cron job to calculate the maximum +number of users online statistic. +""" +import datetime + +from django.core.management.base import NoArgsCommand + +from core.whos_online import max_users + + +class Command(NoArgsCommand): + help = "Run periodically to compute the max users online statistic." + + def handle_noargs(self, **options): + max_users() diff -r c525f3e0b5d0 -r ee87ea74d46b core/markup.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/markup.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,57 @@ +""" +Markup related utitlities useful for the entire project. +""" +import markdown as _markdown +from django.utils.encoding import force_unicode + +from smiley import SmilifyMarkdown + +class Markdown(object): + """ + This is a thin wrapper around the Markdown class which deals with the + differences in Markdown versions on the production and development server. 
+ This code was inspired by the code in + django/contrib/markup/templatetags/markup.py. + Currently, we only have to worry about Markdown 1.6b and 2.0. + """ + def __init__(self, safe_mode='escape'): + # Unicode support only in markdown v1.7 or above. Version_info + # exists only in markdown v1.6.2rc-2 or above. + self.unicode_support = getattr(_markdown, "version_info", None) >= (1, 7) + self.md = _markdown.Markdown(safe_mode=safe_mode, + extensions=['urlize', 'nl2br', 'del']) + + def convert(self, s): + if self.unicode_support: + return self.md.convert(force_unicode(s)) + else: + return force_unicode(self.md.convert(s)) + + +def markdown(s): + """ + A convenience function for one-off markdown jobs. + """ + md = Markdown() + return md.convert(s) + + +class SiteMarkup(object): + """ + This class provides site markup by combining markdown and + our own smiley markup. + """ + def __init__(self): + self.md = Markdown() + self.smiley = SmilifyMarkdown() + + def convert(self, s): + return self.md.convert(self.smiley.convert(s)) + + +def site_markup(s): + """ + Convenience function for one-off site markup jobs. + """ + sm = SiteMarkup() + return sm.convert(s) diff -r c525f3e0b5d0 -r ee87ea74d46b core/middleware.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/middleware.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,72 @@ +"""Common middleware for the entire project.""" +import datetime +import logging + +from django.db import IntegrityError +from django.contrib.auth import logout +from django.conf import settings + +from core.functions import get_ip +from core.whos_online import report_user, report_visitor + + +class InactiveUserMiddleware(object): + """ + This middleware ensures users with is_active set to False get their + session destroyed and are treated as logged out. + This middleware should come after the 'django.contrib.auth.middleware. + AuthenticationMiddleware' in settings.py. + Idea taken from: http://djangosnippets.org/snippets/1105/ + """ + + def process_view(self, request, view_func, view_args, view_kwargs): + if request.user.is_authenticated() and not request.user.is_active: + logout(request) + + +ONLINE_COOKIE = 'sg101_online' # online cookie name +ONLINE_TIMEOUT = 5 * 60 # online cookie lifetime in seconds + + +class WhosOnline(object): + """ + This middleware class keeps track of which registered users have + been seen recently, and the number of unique unregistered users. + This middleware should come after the authentication middleware, + as we count on the user attribute being attached to the request. + """ + + def process_response(self, request, response): + """ + Keep track of who is online. + """ + # Note that some requests may not have a user attribute + # as these may have been redirected in the middleware chain before + # the auth middleware got a chance to run. If this is the case, just + # bail out. We also ignore AJAX requests. + + if not hasattr(request, 'user') or request.is_ajax(): + return response + + if request.user.is_authenticated(): + if request.COOKIES.get(ONLINE_COOKIE) is None: + # report that we've seen the user + report_user(request.user.username) + + # set a cookie to expire + response.set_cookie(ONLINE_COOKIE, '1', max_age=ONLINE_TIMEOUT) + else: + if request.COOKIES.get(settings.CSRF_COOKIE_NAME) is not None: + # We have a non-authenticated user that has cookies enabled. This + # means we can track them. 
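+                # The cookie set below throttles reporting, so a given
+                # visitor IP is reported at most once per ONLINE_TIMEOUT
+                # seconds.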
+ if request.COOKIES.get(ONLINE_COOKIE) is None: + # see if we can get the IP address + ip = get_ip(request) + if ip: + # report that we've seen this visitor + report_visitor(ip) + + # set a cookie to expire + response.set_cookie(ONLINE_COOKIE, '1', max_age=ONLINE_TIMEOUT) + + return response diff -r c525f3e0b5d0 -r ee87ea74d46b core/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,24 @@ +""" +This file contains the core Models used in gpp +""" +import datetime + +from django.db import models +from django.contrib.auth.models import User + + +class Statistic(models.Model): + """ + This model keeps track of site statistics. Currently, the only statistic + is the maximum number of users online. This stat is computed by a mgmt. + command that is run on a cron job to peek at the previous two models. + """ + max_users = models.IntegerField() + max_users_date = models.DateTimeField() + max_anon_users = models.IntegerField() + max_anon_users_date = models.DateTimeField() + + def __unicode__(self): + return u'%d users on %s' % (self.max_users, + self.max_users_date.strftime('%Y-%m-%d %H:%M:%S')) + diff -r c525f3e0b5d0 -r ee87ea74d46b core/paginator.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/paginator.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,286 @@ +""" +Digg.com style paginator. +References: +http://www.djangosnippets.org/snippets/773/ +http://blog.elsdoerfer.name/2008/05/26/diggpaginator-update/ +http://blog.elsdoerfer.name/2008/03/06/yet-another-paginator-digg-style/ +""" +import math +from django.core.paginator import \ + Paginator, QuerySetPaginator, Page, InvalidPage + +__all__ = ( + 'InvalidPage', + 'ExPaginator', + 'DiggPaginator', + 'QuerySetDiggPaginator', +) + +class ExPaginator(Paginator): + """Adds a ``softlimit`` option to ``page()``. If True, querying a + page number larger than max. will not fail, but instead return the + last available page. + + This is useful when the data source can not provide an exact count + at all times (like some search engines), meaning the user could + possibly see links to invalid pages at some point which we wouldn't + want to fail as 404s. + + >>> items = range(1, 1000) + >>> paginator = ExPaginator(items, 10) + >>> paginator.page(1000) + Traceback (most recent call last): + InvalidPage: That page contains no results + >>> paginator.page(1000, softlimit=True) + + + # [bug] graceful handling of non-int args + >>> paginator.page("str") + Traceback (most recent call last): + InvalidPage: That page number is not an integer + """ + def _ensure_int(self, num, e): + # see Django #7307 + try: + return int(num) + except ValueError: + raise e + + def page(self, number, softlimit=False): + try: + return super(ExPaginator, self).page(number) + except InvalidPage, e: + number = self._ensure_int(number, e) + if number > self.num_pages and softlimit: + return self.page(self.num_pages, softlimit=False) + else: + raise e + +class DiggPaginator(ExPaginator): + """ + Based on Django's default paginator, it adds "Digg-style" page ranges + with a leading block of pages, an optional middle block, and another + block at the end of the page range. They are available as attributes + on the page: + + {# with: page = digg_paginator.page(1) #} + {% for num in page.leading_range %} ... + {% for num in page.main_range %} ... + {% for num in page.trailing_range %} ... + + Additionally, ``page_range`` contains a nun-numeric ``False`` element + for every transition between two ranges. 
+ + {% for num in page.page_range %} + {% if not num %} ... {# literally output dots #} + {% else %}{{ num }} + {% endif %} + {% endfor %} + + Additional arguments passed to the constructor allow customization of + how those bocks are constructed: + + body=5, tail=2 + + [1] 2 3 4 5 ... 91 92 + |_________| |___| + body tail + |_____| + margin + + body=5, tail=2, padding=2 + + 1 2 ... 6 7 [8] 9 10 ... 91 92 + |_| |__| + ^padding^ + |_| |__________| |___| + tail body tail + + ``margin`` is the minimum number of pages required between two ranges; if + there are less, they are combined into one. + + When ``align_left`` is set to ``True``, the paginator operates in a + special mode that always skips the right tail, e.g. does not display the + end block unless necessary. This is useful for situations in which the + exact number of items/pages is not actually known. + + # odd body length + >>> print DiggPaginator(range(1,1000), 10, body=5).page(1) + 1 2 3 4 5 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5).page(100) + 1 2 ... 96 97 98 99 100 + + # even body length + >>> print DiggPaginator(range(1,1000), 10, body=6).page(1) + 1 2 3 4 5 6 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=6).page(100) + 1 2 ... 95 96 97 98 99 100 + + # leading range and main range are combined when close; note how + # we have varying body and padding values, and their effect. + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2).page(3) + 1 2 3 4 5 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=6, padding=2, margin=2).page(4) + 1 2 3 4 5 6 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2).page(6) + 1 2 3 4 5 6 7 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2).page(7) + 1 2 ... 5 6 7 8 9 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2).page(7) + 1 2 ... 5 6 7 8 9 ... 99 100 + + # the trailing range works the same + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2, ).page(98) + 1 2 ... 96 97 98 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=6, padding=2, margin=2, ).page(97) + 1 2 ... 95 96 97 98 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2, ).page(95) + 1 2 ... 94 95 96 97 98 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2, ).page(94) + 1 2 ... 92 93 94 95 96 ... 99 100 + >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2, ).page(94) + 1 2 ... 92 93 94 95 96 ... 99 100 + + # all three ranges may be combined as well + >>> print DiggPaginator(range(1,151), 10, body=6, padding=2).page(7) + 1 2 3 4 5 6 7 8 9 ... 14 15 + >>> print DiggPaginator(range(1,151), 10, body=6, padding=2).page(8) + 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 + >>> print DiggPaginator(range(1,151), 10, body=6, padding=1).page(8) + 1 2 3 4 5 6 7 8 9 ... 14 15 + + # no leading or trailing ranges might be required if there are only + # a very small number of pages + >>> print DiggPaginator(range(1,80), 10, body=10).page(1) + 1 2 3 4 5 6 7 8 + >>> print DiggPaginator(range(1,80), 10, body=10).page(8) + 1 2 3 4 5 6 7 8 + >>> print DiggPaginator(range(1,12), 10, body=5).page(1) + 1 2 + + # test left align mode + >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(1) + 1 2 3 4 5 + >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(50) + 1 2 ... 
48 49 50 51 52 + >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(97) + 1 2 ... 95 96 97 98 99 + >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(100) + 1 2 ... 96 97 98 99 100 + + # padding: default value + >>> DiggPaginator(range(1,1000), 10, body=10).padding + 4 + + # padding: automatic reduction + >>> DiggPaginator(range(1,1000), 10, body=5).padding + 2 + >>> DiggPaginator(range(1,1000), 10, body=6).padding + 2 + + # padding: sanity check + >>> DiggPaginator(range(1,1000), 10, body=5, padding=3) + Traceback (most recent call last): + ValueError: padding too large for body (max 2) + """ + def __init__(self, *args, **kwargs): + self.body = kwargs.pop('body', 10) + self.tail = kwargs.pop('tail', 2) + self.align_left = kwargs.pop('align_left', False) + self.margin = kwargs.pop('margin', 4) # TODO: make the default relative to body? + # validate padding value + max_padding = int(math.ceil(self.body/2.0)-1) + self.padding = kwargs.pop('padding', min(4, max_padding)) + if self.padding > max_padding: + raise ValueError('padding too large for body (max %d)'%max_padding) + super(DiggPaginator, self).__init__(*args, **kwargs) + + def page(self, number, *args, **kwargs): + """Return a standard ``Page`` instance with custom, digg-specific + page ranges attached. + """ + + page = super(DiggPaginator, self).page(number, *args, **kwargs) + number = int(number) # we know this will work + + # easier access + num_pages, body, tail, padding, margin = \ + self.num_pages, self.body, self.tail, self.padding, self.margin + + # put active page in middle of main range + main_range = map(int, [ + math.floor(number-body/2.0)+1, # +1 = shift odd body to right + math.floor(number+body/2.0)]) + # adjust bounds + if main_range[0] < 1: + main_range = map(abs(main_range[0]-1).__add__, main_range) + if main_range[1] > num_pages: + main_range = map((num_pages-main_range[1]).__add__, main_range) + + # Determine leading and trailing ranges; if possible and appropriate, + # combine them with the main range, in which case the resulting main + # block might end up considerable larger than requested. While we + # can't guarantee the exact size in those cases, we can at least try + # to come as close as possible: we can reduce the other boundary to + # max padding, instead of using half the body size, which would + # otherwise be the case. If the padding is large enough, this will + # of course have no effect. + # Example: + # total pages=100, page=4, body=5, (default padding=2) + # 1 2 3 [4] 5 6 ... 99 100 + # total pages=100, page=4, body=5, padding=1 + # 1 2 3 [4] 5 ... 99 100 + # If it were not for this adjustment, both cases would result in the + # first output, regardless of the padding value. + if main_range[0] <= tail+margin: + leading = [] + main_range = [1, max(body, min(number+padding, main_range[1]))] + main_range[0] = 1 + else: + leading = range(1, tail+1) + # basically same for trailing range, but not in ``left_align`` mode + if self.align_left: + trailing = [] + else: + if main_range[1] >= num_pages-(tail+margin)+1: + trailing = [] + if not leading: + # ... but handle the special case of neither leading nor + # trailing ranges; otherwise, we would now modify the + # main range low bound, which we just set in the previous + # section, again. 
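+                    # In this combined case every page number fits in a
+                    # single run from 1 through num_pages.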
+ main_range = [1, num_pages] + else: + main_range = [min(num_pages-body+1, max(number-padding, main_range[0])), num_pages] + else: + trailing = range(num_pages-tail+1, num_pages+1) + + # finally, normalize values that are out of bound; this basically + # fixes all the things the above code screwed up in the simple case + # of few enough pages where one range would suffice. + main_range = [max(main_range[0], 1), min(main_range[1], num_pages)] + + # make the result of our calculations available as custom ranges + # on the ``Page`` instance. + page.main_range = range(main_range[0], main_range[1]+1) + page.leading_range = leading + page.trailing_range = trailing + page.page_range = reduce(lambda x, y: x+((x and y) and [False])+y, + [page.leading_range, page.main_range, page.trailing_range]) + + page.__class__ = DiggPage + return page + +class DiggPage(Page): + def __str__(self): + return " ... ".join(filter(None, [ + " ".join(map(str, self.leading_range)), + " ".join(map(str, self.main_range)), + " ".join(map(str, self.trailing_range))])) + +class QuerySetDiggPaginator(DiggPaginator, QuerySetPaginator): + pass + +if __name__ == "__main__": + import doctest + doctest.testmod() diff -r c525f3e0b5d0 -r ee87ea74d46b core/services.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/services.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,21 @@ +""" +This module provides a common way for the various apps to integrate with services +that are installed at this site. + +""" +from django.conf import settings +import redis + +# Redis connection and database settings + +REDIS_HOST = getattr(settings, 'REDIS_HOST', 'localhost') +REDIS_PORT = getattr(settings, 'REDIS_PORT', 6379) +REDIS_DB = getattr(settings, 'REDIS_DB', 0) + + +def get_redis_connection(host=REDIS_HOST, port=REDIS_PORT, db=REDIS_DB): + """ + Create and return a Redis connection using the supplied parameters. + + """ + return redis.StrictRedis(host=host, port=port, db=db) diff -r c525f3e0b5d0 -r ee87ea74d46b core/tasks.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/tasks.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,61 @@ +""" +Celery tasks for the core application. + +""" +from celery.task import task +import django.core.mail + +import core.whos_online + + +@task +def add(x, y): + """ + It is useful to have a test task laying around. This is it. + + """ + return x + y + + +@task +def send_mail(subject, message, from_email, recipient_list, **kwargs): + """ + A task to send mail via Django. + + """ + django.core.mail.send_mail(subject, message, from_email, recipient_list, + **kwargs) + + +@task +def cleanup(): + """ + A task to perform site-wide cleanup actions. + + """ + # These imports, when placed at the top of the module, caused all kinds of + # import problems when running on the production server (Python 2.5 and + # mod_wsgi). Moving them here worked around that problem. + + from django.core.management.commands.cleanup import Command as CleanupCommand + from forums.management.commands.forum_cleanup import Command as ForumCleanup + + # Execute Django's cleanup command (deletes old sessions). + + command = CleanupCommand() + command.execute() + + # Execute our forum cleanup command to delete old last visit records. + + command = ForumCleanup() + command.execute() + + +@task +def max_users(): + """ + Run the periodic task to calculate the who's online max users/visitors + statistics. 
+
+    """
+    core.whos_online.max_users()
diff -r c525f3e0b5d0 -r ee87ea74d46b core/templatetags/core_tags.py
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/core/templatetags/core_tags.py	Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,223 @@
+"""
+Miscellaneous/utility template tags.
+
+"""
+import collections
+import datetime
+import urllib
+
+from django import template
+from django.conf import settings
+from django.core.cache import cache
+from django.contrib.sites.models import Site
+
+import repoze.timeago
+
+from core.whos_online import get_users_online, get_visitors_online, get_stats
+from bio.models import UserProfile
+
+
+register = template.Library()
+
+ICON_PARAMS = {
+    True: (settings.STATIC_URL + 'icons/accept.png', 'Yes'),
+    False: (settings.STATIC_URL + 'icons/delete.png', 'No'),
+}
+
+@register.simple_tag
+def bool_icon(flag):
+    params = ICON_PARAMS[bool(flag)]
+    return u"""<img src="%s" alt="%s" title="%s" />""" % (
+        params[0], params[1], params[1])
+
+
+@register.inclusion_tag('core/comment_dialogs.html')
+def comment_dialogs():
+    return {'STATIC_URL': settings.STATIC_URL}
+
+
+@register.inclusion_tag('core/max_users_tag.html')
+def max_users():
+    """
+    Displays max users online information.
+
+    """
+    return {
+        'stats': get_stats(),
+    }
+
+@register.inclusion_tag('core/whos_online_tag.html')
+def whos_online():
+    """
+    Displays a list of who is online.
+
+    """
+    users = get_users_online()
+    users.sort(key=str.lower)
+
+    visitors = get_visitors_online()
+
+    return {
+        'num_users': len(users),
+        'users': users,
+        'num_guests': len(visitors),
+        'total': len(users) + len(visitors),
+    }
+
+
+@register.inclusion_tag('core/social_sharing_tag.html')
+def social_sharing(title, url):
+    """
+    Displays social media sharing buttons.
+
+    """
+    site = Site.objects.get_current()
+    url = _fully_qualify(url, site.domain)
+
+    return {
+        'title': title,
+        'url': url,
+    }
+
+
+def _fully_qualify(url, domain):
+    """
+    Returns a "fully qualified" URL by checking the given url.
+    If the url starts with '/' then http://domain is pre-pended
+    onto it. Otherwise the original URL is returned.
+
+    """
+    if url.startswith('/'):
+        url = "http://%s%s" % (domain, url)
+    return url
+
+
+@register.inclusion_tag('core/open_graph_meta_tag.html')
+def open_graph_meta_tags(item=None):
+    """
+    Generates Open Graph meta tags by interrogating the given item.
+    To generate tags for the home page, set item to None.
+
+    """
+    site = Site.objects.get_current()
+
+    if item:
+        props = item.ogp_tags()
+    else:
+        props = {
+            'og:title': site.name,
+            'og:type': 'website',
+            'og:url': 'http://%s' % site.domain,
+            'og:description': settings.OGP_SITE_DESCRIPTION,
+        }
+
+    props['og:site_name'] = site.name
+    props['fb:admins'] = settings.OGP_FB_ID
+
+    if 'og:image' not in props:
+        props['og:image'] = settings.OGP_DEFAULT_IMAGE
+
+    if 'og:url' in props:
+        props['og:url'] = _fully_qualify(props['og:url'], site.domain)
+
+    if 'og:image' in props:
+        props['og:image'] = _fully_qualify(props['og:image'], site.domain)
+
+    return {'props': props}
+
+
+# A somewhat ugly hack until we decide if we should be using UTC time
+# everywhere or not.
+repoze.timeago._NOW = datetime.datetime.now
+
+@register.filter(name='elapsed')
+def elapsed(timestamp):
+    """
+    This filter accepts a datetime and computes an elapsed time from "now".
+    The elapsed time is displayed as a "humanized" string.
+ Examples: + 1 minute ago + 5 minutes ago + 1 hour ago + 10 hours ago + 1 day ago + 7 days ago + + """ + return repoze.timeago.get_elapsed(timestamp) +elapsed.is_safe = True + + +class Birthday(object): + """ + A simple named tuple-type class for birthdays. + This class was created to make things easier in the template. + + """ + day = None + profiles = [] + + def __init__(self, day, profiles=None): + self.day = day + self.profiles = profiles if profiles else [] + + +@register.inclusion_tag('core/birthday_block.html') +def birthday_block(): + """ + A template tag to display all the users who have birthdays this month. + """ + today = datetime.date.today() + profiles = list(UserProfile.objects.filter(birthday__month=today.month).select_related( + 'user')) + + days = collections.defaultdict(list) + for profile in profiles: + days[profile.birthday.day].append(profile) + + birthdays = [Birthday(day, profiles) for day, profiles in days.iteritems()] + birthdays.sort(key=lambda b: b.day) + + return { + 'STATIC_URL': settings.STATIC_URL, + 'birthdays': birthdays, + 'today': today, + } + + +class EncodeParamsNode(template.Node): + """ + This is the Node class for the encode_params template tag. + This template tag retrieves the named parameters from the supplied + querydict and returns them as a urlencoded string. + + """ + def __init__(self, querydict, args): + self.querydict = template.Variable(querydict) + self.args = args + + def render(self, context): + querydict = self.querydict.resolve(context) + params = [] + for arg in self.args: + params.extend([(arg, value) for value in querydict.getlist(arg)]) + + return urllib.urlencode(params) + + +@register.tag +def encode_params(parser, token): + """ + This is the compilation function for the encode_params template tag. + This template tag retrieves the named parameters from the supplied + querydict and returns them as a urlencoded string. + + """ + bits = token.split_contents() + if len(bits) < 3: + raise template.TemplateSyntaxError("%s takes at least 2 arguments: " + "querydict arg1 [arg2 arg3 ... argN]" % bits[0]) + + querydict = bits[1] + args = [arg[1:-1] for arg in bits[2:]] + return EncodeParamsNode(querydict, args) diff -r c525f3e0b5d0 -r ee87ea74d46b core/templatetags/custom_admin_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/templatetags/custom_admin_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,50 @@ +""" +Custom template tags for the admin. +""" +from django import template +from django.db.models import Q + +from bio.models import UserProfileFlag +from comments.models import CommentFlag +from downloads.models import PendingDownload +from forums.models import FlaggedPost +from gcalendar.models import Event +from news.models import PendingStory +from weblinks.models import PendingLink, FlaggedLink +from shoutbox.models import ShoutFlag + + +register = template.Library() + + +@register.inclusion_tag('core/admin_dashboard.html') +def admin_dashboard(user): + """ + This tag is used in the admin to create a dashboard + of pending content that an admin must approve. 
+ """ + flagged_profiles = UserProfileFlag.objects.count() + flagged_comments = CommentFlag.objects.count() + new_downloads = PendingDownload.objects.count() + flagged_posts = FlaggedPost.objects.count() + event_requests = Event.objects.filter( + Q(status=Event.NEW) | + Q(status=Event.EDIT_REQ) | + Q(status=Event.DEL_REQ)).count() + new_stories = PendingStory.objects.count() + new_links = PendingLink.objects.count() + broken_links = FlaggedLink.objects.count() + flagged_shouts = ShoutFlag.objects.count() + + return { + 'user': user, + 'flagged_profiles': flagged_profiles, + 'flagged_comments': flagged_comments, + 'new_downloads': new_downloads, + 'flagged_posts': flagged_posts, + 'event_requests': event_requests, + 'new_stories': new_stories, + 'new_links': new_links, + 'broken_links': broken_links, + 'flagged_shouts': flagged_shouts, + } diff -r c525f3e0b5d0 -r ee87ea74d46b core/templatetags/script_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/templatetags/script_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,27 @@ +""" +Template tags to generate and ' % (prefix, path) + + return s diff -r c525f3e0b5d0 -r ee87ea74d46b core/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,9 @@ +""" +Urls for the core application. +""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('core.views', + url(r'^markdown_help/$', 'markdown_help', name='core-markdown_help'), + url(r'^ajax/users/$', 'ajax_users', name='core-ajax_users'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b core/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +""" +Views for the core application. These are mainly shared, common views +used by multiple applications. +""" +from django.contrib.auth.models import User +from django.http import HttpResponse +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.contrib.auth.decorators import login_required +from django.views.decorators.http import require_GET +import django.utils.simplejson as json + + +@login_required +@require_GET +def markdown_help(request): + """ + This view provides the Markdown help cheat sheet. It is expected + to be called via AJAX. + """ + return render_to_response('core/markdown_help.html') + + +def ajax_users(request): + """ + If the user is authenticated, return a JSON array of strings of usernames + whose names start with the 'q' GET parameter, limited by the 'limit' GET + parameter. Only active usernames are returned. + If the user is not authenticated, return an empty array. + """ + q = request.GET.get('q', None) + if q is None or not request.user.is_authenticated(): + return HttpResponse(json.dumps([]), content_type='application/json') + + limit = int(request.GET.get('limit', 10)) + users = User.objects.filter(is_active=True, + username__istartswith=q).values_list('username', flat=True)[:limit] + return HttpResponse(json.dumps(list(users)), content_type='application/json') diff -r c525f3e0b5d0 -r ee87ea74d46b core/whos_online.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/whos_online.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,239 @@ +""" +This module keeps track of who is online. We maintain records for both +authenticated users ("users") and non-authenticated visitors ("visitors"). 
+""" +import datetime +import logging +import time + +import redis + +from core.services import get_redis_connection +from core.models import Statistic + + +# Users and visitors each have a sorted set in a Redis database. When a user or +# visitor is seen, the respective set is updated with the score of the current +# time. Periodically we remove elements by score (time) to stale out members. + +# Redis key names: +USER_SET_KEY = "whos_online:users" +VISITOR_SET_KEY = "whos_online:visitors" + +CORE_STATS_KEY = "core:stats" + +# the period over which we collect who's online stats: +MAX_AGE = datetime.timedelta(minutes=15) + + +# Logging: we don't want a Redis malfunction to bring down the site. So we +# catch all Redis exceptions, log them, and press on. +logger = logging.getLogger(__name__) + + +def _get_connection(): + """ + Create and return a Redis connection. Returns None on failure. + """ + try: + conn = get_redis_connection() + return conn + except redis.RedisError, e: + logger.error(e) + + return None + + +def to_timestamp(dt): + """ + Turn the supplied datetime object into a UNIX timestamp integer. + + """ + return int(time.mktime(dt.timetuple())) + + +def _zadd(key, member): + """ + Adds the member to the given set key, using the current time as the score. + + """ + conn = _get_connection() + if conn: + ts = to_timestamp(datetime.datetime.now()) + try: + conn.zadd(key, ts, member) + except redis.RedisError, e: + logger.error(e) + + +def _zrangebyscore(key): + """ + Performs a zrangebyscore operation on the set given by key. + The minimum score will be a timestap equal to the current time + minus MAX_AGE. The maximum score will be a timestap equal to the + current time. + + """ + conn = _get_connection() + if conn: + now = datetime.datetime.now() + min = to_timestamp(now - MAX_AGE) + max = to_timestamp(now) + try: + return conn.zrangebyscore(key, min, max) + except redis.RedisError, e: + logger.error(e) + + return [] + + +def report_user(username): + """ + Call this function when a user has been seen. The username will be added to + the set of users online. + + """ + _zadd(USER_SET_KEY, username) + + +def report_visitor(ip): + """ + Call this function when a visitor has been seen. The IP address will be + added to the set of visitors online. + + """ + _zadd(VISITOR_SET_KEY, ip) + + +def get_users_online(): + """ + Returns a list of user names from the user set. + sets. + """ + return _zrangebyscore(USER_SET_KEY) + + +def get_visitors_online(): + """ + Returns a list of visitor IP addresses from the visitor set. + """ + return _zrangebyscore(VISITOR_SET_KEY) + + +def _tick(conn): + """ + Call this function to "age out" the sets by removing old users/visitors. + It then returns a tuple of the form: + (zcard users, zcard visitors) + + """ + cutoff = to_timestamp(datetime.datetime.now() - MAX_AGE) + + try: + pipeline = conn.pipeline(transaction=False) + pipeline.zremrangebyscore(USER_SET_KEY, 0, cutoff) + pipeline.zremrangebyscore(VISITOR_SET_KEY, 0, cutoff) + pipeline.zcard(USER_SET_KEY) + pipeline.zcard(VISITOR_SET_KEY) + result = pipeline.execute() + except redis.RedisError, e: + logger.error(e) + return 0, 0 + + return result[2], result[3] + + +def max_users(): + """ + Run this function periodically to clean out the sets and to compute our max + users and max visitors statistics. 
+ + """ + conn = _get_connection() + if not conn: + return + + num_users, num_visitors = _tick(conn) + now = datetime.datetime.now() + + stats = get_stats(conn) + update = False + + if stats is None: + stats = Statistic(id=1, + max_users=num_users, + max_users_date=now, + max_anon_users=num_visitors, + max_anon_users_date=now) + update = True + else: + if num_users > stats.max_users: + stats.max_users = num_users + stats.max_users_date = now + update = True + + if num_visitors > stats.max_anon_users: + stats.max_anon_users = num_visitors + stats.max_anon_users_date = now + update = True + + if update: + _save_stats_to_redis(conn, stats) + stats.save() + + +def get_stats(conn=None): + """ + This function retrieves the who's online max user stats out of Redis. If + the keys do not exist in Redis, we fall back to the database. If the stats + are not available, None is returned. + Note that if we can find stats data, it will be returned as a Statistic + object. + + """ + if conn is None: + conn = _get_connection() + + stats = None + if conn: + try: + stats = conn.hgetall(CORE_STATS_KEY) + except redis.RedisError, e: + logger.error(e) + + if stats: + return Statistic( + id=1, + max_users=stats['max_users'], + max_users_date=datetime.datetime.fromtimestamp( + float(stats['max_users_date'])), + max_anon_users=stats['max_anon_users'], + max_anon_users_date=datetime.datetime.fromtimestamp( + float(stats['max_anon_users_date']))) + + try: + stats = Statistic.objects.get(pk=1) + except Statistic.DoesNotExist: + return None + else: + _save_stats_to_redis(conn, stats) + return stats + + +def _save_stats_to_redis(conn, stats): + """ + Saves the statistics to Redis. A TTL is put on the key to prevent Redis and + the database from becoming out of sync. + + """ + fields = dict( + max_users=stats.max_users, + max_users_date=to_timestamp(stats.max_users_date), + max_anon_users=stats.max_anon_users, + max_anon_users_date=to_timestamp(stats.max_anon_users_date)) + + try: + conn.hmset(CORE_STATS_KEY, fields) + conn.expire(CORE_STATS_KEY, 4 * 60 * 60) + except redis.RedisError, e: + logger.error(e) diff -r c525f3e0b5d0 -r ee87ea74d46b core/widgets.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/core/widgets.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,55 @@ +""" +Various useful widgets for the GPP application. +""" + +from django import forms +from django.utils.safestring import mark_safe +from django.core.urlresolvers import reverse +from django.conf import settings + + +class AutoCompleteUserInput(forms.TextInput): + + def render(self, name, value, attrs=None): + url = reverse('core-ajax_users') + output = super(AutoCompleteUserInput, self).render(name, value, attrs) + return output + mark_safe(u"""\ +""" % (name, url)) + diff -r c525f3e0b5d0 -r ee87ea74d46b custom_search/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/custom_search/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,34 @@ +""" +This module contains custom forms to tailor the Haystack search application to +our needs. 
+ +""" +from django import forms +from haystack.forms import ModelSearchForm + + +MODEL_CHOICES = ( + ('forums.topic', 'Forum Topics'), + ('forums.post', 'Forum Posts'), + ('news.story', 'News Stories'), + ('bio.userprofile', 'User Profiles'), + ('weblinks.link', 'Links'), + ('downloads.download', 'Downloads'), + ('podcast.item', 'Podcasts'), + ('ygroup.post', 'Yahoo Group Archives'), +) + + +class CustomModelSearchForm(ModelSearchForm): + """ + This customized ModelSearchForm allows us to explictly label and order + the model choices. + + """ + q = forms.CharField(required=False, label='', + widget=forms.TextInput(attrs={'class': 'text', 'size': 48})) + + def __init__(self, *args, **kwargs): + super(CustomModelSearchForm, self).__init__(*args, **kwargs) + self.fields['models'] = forms.MultipleChoiceField(choices=MODEL_CHOICES, + label='', widget=forms.CheckboxSelectMultiple) diff -r c525f3e0b5d0 -r ee87ea74d46b custom_search/indexes.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/custom_search/indexes.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,31 @@ +""" +This module contains custom search indexes to tailor the Haystack search +application to our needs. + +""" +from queued_search.indexes import QueuedSearchIndex + + +class CondQueuedSearchIndex(QueuedSearchIndex): + """ + This customized version of QueuedSearchIndex conditionally enqueues items + to be indexed by calling the can_index() method. + + """ + def can_index(self, instance): + """ + The default is to index all instances. Override this method to + customize the behavior. This will be called on all update operations. + + """ + return True + + def enqueue(self, action, instance): + """ + This method enqueues the instance only if the can_index() method + returns True. + + """ + if (action == 'update' and self.can_index(instance) or + action == 'delete'): + super(CondQueuedSearchIndex, self).enqueue(action, instance) diff -r c525f3e0b5d0 -r ee87ea74d46b custom_search/tasks.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/custom_search/tasks.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,18 @@ +""" +Tasks for our custom search application. + +""" +from celery.task import task + +from queued_search.management.commands.process_search_queue import Command + + +@task +def process_search_queue_task(): + """ + Celery task to run the queued_search application's process_search_queue + command. + + """ + command = Command() + command.execute() diff -r c525f3e0b5d0 -r ee87ea74d46b donations/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/donations/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,10 @@ +""" +This file contains the admin definitions for the donations application. +""" +from django.contrib import admin +from donations.models import Donation + +class DonationAdmin(admin.ModelAdmin): + raw_id_fields = ('user', ) + +admin.site.register(Donation, DonationAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b donations/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/donations/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,85 @@ +""" +Models for the donations application. +""" +import datetime +import decimal + +from django.db import models +from django.contrib.auth.models import User +from django.conf import settings + + +class DonationManager(models.Manager): + def monthly_stats(self, year=None, month=None): + """ + Returns a tuple of items for the given month in the given + year. If year is None, the current year is used. If month is None, + the current month is used. 
+ The returned tuple has the following items, in order: + (gross, net, donations) + where: + 'gross': total gross donations + 'net': total net donations + 'donations': list of donation objects + """ + today = datetime.date.today() + if year is None: + year = today.year + if month is None: + month = today.month + + qs = self.filter(payment_date__year=year, + payment_date__month=month, + test_ipn=settings.DONATIONS_DEBUG).order_by( + 'payment_date').select_related('user') + + gross = decimal.Decimal() + net = decimal.Decimal() + donations = [] + for donation in qs: + gross += donation.mc_gross + net += donation.mc_gross - donation.mc_fee + donations.append(donation) + + return gross, net, donations + + +class Donation(models.Model): + """Model to represent a donation to the website.""" + + user = models.ForeignKey(User, null=True, blank=True) + is_anonymous = models.BooleanField() + test_ipn = models.BooleanField(default=False, verbose_name="Test IPN") + txn_id = models.CharField(max_length=20, verbose_name="Txn ID") + txn_type = models.CharField(max_length=64) + first_name = models.CharField(max_length=64, blank=True) + last_name = models.CharField(max_length=64, blank=True) + payer_email = models.EmailField(max_length=127, blank=True) + payer_id = models.CharField(max_length=13, blank=True, verbose_name="Payer ID") + mc_fee = models.DecimalField(max_digits=8, decimal_places=2, verbose_name="Fee") + mc_gross = models.DecimalField(max_digits=8, decimal_places=2, verbose_name="Gross") + memo = models.TextField(blank=True) + payer_status = models.CharField(max_length=10, blank=True) + payment_date = models.DateTimeField() + + objects = DonationManager() + + class Meta: + ordering = ('-payment_date', ) + + def __unicode__(self): + if self.user: + return u'%s from %s' % (self.mc_gross, self.user.username) + return u'%s from %s %s' % (self.mc_gross, self.first_name, self.last_name) + + def donor(self): + """Returns the donor name for the donation.""" + if self.is_anonymous: + return settings.DONATIONS_ANON_NAME + if self.user is not None: + return self.user.username + if self.first_name or self.last_name: + name = u'%s %s' % (self.first_name, self.last_name) + return name.strip() + return settings.DONATIONS_ANON_NAME + diff -r c525f3e0b5d0 -r ee87ea74d46b donations/tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/donations/tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,110 @@ +""" +Tests for the donations application. +""" +import urlparse +from decimal import Decimal +import datetime + +from django.contrib.auth.models import User +from django.test import TestCase +from django.core.urlresolvers import reverse + +from donations.models import Donation +import bio.badges + + +# This data was copy/pasted from my actual Paypal IPN history. Some alterations +# were made since this file is getting committed to version control and I +# didn't want to store "real" data that could be used to trace a transaction or +# real payer. 
+ +# This data is for a non-anonymous donation: +TEST_POST_DATA_1 = """\ +mc_gross=5.00&protection_eligibility=Ineligible&payer_id=FAKEPAYERID01&tax=0.00&payment_date=04:14:08 Jan 21, 2011 PST&payment_status=Completed&charset=windows-1252&first_name=John&option_selection1=No&mc_fee=0.50¬ify_version=3.0&custom=test_user&payer_status=verified&business=brian@surfguitar101.com&quantity=1&verify_sign=Ai1PaTHIS-IS-FAKE-DATA-jB264AOjpiTa4vcsPCEavq-83oyIclHKI&payer_email=test_user@example.com&option_name1=List your name?&txn_id=TESTTXNID5815921V&payment_type=instant&last_name=Doe&receiver_email=brian@surfguitar101.com&payment_fee=0.50&receiver_id=FAKERECEIVERU&txn_type=web_accept&item_name=Donation for www.surfguitar101.com&mc_currency=USD&item_number=500&residence_country=AU&handling_amount=0.00&transaction_subject=test_user&payment_gross=5.00&shipping=0.00""" + +# Data from a user that wanted to remain anonymous +TEST_POST_DATA_2 = """\ +mc_gross=100.00&protection_eligibility=Ineligible&payer_id=FAKEPAYERID02&tax=0.00&payment_date=05:40:33 Jan 16, 2011 PST&payment_status=Completed&charset=windows-1252&first_name=John&option_selection1=No&mc_fee=3.20¬ify_version=3.0&custom=test_user&payer_status=unverified&business=brian@surfguitar101.com&quantity=1&verify_sign=AIkKNFAKE-DATA-NOT-REALpqCSxA-E7Tm4rMGlUpNy6ym0.exBzfiyI&payer_email=test_user@example.com&option_name1=List your name?&txn_id=TESTTXNIDK548343A&payment_type=instant&last_name=Doe&receiver_email=brian@surfguitar101.com&payment_fee=3.20&receiver_id=FAKERECEIVERU&txn_type=web_accept&item_name=Donation for www.surfguitar101.com&mc_currency=USD&item_number=501&residence_country=US&handling_amount=0.00&transaction_subject=test_user&payment_gross=100.00&shipping=0.00""" + + +class DonationsTest(TestCase): + fixtures = ['badges'] + + def test_ipn_post_1(self): + """ + Test a simulated IPN post + """ + user = User.objects.create_user('test_user', 'test_user@example.com', + 'password') + user.save() + + args = urlparse.parse_qs(TEST_POST_DATA_1) + response = self.client.post(reverse('donations-ipn'), args) + + self.assertEqual(response.status_code, 200) + + try: + d = Donation.objects.get(pk=1) + except Donation.DoesNotExist: + self.fail("Donation object was not created") + else: + self.assertEqual(d.user, user) + self.assertFalse(d.is_anonymous) + self.assertFalse(d.test_ipn) + self.assertEqual(d.txn_id, 'TESTTXNID5815921V') + self.assertEqual(d.txn_type, 'web_accept') + self.assertEqual(d.first_name, 'John') + self.assertEqual(d.last_name, 'Doe') + self.assertEqual(d.payer_email, 'test_user@example.com') + self.assertEqual(d.payer_id, 'FAKEPAYERID01') + self.assertEqual(d.mc_fee, Decimal('0.50')) + self.assertEqual(d.mc_gross, Decimal('5.00')) + self.assertEqual(d.memo, '') + self.assertEqual(d.payer_status, 'verified') + self.assertEqual(d.payment_date, + datetime.datetime(2011, 1, 21, 4, 14, 8)) + + # user should have got a badge for donating + p = user.get_profile() + badges = list(p.badges.all()) + self.assertEqual(len(badges), 1) + if len(badges) == 1: + self.assertEqual(badges[0].numeric_id, bio.badges.CONTRIBUTOR_PIN) + + def test_ipn_post_2(self): + """ + Test a simulated IPN post + """ + user = User.objects.create_user('test_user', 'test_user@example.com', + 'password') + user.save() + + args = urlparse.parse_qs(TEST_POST_DATA_2) + response = self.client.post(reverse('donations-ipn'), args) + + self.assertEqual(response.status_code, 200) + + try: + d = Donation.objects.get(pk=1) + except Donation.DoesNotExist: + self.fail("Donation 
object was not created") + else: + self.assertEqual(d.user, user) + self.assertTrue(d.is_anonymous) + self.assertFalse(d.test_ipn) + self.assertEqual(d.txn_id, 'TESTTXNIDK548343A') + self.assertEqual(d.txn_type, 'web_accept') + self.assertEqual(d.first_name, 'John') + self.assertEqual(d.last_name, 'Doe') + self.assertEqual(d.payer_email, 'test_user@example.com') + self.assertEqual(d.payer_id, 'FAKEPAYERID02') + self.assertEqual(d.mc_fee, Decimal('3.20')) + self.assertEqual(d.mc_gross, Decimal('100.00')) + self.assertEqual(d.memo, '') + self.assertEqual(d.payer_status, 'unverified') + self.assertEqual(d.payment_date, + datetime.datetime(2011, 1, 16, 5, 40, 33)) + + # user should not have got a badge for donating + p = user.get_profile() + self.assertEqual(p.badges.count(), 0) diff -r c525f3e0b5d0 -r ee87ea74d46b donations/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/donations/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,14 @@ +""" +URLs for the donations application. +""" +from django.conf.urls import patterns, url +from django.views.generic import TemplateView + +urlpatterns = patterns('donations.views', + url(r'^$', 'index', name='donations-index'), + url(r'^ipn/$', 'ipn', name='donations-ipn'), +) +urlpatterns += patterns('', + url(r'^thanks/$', TemplateView.as_view(template_name='donations/thanks.html'), + name='donations-thanks'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b donations/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/donations/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,221 @@ +""" +Views for the donations application. +""" +import urllib2 +import decimal +import datetime +import logging + +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.conf import settings +from django.contrib.sites.models import Site +from django.http import HttpResponse +from django.http import HttpResponseServerError +from django.contrib.auth.models import User +from django.views.decorators.csrf import csrf_exempt + + +from donations.models import Donation + +PP_DATE_FMT = '%H:%M:%S %b %d, %Y' + +def paypal_params(): + """ + This function returns a tuple where the 1st element is the Paypal + URL and the 2nd element is the Paypal business email. This information + depends on the setting DONATIONS_DEBUG. + """ + if settings.DONATIONS_DEBUG: + form_action = 'https://www.sandbox.paypal.com/cgi-bin/webscr' + business = settings.DONATIONS_BUSINESS_DEBUG + else: + form_action = 'https://www.paypal.com/cgi-bin/webscr' + business = settings.DONATIONS_BUSINESS + + return form_action, business + + +def verify_request(params): + """ + Send the parameters back to Paypal and return the response string. + """ + # If we are doing localhost-type unit tests, just return whatever + # the test wants us to... 
+    if hasattr(settings, 'DONATIONS_DEBUG_VERIFY_RESPONSE'):
+        return settings.DONATIONS_DEBUG_VERIFY_RESPONSE
+
+    req = urllib2.Request(paypal_params()[0], params)
+    req.add_header("Content-type", "application/x-www-form-urlencoded")
+    try:
+        response = urllib2.urlopen(req)
+    except urllib2.URLError, e:
+        logging.exception('IPN: exception verifying IPN: %s', e)
+        return None
+
+    return response.read()
+
+
+def index(request):
+    gross, net, donations = Donation.objects.monthly_stats()
+    current_site = Site.objects.get_current()
+    form_action, business = paypal_params()
+
+    return render_to_response('donations/index.html', {
+        'goal': settings.DONATIONS_GOAL,
+        'gross': gross,
+        'net': net,
+        'left': settings.DONATIONS_GOAL - net,
+        'donations': donations,
+        'form_action': form_action,
+        'business': business,
+        'anonymous': settings.DONATIONS_ANON_NAME,
+        'item_name': settings.DONATIONS_ITEM_NAME,
+        'item_number': settings.DONATIONS_ITEM_NUM,
+        'item_anon_number': settings.DONATIONS_ITEM_ANON_NUM,
+        'domain': current_site.domain,
+        },
+        context_instance = RequestContext(request))
+
+
+@csrf_exempt
+def ipn(request):
+    """
+    This function is the IPN listener and handles the IPN POST from Paypal.
+    The algorithm here roughly follows the outline described in chapter 2
+    of Paypal's IPNGuide.pdf "Implementing an IPN Listener".
+
+    """
+    # Log some info about this IPN event
+    ip = request.META.get('REMOTE_ADDR', '?')
+    parameters = request.POST.copy()
+    logging.info('IPN from %s; post data: %s', ip, parameters.urlencode())
+
+    # Now we follow the instructions in chapter 2 of the Paypal IPNGuide.pdf.
+    # Create a request that contains exactly the same IPN variables and values in
+    # the same order, preceded with cmd=_notify-validate
+    parameters['cmd']='_notify-validate'
+
+    # Post the request back to Paypal (either to the sandbox or the real deal),
+    # and read the response:
+    status = verify_request(parameters.urlencode())
+    if status != 'VERIFIED':
+        logging.warning('IPN: Paypal did not verify; status was %s', status)
+        return HttpResponse()
+
+    # Response was VERIFIED; act on this if it is a Completed donation,
+    # otherwise don't handle it (we are just a donations application. Here
+    # is where we could be expanded to be a more general payment processor).
+
+    payment_status = parameters.get('payment_status')
+    if payment_status != 'Completed':
+        logging.info('IPN: payment_status is %s; we are done.', payment_status)
+        return HttpResponse()
+
+    # Is this a donation to the site?
+    item_number = parameters.get('item_number')
+    if (item_number == settings.DONATIONS_ITEM_NUM or
+            item_number == settings.DONATIONS_ITEM_ANON_NUM):
+        process_donation(item_number, parameters)
+    else:
+        logging.info('IPN: not a donation; done.')
+
+    return HttpResponse()
+
+
+def process_donation(item_number, params):
+    """
+    A few validity and duplicate checks are made on the donation params.
+    If everything is ok, construct a donation object from the parameters and
+    store it in the database.
+
+    """
+    # Has this transaction been processed before?
+    txn_id = params.get('txn_id')
+    if txn_id is None:
+        logging.error('IPN: missing txn_id')
+        return
+
+    try:
+        donation = Donation.objects.get(txn_id__exact=txn_id)
+    except Donation.DoesNotExist:
+        pass
+    else:
+        logging.warning('IPN: duplicate txn_id')
+        return      # no exception, this is a duplicate
+
+    # Is the email address ours?
+    business = params.get('business')
+    if business != paypal_params()[1]:
+        logging.warning('IPN: invalid business: %s', business)
+        return
+
+    # is this a payment received?
+    txn_type = params.get('txn_type')
+    if txn_type != 'web_accept':
+        logging.warning('IPN: invalid txn_type: %s', txn_type)
+        return
+
+    # Looks like a donation, save it to the database.
+    # Determine which user this came from, if any.
+    # The username is stored in the custom field if the user was logged in when
+    # the donation was made.
+    user = None
+    if 'custom' in params and params['custom']:
+        try:
+            user = User.objects.get(username__exact=params['custom'])
+        except User.DoesNotExist:
+            pass
+
+    is_anonymous = item_number == settings.DONATIONS_ITEM_ANON_NUM
+    test_ipn = params.get('test_ipn') == '1'
+
+    first_name = params.get('first_name', '')
+    last_name = params.get('last_name', '')
+    payer_email = params.get('payer_email', '')
+    payer_id = params.get('payer_id', '')
+    memo = params.get('memo', '')
+    payer_status = params.get('payer_status', '')
+
+    try:
+        mc_gross = decimal.Decimal(params['mc_gross'])
+        mc_fee = decimal.Decimal(params['mc_fee'])
+    except (KeyError, decimal.InvalidOperation):
+        logging.error('IPN: invalid/missing mc_gross or mc_fee')
+        return
+
+    payment_date = params.get('payment_date')
+    if payment_date is None:
+        logging.error('IPN: missing payment_date')
+        return
+
+    # strip off the timezone
+    payment_date = payment_date[:-4]
+    try:
+        payment_date = datetime.datetime.strptime(payment_date, PP_DATE_FMT)
+    except ValueError:
+        logging.error('IPN: invalid payment_date "%s"', params['payment_date'])
+        return
+
+    try:
+        donation = Donation(
+            user=user,
+            is_anonymous=is_anonymous,
+            test_ipn=test_ipn,
+            txn_id=txn_id,
+            txn_type=txn_type,
+            first_name=first_name,
+            last_name=last_name,
+            payer_email=payer_email,
+            payer_id=payer_id,
+            memo=memo,
+            payer_status=payer_status,
+            mc_gross=mc_gross,
+            mc_fee=mc_fee,
+            payment_date=payment_date)
+    except:
+        logging.exception('IPN: exception during donation creation')
+    else:
+        donation.save()
+        logging.info('IPN: donation saved')
+
diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/__init__.py
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/downloads/__init__.py	Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,1 @@
+import signals
diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/admin.py
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/downloads/admin.py	Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,81 @@
+"""
+This file contains the automatic admin site definitions for the downloads models.
+""" +import datetime + +from django.contrib import admin +from django.conf import settings + +from downloads.models import PendingDownload +from downloads.models import Download +from downloads.models import Category +from downloads.models import AllowedExtension +from downloads.models import VoteRecord + + +class CategoryAdmin(admin.ModelAdmin): + list_display = ('title', 'slug', 'description', 'count') + prepopulated_fields = {'slug': ('title', )} + readonly_fields = ('count', ) + + +class PendingDownloadAdmin(admin.ModelAdmin): + exclude = ('html', ) + list_display = ('title', 'user', 'category', 'date_added', 'ip_address', 'size') + ordering = ('date_added', ) + raw_id_fields = ('user', ) + readonly_fields = ('update_date', ) + + actions = ('approve_downloads', ) + + def approve_downloads(self, request, qs): + for pending_dl in qs: + dl = Download( + title=pending_dl.title, + category=pending_dl.category, + description=pending_dl.description, + html=pending_dl.html, + file=pending_dl.file, + user=pending_dl.user, + date_added=datetime.datetime.now(), + ip_address=pending_dl.ip_address, + hits=0, + average_score=0.0, + total_votes=0, + is_public=True) + dl.save() + + # If we don't do this, the actual file will be deleted when + # the pending download is deleted. + pending_dl.file = None + pending_dl.delete() + + approve_downloads.short_description = "Approve selected downloads" + + +class DownloadAdmin(admin.ModelAdmin): + exclude = ('html', ) + list_display = ('title', 'user', 'category', 'date_added', 'ip_address', + 'hits', 'average_score', 'size', 'is_public') + list_filter = ('date_added', 'is_public', 'category') + list_editable = ('is_public', ) + date_hierarchy = 'date_added' + ordering = ('-date_added', ) + search_fields = ('title', 'description', 'user__username') + raw_id_fields = ('user', ) + readonly_fields = ('update_date', ) + save_on_top = True + + +class VoteRecordAdmin(admin.ModelAdmin): + list_display = ('user', 'download', 'vote_date') + list_filter = ('user', 'download') + date_hierarchy = 'vote_date' + + +admin.site.register(PendingDownload, PendingDownloadAdmin) +admin.site.register(Download, DownloadAdmin) +admin.site.register(Category, CategoryAdmin) +admin.site.register(AllowedExtension) +admin.site.register(VoteRecord, VoteRecordAdmin) + diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/fixtures/downloads_categories.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/fixtures/downloads_categories.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,82 @@ +[ + { + "pk": 1, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "Jam along to backing tracks made by your fellow SG101'ers!", + "slug": "backing-tracks", + "title": "Backing Tracks" + } + }, + { + "pk": 5, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "User demos.", + "slug": "demos", + "title": "Demos" + } + }, + { + "pk": 2, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "Recordings of user gear in action.", + "slug": "gear-samples", + "title": "Gear Samples" + } + }, + { + "pk": 6, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "Interviews with surf scenesters.", + "slug": "interviews", + "title": "Interviews" + } + }, + { + "pk": 3, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "Anything else.", + "slug": "misc", + "title": "Misc" + } + }, + { + "pk": 7, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "Legal music 
created by members.", + "slug": "music", + "title": "Music" + } + }, + { + "pk": 4, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "Please upload original surf music ringtones here.", + "slug": "ringtones", + "title": "Ringtones" + } + }, + { + "pk": 8, + "model": "downloads.category", + "fields": { + "count": 0, + "description": "User contributed tablature. Please upload in .pdf or .txt formats only.", + "slug": "tablature", + "title": "Tablature" + } + } +] diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/fixtures/downloads_extensions.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/fixtures/downloads_extensions.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,86 @@ +[ + { + "pk": 7, + "model": "downloads.allowedextension", + "fields": { + "extension": ".gif" + } + }, + { + "pk": 9, + "model": "downloads.allowedextension", + "fields": { + "extension": ".jpeg" + } + }, + { + "pk": 8, + "model": "downloads.allowedextension", + "fields": { + "extension": ".jpg" + } + }, + { + "pk": 6, + "model": "downloads.allowedextension", + "fields": { + "extension": ".m4a" + } + }, + { + "pk": 10, + "model": "downloads.allowedextension", + "fields": { + "extension": ".mov" + } + }, + { + "pk": 3, + "model": "downloads.allowedextension", + "fields": { + "extension": ".mp3" + } + }, + { + "pk": 5, + "model": "downloads.allowedextension", + "fields": { + "extension": ".mp4" + } + }, + { + "pk": 2, + "model": "downloads.allowedextension", + "fields": { + "extension": ".pdf" + } + }, + { + "pk": 13, + "model": "downloads.allowedextension", + "fields": { + "extension": ".png" + } + }, + { + "pk": 1, + "model": "downloads.allowedextension", + "fields": { + "extension": ".txt" + } + }, + { + "pk": 4, + "model": "downloads.allowedextension", + "fields": { + "extension": ".wma" + } + }, + { + "pk": 11, + "model": "downloads.allowedextension", + "fields": { + "extension": ".zip" + } + } +] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +""" +Forms for the downloads application. +""" +import os + +from django import forms +from django.conf import settings + +from downloads.models import PendingDownload +from downloads.models import AllowedExtension + + +class AddDownloadForm(forms.ModelForm): + """Form to allow adding downloads.""" + title = forms.CharField(required=True, + widget=forms.TextInput(attrs={'size': 64, 'maxlength': 64})) + description = forms.CharField(required=False, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + + def clean_file(self): + file = self.cleaned_data['file'] + ext = os.path.splitext(file.name)[1] + allowed_exts = AllowedExtension.objects.get_extension_list() + if ext in allowed_exts: + return file + raise forms.ValidationError('The file extension "%s" is not allowed.' 
% ext) + + class Meta: + model = PendingDownload + fields = ('title', 'category', 'description', 'file') + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']) + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui']) diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/management/commands/dlcatreport.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/management/commands/dlcatreport.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +""" +dlcatreport - a management command to produce a HTML report of all the downloads +in a given category. + +""" +from django.core.management.base import LabelCommand, CommandError +from django.template.loader import render_to_string + +from downloads.models import Category, Download + + +class Command(LabelCommand): + help = "Produce on standard output a report of all downloads in a category." + args = "category-slug" + + def handle_label(self, slug, **options): + """ + Render a template using the downloads in a given category and send it to + stdout. + + """ + try: + category = Category.objects.get(slug=slug) + except Category.DoesNotExist: + raise CommandError("category slug '%s' does not exist" % slug) + + downloads = Download.public_objects.filter(category=category).order_by( + 'title').select_related() + + report = render_to_string('downloads/commands/category_report.html', { + 'category': category, + 'downloads': downloads, + }) + + # encode it ourselves since it can fail if you try to redirect output to + # a file and any of the content is not ASCII... + print report.encode('utf-8') + diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/management/commands/dlwgetcat.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/management/commands/dlwgetcat.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,53 @@ +""" +dlwgetcat - a management command to produce a bash script that wgets all the +files in a given category. + +""" +import os.path + +from django.core.management.base import LabelCommand, CommandError +from django.template.loader import render_to_string +from django.template.defaultfilters import slugify +from django.contrib.sites.models import Site +from django.conf import settings + +from downloads.models import Category, Download + + +class Command(LabelCommand): + help = ("Produce on standard output a bash script that wgets all the files" + " in a category. The files are downloaded with a slugified name.") + + args = "category-slug" + + def handle_label(self, slug, **options): + """ + Render a template using the downloads in a given category and send it to + stdout. + + """ + try: + category = Category.objects.get(slug=slug) + except Category.DoesNotExist: + raise CommandError("category slug '%s' does not exist" % slug) + + downloads = Download.public_objects.filter(category=category).order_by( + 'title').select_related() + + # Create new destination names for the files since the uploaders often + # give the files terrible names. The new names will be slugified + # versions of the titles, with the same extension. + + for dl in downloads: + ext = os.path.splitext(dl.file.name)[1] + dl.dest_filename = slugify(dl.title) + ext + + output = render_to_string('downloads/commands/wget_cat.html', { + 'downloads': downloads, + 'domain': Site.objects.get_current().domain, + 'MEDIA_URL': settings.MEDIA_URL, + }) + + # encode it ourselves since it can fail if you try to redirect output to + # a file and any of the content is not ASCII... 
+ print output.encode('utf-8') diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,166 @@ +""" +Models for the downloads application. +""" +import os + +import datetime +from django.db import models +from django.contrib.auth.models import User +from django.template.defaultfilters import filesizeformat + +from core.markup import site_markup + + +class Category(models.Model): + """Downloads belong to categories.""" + title = models.CharField(max_length=64) + slug = models.SlugField(max_length=64) + description = models.TextField(blank=True) + count = models.IntegerField(default=0, blank=True) + + class Meta: + verbose_name_plural = 'Categories' + ordering = ('title', ) + + def __unicode__(self): + return self.title + + +def download_path(instance, filename): + """ + Creates a path for a download. Uses the current date to avoid filename + clashes. Uses the current microsecond also to make the directory name + harder to guess. + """ + now = datetime.datetime.now() + parts = ['downloads'] + parts.extend([str(p) for p in (now.year, now.month, now.day)]) + parts.append(hex((now.hour * 3600 + now.minute * 60 + now.second) * 1000 + ( + now.microsecond / 1000))[2:]) + parts.append(filename) + return os.path.join(*parts) + + +class PublicDownloadManager(models.Manager): + """The manager for all public downloads.""" + def get_query_set(self): + return super(PublicDownloadManager, self).get_query_set().filter( + is_public=True).select_related() + + +class DownloadBase(models.Model): + """Abstract model to collect common download fields.""" + title = models.CharField(max_length=128) + category = models.ForeignKey(Category) + description = models.TextField() + html = models.TextField(blank=True) + file = models.FileField(upload_to=download_path) + user = models.ForeignKey(User) + date_added = models.DateTimeField(db_index=True) + ip_address = models.IPAddressField('IP Address') + update_date = models.DateTimeField(db_index=True, blank=True) + + class Meta: + abstract = True + + def size(self): + return filesizeformat(self.file.size) + + +class PendingDownload(DownloadBase): + """This model represents pending downloads created by users. These pending + downloads must be approved by an admin before they turn into "real" + Downloads and are visible on site. 
+ """ + class Meta: + ordering = ('date_added', ) + + def __unicode__(self): + return self.title + + def save(self, *args, **kwargs): + if not self.pk: + self.date_added = datetime.datetime.now() + self.update_date = self.date_added + else: + self.update_date = datetime.datetime.now() + + self.html = site_markup(self.description) + super(PendingDownload, self).save(*args, **kwargs) + + +class Download(DownloadBase): + """Model to represent a download.""" + hits = models.IntegerField(default=0) + average_score = models.FloatField(default=0.0) + total_votes = models.IntegerField(default=0) + is_public = models.BooleanField(default=False, db_index=True) + + # Managers: + objects = models.Manager() + public_objects = PublicDownloadManager() + + def __unicode__(self): + return self.title + + @models.permalink + def get_absolute_url(self): + return ('downloads-details', [str(self.id)]) + + def save(self, *args, **kwargs): + if not self.pk: + self.date_added = datetime.datetime.now() + self.update_date = self.date_added + else: + self.update_date = datetime.datetime.now() + + self.html = site_markup(self.description) + super(Download, self).save(*args, **kwargs) + + def vote(self, vote_value): + """receives a vote_value and updates internal score accordingly""" + total_score = self.average_score * self.total_votes + total_score += vote_value + self.total_votes += 1 + self.average_score = total_score / self.total_votes + return self.average_score + + def search_title(self): + return self.title + + def search_summary(self): + return self.description + + +class AllowedExtensionManager(models.Manager): + def get_extension_list(self): + return self.values_list('extension', flat=True) + + +class AllowedExtension(models.Model): + """Model to represent the list of allowed file extensions.""" + extension = models.CharField(max_length=8, help_text="e.g. .txt") + + objects = AllowedExtensionManager() + + def __unicode__(self): + return self.extension + + class Meta: + ordering = ('extension', ) + + +class VoteRecord(models.Model): + """Model to record the date that a user voted on a download.""" + download = models.ForeignKey(Download) + user = models.ForeignKey(User) + vote_date = models.DateTimeField(auto_now_add=True) + + def __unicode__(self): + return u"%s voted on '%s' on %s" % ( + self.user.username, + self.download.title, + self.vote_date.strftime('%b %d, %Y %H:%M:%S')) + + class Meta: + ordering = ('-vote_date', ) diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/search_indexes.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/search_indexes.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,23 @@ +"""Haystack search index for the downloads application.""" +from haystack.indexes import * +from haystack import site +from custom_search.indexes import CondQueuedSearchIndex + +from downloads.models import Download + + +class DownloadIndex(CondQueuedSearchIndex): + text = CharField(document=True, use_template=True) + author = CharField(model_attr='user') + pub_date = DateTimeField(model_attr='date_added') + + def index_queryset(self): + return Download.public_objects.all() + + def get_updated_field(self): + return 'update_date' + + def can_index(self, instance): + return instance.is_public + +site.register(Download, DownloadIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/signals.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/signals.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,43 @@ +"""Signals for the downloads application. 
+We use signals to compute the denormalized category counts whenever a download +is saved.""" +from django.db.models.signals import post_save +from django.db.models.signals import post_delete + +from downloads.models import Category, Download + + +def on_download_save(sender, **kwargs): + """This function updates the count field for all categories. + It is called whenever a download is saved via a signal. + """ + if kwargs['created']: + # we only have to update the parent category + download = kwargs['instance'] + cat = download.category + cat.count = Download.public_objects.filter(category=cat).count() + cat.save() + else: + # update all categories just to be safe (an existing download could + # have been moved from one category to another + cats = Category.objects.all() + for cat in cats: + cat.count = Download.public_objects.filter(category=cat).count() + cat.save() + + +def on_download_delete(sender, **kwargs): + """This function updates the count field for the download's parent + category. It is called when a download is deleted via a signal. + """ + # update the parent category + download = kwargs['instance'] + cat = download.category + cat.count = Download.public_objects.filter(category=cat).count() + cat.save() + + +post_save.connect(on_download_save, sender=Download, + dispatch_uid='downloads.signals') +post_delete.connect(on_download_delete, sender=Download, + dispatch_uid='downloads.signals') diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/static/css/downloads.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/static/css/downloads.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,8 @@ +#downloads-add td { + padding-bottom: 5px; +} + +#downloads-add fieldset { + margin: 1em 0 1em; + padding: 0.5em; +} diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/static/js/downloads-get.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/static/js/downloads-get.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,33 @@ +$(document).ready(function() { + $('.dl-button').each(function(n) { + var button = $(this); + var id = button.attr('id'); + var numeric_id = -1; + if (id.match(/dl-(\d+)/)) + { + numeric_id = RegExp.$1; + } + button.click(function() { + button.attr('disabled', 'disabled').val('Getting link, stand by...'); + $.ajax({ + url: '/downloads/request/', + type: 'POST', + data: { id: numeric_id }, + dataType: 'json', + success: function(result) { + var link_id = result.id; + var div = $('#link-' + link_id); + div.hide(); + div.html( + 'Thank you! Your download is now ready. Click here to download.'); + div.fadeIn(3000); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. 
' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + }); + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/static/js/rating.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/static/js/rating.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,115 @@ +function dlRatingOver(event) +{ + var div = $(this).parent('div'); + var stars = $('img', div); + for (var i = 0; i <= event.data; ++i) + { + var star = $(stars[i]); + star.attr('src', '/static/icons/stars/rating_over.gif'); + } +} + +function dlRatingOut(event) +{ + var div = $(this).parent('div'); + var stars = $('img', div); + for (var i = 0; i <= event.data; ++i) + { + var star = $(stars[i]); + star.attr('src', '/static/icons/stars/rating_' + star.attr('class') + '.gif'); + } +} + +function dlRatingClick(event) +{ + var star = $(this); + var id = star.attr('id'); + if (id.match(/star-(\d+)-(\d+)/)) + { + $.ajax({ + url: '/downloads/rate/', + type: 'POST', + data: { id: RegExp.$1, rating: parseInt(RegExp.$2) + 1}, + dataType: 'text', + success: function(rating) { + rating = parseFloat(rating); + if (rating < 0) + { + alert("You've already rated this download."); + return; + } + alert('Thanks for rating this download!'); + var div = star.parent('div'); + var stars = $('img', div); + rating = parseFloat(rating); + for (var i = 0; i < 5; ++i) + { + var s = $(stars[i]); + s.removeClass(s.attr('class')); + if (rating >= 1.0) + { + s.attr('src', '/static/icons/stars/rating_on.gif'); + s.addClass('on') + rating -= 1.0; + } + else if (rating >= 0.5) + { + s.attr('src', '/static/icons/stars/rating_half.gif'); + s.addClass('half') + rating = 0; + } + else + { + s.attr('src', '/static/icons/stars/rating_off.gif'); + s.addClass('off') + } + } + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + } +} + +$(document).ready(function() { + $('.rating').each(function(n) { + var div = $(this); + var id = div.attr('id'); + var numeric_id = -1; + if (id.match(/rating-(\d+)/)) + { + numeric_id = RegExp.$1; + } + var rating = div.html(); + div.html(''); + for (var i = 0; i < 5; ++i) + { + var star = $(''); + if (rating >= 1) + { + star.attr('src', '/static/icons/stars/rating_on.gif'); + star.addClass('on') + --rating; + } + else if (rating >= 0.5) + { + star.attr('src', '/static/icons/stars/rating_half.gif'); + star.addClass('half') + rating = 0; + } + else + { + star.attr('src', '/static/icons/stars/rating_off.gif'); + star.addClass('off') + } + star.attr('alt', 'star'); + star.attr('id', 'star-' + numeric_id + '-' + i); + star.bind('mouseover', i, dlRatingOver); + star.bind('mouseout', i, dlRatingOut); + star.click(dlRatingClick); + div.append(star); + } + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/templatetags/downloads_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/templatetags/downloads_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,17 @@ +""" +Template tags for the downloads application. +""" +from django import template + +from downloads.models import Download + + +register = template.Library() + + +@register.inclusion_tag('downloads/latest_tag.html') +def latest_downloads(): + downloads = Download.public_objects.order_by('-date_added')[:10] + return { + 'downloads': downloads, + } diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,19 @@ +""" +URLs for the downloads application. 
+""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('downloads.views', + url(r'^$', 'index', name='downloads-index'), + url(r'^add/$', 'add', name='downloads-add'), + url(r'^category/(?P[\w\d-]+)/(?Ptitle|date|rating|hits)/$', + 'category', + name='downloads-category'), + url(r'^details/(\d+)/$', 'details', name='downloads-details'), + url(r'^new/$', 'new', name='downloads-new'), + url(r'^popular/$', 'popular', name='downloads-popular'), + url(r'^request/$', 'request_download', name='downloads-request_download'), + url(r'^rate/$', 'rate_download', name='downloads-rate'), + url(r'^rating/$', 'rating', name='downloads-rating'), + url(r'^thanks/$', 'thanks', name='downloads-add_thanks'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b downloads/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/downloads/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,244 @@ +""" +Views for the downloads application. +""" +import random + +from django.shortcuts import render_to_response, get_object_or_404 +from django.template import RequestContext +from django.contrib.auth.decorators import login_required +from django.http import Http404 +from django.http import HttpResponse +from django.http import HttpResponseRedirect +from django.http import HttpResponseForbidden +from django.http import HttpResponseBadRequest +from django.http import HttpResponseNotFound +from django.core.paginator import InvalidPage +from django.core.urlresolvers import reverse +from django.db.models import Q +from django.views.decorators.http import require_POST +import django.utils.simplejson as json + +from core.paginator import DiggPaginator +from core.functions import email_admins +from core.functions import get_page +from downloads.models import Category +from downloads.models import Download +from downloads.models import VoteRecord +from downloads.forms import AddDownloadForm + +####################################################################### + +DLS_PER_PAGE = 10 + +def create_paginator(dls): + return DiggPaginator(dls, DLS_PER_PAGE, body=5, tail=3, margin=3, padding=2) + +####################################################################### + +@login_required +def index(request): + categories = Category.objects.all() + total_dls = Download.public_objects.all().count() + return render_to_response('downloads/index.html', { + 'categories': categories, + 'total_dls': total_dls, + }, + context_instance = RequestContext(request)) + +####################################################################### +# Maps URL component to database field name for the Download table: + +DOWNLOAD_FIELD_MAP = { + 'title': 'title', + 'date': '-date_added', + 'rating': '-average_score', + 'hits': '-hits' +} + +@login_required +def category(request, slug, sort='title'): + + cat = get_object_or_404(Category, slug=slug) + + if sort not in DOWNLOAD_FIELD_MAP: + sort = 'title' + order_by = DOWNLOAD_FIELD_MAP[sort] + + downloads = Download.public_objects.filter(category=cat.pk).order_by( + order_by) + paginator = create_paginator(downloads) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('downloads/download_list.html', { + 's' : sort, + 'category' : cat, + 'page' : the_page, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def new(request): + """Display new downloads with pagination.""" + + downloads = 
Download.public_objects.order_by('-date_added') + + paginator = create_paginator(downloads) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('downloads/download_summary.html', { + 'page': the_page, + 'title': 'Newest Downloads', + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def popular(request): + """Display popular downloads with pagination.""" + + downloads = Download.public_objects.order_by('-hits') + + paginator = create_paginator(downloads) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('downloads/download_summary.html', { + 'page': the_page, + 'title': 'Popular Downloads', + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def rating(request): + """Display downloads by rating with pagination.""" + + downloads = Download.public_objects.order_by('-average_score') + paginator = create_paginator(downloads) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('downloads/download_summary.html', { + 'page': the_page, + 'title': 'Highest Rated Downloads', + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def details(request, id): + download = get_object_or_404(Download.public_objects, pk=id) + return render_to_response('downloads/download_detail.html', { + 'download' : download, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def add(request): + if request.method == 'POST': + form = AddDownloadForm(request.POST, request.FILES) + if form.is_valid(): + dl = form.save(commit=False) + dl.user = request.user + dl.ip_address = request.META.get('REMOTE_ADDR', None) + dl.save() + email_admins('New download for approval', """Hello, + +A user has uploaded a new download for your approval. 
+""") + return HttpResponseRedirect(reverse('downloads-add_thanks')) + else: + form = AddDownloadForm() + + return render_to_response('downloads/add.html', { + 'add_form': form, + }, + context_instance=RequestContext(request)) + +####################################################################### + +@login_required +def thanks(request): + return render_to_response('downloads/thanks.html', { + }, + context_instance=RequestContext(request)) + +####################################################################### + +@require_POST +def rate_download(request): + """This function is called by AJAX to rate a download.""" + if request.user.is_authenticated(): + id = request.POST.get('id', None) + rating = request.POST.get('rating', None) + if id is None or rating is None: + return HttpResponseBadRequest('Missing id or rating.') + + try: + rating = int(rating) + except ValueError: + return HttpResponseBadRequest('Invalid rating.') + + # rating will be from 0-4 + rating = min(5, max(1, rating)) + + download = get_object_or_404(Download.public_objects, pk=id) + + # prevent multiple votes from the same user + vote_record, created = VoteRecord.objects.get_or_create( + download=download, user=request.user) + if created: + new_score = download.vote(rating) + download.save() + return HttpResponse(str(new_score)) + else: + return HttpResponse('-1') + + return HttpResponseForbidden('You must be logged in to rate a download.') + +####################################################################### + +@require_POST +def request_download(request): + """ + This function is called by AJAX to request a download. We update the hit + count and then return a JSON object of the form: + { id: download-id, 'url': link-to-download } + + """ + if request.user.is_authenticated(): + dl_id = request.POST.get('id') + if dl_id: + try: + dl = Download.public_objects.get(pk=dl_id) + except Download.DoesNotExist: + return HttpResponseNotFound("Download not found") + + dl.hits += 1 + dl.save() + + s = json.dumps({'id': dl_id, 'url': dl.file.url}) + return HttpResponse(s, content_type='application/json') + + return HttpResponseForbidden('An error occurred.') diff -r c525f3e0b5d0 -r ee87ea74d46b forums/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,2 @@ +import signals +import latest diff -r c525f3e0b5d0 -r ee87ea74d46b forums/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,109 @@ +""" +This file contains the admin definitions for the forums application. 
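Both AJAX views above are plain POST endpoints, so the contract the JavaScript relies on is easy to exercise with Django's test client. A sketch, assuming a test database with a public download and a registered user (credentials and ids are hypothetical):

    from django.test import Client

    c = Client()
    c.login(username='someuser', password='secret')   # hypothetical account

    # rate_download returns the new average as plain text, or '-1' on a repeat vote.
    resp = c.post('/downloads/rate/', {'id': 1, 'rating': 4})

    # request_download bumps the hit count and returns JSON: {"id": ..., "url": ...}.
    resp = c.post('/downloads/request/', {'id': 1})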
+""" +from django.contrib import admin + +from forums.models import Category +from forums.models import Forum +from forums.models import Topic +from forums.models import Post +from forums.models import FlaggedPost +from forums.models import ForumLastVisit +from forums.models import TopicLastVisit +from forums.signals import (notify_new_topic, notify_updated_topic, + notify_new_post, notify_updated_post) + +import bio.badges + + +class CategoryAdmin(admin.ModelAdmin): + list_display = ('name', 'position', ) + list_editable = ('position', ) + prepopulated_fields = { 'slug': ('name', ) } + save_on_top = True + + +class ForumAdmin(admin.ModelAdmin): + list_display = ('name', 'category', 'position', 'topic_count', 'post_count') + list_editable = ('position', ) + prepopulated_fields = { 'slug': ('name', ) } + raw_id_fields = ('last_post', ) + ordering = ('category', ) + save_on_top = True + + +class TopicAdmin(admin.ModelAdmin): + list_display = ('name', 'forum', 'creation_date', 'update_date', 'user', 'sticky', 'locked', + 'post_count') + raw_id_fields = ('user', 'last_post', 'subscribers', 'bookmarkers') + search_fields = ('name', ) + date_hierarchy = 'creation_date' + list_filter = ('creation_date', 'update_date', ) + save_on_top = True + + # override save_model() to update the search index + def save_model(self, request, obj, form, change): + obj.save() + + if change: + notify_updated_topic(obj) + else: + notify_new_topic(obj) + + +class PostAdmin(admin.ModelAdmin): + list_display = ('user', 'creation_date', 'update_date', 'user_ip', 'summary') + raw_id_fields = ('topic', 'user', ) + exclude = ('html', ) + search_fields = ('body', ) + date_hierarchy = 'creation_date' + list_filter = ('creation_date', 'update_date', ) + ordering = ('-creation_date', ) + save_on_top = True + + def queryset(self, request): + return Post.objects.select_related('user') + + # override save_model() to update the search index + def save_model(self, request, obj, form, change): + obj.save() + + if change: + notify_updated_post(obj) + else: + notify_new_post(obj) + + +class FlaggedPostAdmin(admin.ModelAdmin): + list_display = ['__unicode__', 'flag_date', 'get_post_url'] + actions = ['accept_flags'] + raw_id_fields = ['post', 'user', ] + + def accept_flags(self, request, qs): + """This admin action awards a security pin to the user who reported + the post and then deletes the flagged post object. + """ + for flag in qs: + bio.badges.award_badge(bio.badges.SECURITY_PIN, flag.user) + flag.delete() + + accept_flags.short_description = "Accept selected flagged posts" + + +class ForumLastVisitAdmin(admin.ModelAdmin): + raw_id_fields = ('user', 'forum') + list_display = ('user', 'forum', 'begin_date', 'end_date') + + +class TopicLastVisitAdmin(admin.ModelAdmin): + raw_id_fields = ('user', 'topic') + list_display = ('user', 'topic', 'last_visit') + + +admin.site.register(Category, CategoryAdmin) +admin.site.register(Forum, ForumAdmin) +admin.site.register(Topic, TopicAdmin) +admin.site.register(Post, PostAdmin) +admin.site.register(FlaggedPost, FlaggedPostAdmin) +admin.site.register(ForumLastVisit, ForumLastVisitAdmin) +admin.site.register(TopicLastVisit, TopicLastVisitAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/attachments.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/attachments.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,59 @@ +""" +This module contains a class for handling attachments on forum posts. 
+""" +from oembed.models import Oembed +from forums.models import Attachment + + +class AttachmentProcessor(object): + """ + This class is aggregated by various form classes to handle + attachments on forum posts. New posts can receive attachments and edited + posts can have their attachments replaced, augmented, or deleted. + + """ + def __init__(self, ids): + """ + This class is constructed with a list of Oembed ids. We retrieve the + actual Oembed objects associated with these keys for use in subsequent + operations. + + """ + # ensure all ids are integers + self.pks = [] + for pk in ids: + try: + pk = int(pk) + except ValueError: + continue + self.pks.append(pk) + + self.embeds = [] + if self.pks: + self.embeds = Oembed.objects.in_bulk(self.pks) + + def save_attachments(self, post): + """ + Create and save attachments to the supplied post object. + Any existing attachments on the post are removed first. + + """ + post.attachments.clear() + + for n, pk in enumerate(self.pks): + attachment = Attachment(post=post, embed=self.embeds[pk], order=n) + attachment.save() + + def has_attachments(self): + """ + Return true if we have valid pending attachments. + + """ + return len(self.embeds) > 0 + + def get_ids(self): + """ + Return the list of Oembed ids. + + """ + return self.pks diff -r c525f3e0b5d0 -r ee87ea74d46b forums/feeds.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/feeds.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,78 @@ +""" +This file contains the feed class for the forums application. + +""" +from django.contrib.syndication.views import Feed +from django.core.exceptions import ObjectDoesNotExist +from django.shortcuts import get_object_or_404 + +from forums.models import Forum, Topic, Post +from core.functions import copyright_str +from forums.latest import get_latest_posts + + +class ForumsFeed(Feed): + """The Feed class for a specific forum""" + + ttl = '60' + author_name = 'Brian Neal' + author_email = 'admin@surfguitar101.com' + + def get_object(self, request, forum_slug): + + if forum_slug: + forum = Forum.objects.get(slug=forum_slug) + # only return public forums + if forum.id not in Forum.objects.public_forum_ids(): + raise ObjectDoesNotExist + return forum + + else: + # return None to indicate we want a combined feed + return None + + def title(self, obj): + if obj is None: + forum_name = 'Combined' + else: + forum_name = obj.name + + return 'SurfGuitar101.com %s Forum Feed' % forum_name + + def link(self, obj): + if obj is None: + bits = '' + else: + bits = obj.slug + '/' + + return '/feeds/forums/' + bits + + def description(self, obj): + if obj is None: + return "User posts to SurfGuitar101.com forums." 
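AttachmentProcessor is driven entirely by the forum forms: they build it from the POSTed 'attachment' id list and call it after the post has been saved (see forums/forms.py later in this patch). A condensed sketch of that flow; the wrapper function is hypothetical:

    from forums.attachments import AttachmentProcessor

    def attach_embeds(request, post):
        # post is an already-saved forums.models.Post instance
        proc = AttachmentProcessor(request.POST.getlist('attachment'))
        if proc.has_attachments():
            proc.save_attachments(post)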
+ return obj.description + + def feed_copyright(self): + return copyright_str() + + def items(self, obj): + forum_id = obj.id if obj else None + return get_latest_posts(forum_id=forum_id) + + def item_title(self, item): + return item['title'] + + def item_description(self, item): + return item['content'] + + def item_author_name(self, item): + return item['author'] + + def item_pubdate(self, item): + return item['pubdate'] + + def item_categories(self, item): + return [item['forum_name']] + + def item_link(self, item): + return item['url'] diff -r c525f3e0b5d0 -r ee87ea74d46b forums/fixtures/forums.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/fixtures/forums.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,288 @@ +[ + { + "pk": 2, + "model": "auth.group", + "fields": { + "name": "Forum Moderators", + "permissions": [] + } + }, + { + "pk": 1, + "model": "forums.category", + "fields": { + "position": 0, + "name": "SurfGuitar101.com Site Specific", + "groups": [], + "slug": "surfguitar101com-site-specific" + } + }, + { + "pk": 2, + "model": "forums.category", + "fields": { + "position": 1, + "name": "Surf Music", + "groups": [], + "slug": "surf-music" + } + }, + { + "pk": 3, + "model": "forums.category", + "fields": { + "position": 2, + "name": "Classifieds", + "groups": [], + "slug": "classifieds" + } + }, + { + "pk": 4, + "model": "forums.category", + "fields": { + "position": 3, + "name": "Off-Topic", + "groups": [], + "slug": "off-topic" + } + }, + { + "pk": 14, + "model": "forums.forum", + "fields": { + "category": 1, + "description": "For general discussion about this site only, including news and rules. Start here. Anything relating to surf music should go to the Surf Music General Discussion forum, below.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 0, + "last_post": null, + "slug": "surfguitar101-website", + "name": "SurfGuitar101 Website" + } + }, + { + "pk": 2, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "Main surf music discussion forum. Insert glissando sound here.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 0, + "last_post": null, + "slug": "surf-music", + "name": "Surf Music General Discussion" + } + }, + { + "pk": 3, + "model": "forums.forum", + "fields": { + "category": 3, + "description": "For sale and trading of surf music related items only.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 0, + "last_post": null, + "slug": "for-sale-trade", + "name": "For Sale / Trade" + } + }, + { + "pk": 4, + "model": "forums.forum", + "fields": { + "category": 4, + "description": "General off-topic chit-chat. Grab a cool drink and hop in. New members please introduce yourselves here. This forum is dedicated to the memory of Rip Thrillby and Spanky Twangler.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 0, + "last_post": null, + "slug": "shallow-end", + "name": "The Shallow End" + } + }, + { + "pk": 6, + "model": "forums.forum", + "fields": { + "category": 3, + "description": "Need someone to play with? Starting a band? Need a gig? 
Post here.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 1, + "last_post": null, + "slug": "musicians-gigs-wanted", + "name": "Musicians & Gigs Wanted" + } + }, + { + "pk": 8, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "Please post show announcements here.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 1, + "last_post": null, + "slug": "gigs", + "name": "Show Announcements" + } + }, + { + "pk": 9, + "model": "forums.forum", + "fields": { + "category": 1, + "description": "Got an idea for the site? Something not working? Post here.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 1, + "last_post": null, + "slug": "suggestion-box", + "name": "Suggestion Box" + } + }, + { + "pk": 5, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "Playing, performing, and writing surf music. All instruments welcome.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 2, + "last_post": null, + "slug": "surf-musician", + "name": "Surf Musician" + } + }, + { + "pk": 10, + "model": "forums.forum", + "fields": { + "category": 1, + "description": "Feedback, suggestions, playlists, and discussions about the SurfGuitar101 podcast.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 2, + "last_post": null, + "slug": "sg101-podcast", + "name": "SG101 Podcast" + } + }, + { + "pk": 7, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "For questions and discussions about instruments, amplifiers, and yes, outboard reverb units!", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 3, + "last_post": null, + "slug": "gear", + "name": "Gear" + } + }, + { + "pk": 11, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "For discussion of recording techniques.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 4, + "last_post": null, + "slug": "recording-corner", + "name": "Recording Corner" + } + }, + { + "pk": 12, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "Got a link to a surf or surf-related video? Post it here.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 5, + "last_post": null, + "slug": "surf-videos", + "name": "Surf Videos" + } + }, + { + "pk": 13, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "Please post your reviews of surf music releases here.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 6, + "last_post": null, + "slug": "music-reviews", + "name": "Music Reviews" + } + }, + { + "pk": 16, + "model": "forums.forum", + "fields": { + "category": 2, + "description": "This forum contains some classic and important threads from our history, preserved here for historical reasons! These threads are still live, so please keep posting to them.", + "post_count": 0, + "topic_count": 0, + "moderators": [ + 2 + ], + "position": 7, + "last_post": null, + "slug": "best-sg101", + "name": "Best-Of SG101" + } + } +] diff -r c525f3e0b5d0 -r ee87ea74d46b forums/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,248 @@ +""" +Forms for the forums application. 
+ +""" +from django import forms +from django.conf import settings + +from forums.models import Forum +from forums.models import Topic +from forums.models import Post +from forums.attachments import AttachmentProcessor +import forums.permissions as perms +from forums.signals import notify_new_topic, notify_new_post + + +class NewPostForm(forms.Form): + """Form for creating a new post.""" + body = forms.CharField(label='', + required=False, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + topic_id = forms.IntegerField(widget=forms.HiddenInput) + topic = None + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']), + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + ['js/forums.js']) + + def __init__(self, *args, **kwargs): + super(NewPostForm, self).__init__(*args, **kwargs) + attachments = args[0].getlist('attachment') if len(args) else [] + self.attach_proc = AttachmentProcessor(attachments) + + def clean_body(self): + data = self.cleaned_data['body'] + if not data and not self.attach_proc.has_attachments(): + raise forms.ValidationError("This field is required.") + return data + + def clean_topic_id(self): + id = self.cleaned_data['topic_id'] + try: + self.topic = Topic.objects.select_related().get(pk=id) + except Topic.DoesNotExist: + raise forms.ValidationError('invalid topic') + return id + + def save(self, user, ip=None): + """ + Creates a new post from the form data and supplied arguments. + """ + post = Post(topic=self.topic, user=user, body=self.cleaned_data['body'], + user_ip=ip) + post.save() + self.attach_proc.save_attachments(post) + notify_new_post(post) + return post + + +class NewTopicForm(forms.Form): + """ + Form for creating a new topic and 1st post to that topic. + Superusers and moderators can also create the topic as a sticky or initially + locked. + """ + name = forms.CharField(label='Subject', max_length=255, + widget=forms.TextInput(attrs={'size': 64})) + body = forms.CharField(label='', required=False, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + user = None + forum = None + has_mod_fields = False + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']), + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + ['js/forums.js']) + + def __init__(self, user, forum, *args, **kwargs): + super(NewTopicForm, self).__init__(*args, **kwargs) + self.user = user + self.forum = forum + + if perms.can_moderate(forum, user): + self.fields['sticky'] = forms.BooleanField(required=False) + self.fields['locked'] = forms.BooleanField(required=False) + self.has_mod_fields = True + + attachments = args[0].getlist('attachment') if len(args) else [] + self.attach_proc = AttachmentProcessor(attachments) + + # If this form is being POSTed, and the user is trying to add + # attachments, create hidden fields to list the Oembed ids. In + # case the form isn't valid, the client-side javascript will know + # which Oembed media to ask for when the form is displayed with + # errors. 
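NewPostForm above is what the quick-reply AJAX in js/forums.js (later in this patch) ultimately feeds: body, topic_id and any attachment ids arrive in a single POST, and save() wants the posting user and their IP. A minimal sketch of a view wired that way; the view itself is hypothetical and is not the project's actual quick-reply view:

    from django.http import HttpResponse, HttpResponseForbidden
    from django.views.decorators.http import require_POST
    from forums.forms import NewPostForm

    @require_POST
    def quick_reply_sketch(request):
        if not request.user.is_authenticated():
            return HttpResponseForbidden()
        form = NewPostForm(request.POST)
        if not form.is_valid():
            return HttpResponse('Invalid post', status=400)
        post = form.save(request.user, ip=request.META.get('REMOTE_ADDR'))
        return HttpResponse(post.html)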
+ if self.attach_proc.has_attachments(): + pks = self.attach_proc.get_ids() + self.fields['attachment'] = forms.MultipleChoiceField(label='', + widget=forms.MultipleHiddenInput(), + choices=[(v, v) for v in pks]) + + def clean_body(self): + data = self.cleaned_data['body'] + if not data and not self.attach_proc.has_attachments(): + raise forms.ValidationError("This field is required.") + return data + + def save(self, ip=None): + """ + Creates the new Topic and first Post from the form data and supplied + arguments. + """ + topic = Topic(forum=self.forum, + name=self.cleaned_data['name'], + user=self.user, + sticky=self.has_mod_fields and self.cleaned_data['sticky'], + locked=self.has_mod_fields and self.cleaned_data['locked']) + topic.save() + + post = Post(topic=topic, + user=self.user, + body=self.cleaned_data['body'], + user_ip=ip) + post.save() + + self.attach_proc.save_attachments(post) + + notify_new_topic(topic) + notify_new_post(post) + + return topic + + +class PostForm(forms.ModelForm): + """ + Form for editing an existing post or a new, non-quick post. + """ + body = forms.CharField(label='', + required=False, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + + class Meta: + model = Post + fields = ('body', ) + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']), + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + ['js/forums.js']) + + def __init__(self, *args, **kwargs): + topic_name = kwargs.pop('topic_name', None) + super(PostForm, self).__init__(*args, **kwargs) + + if topic_name is not None: # this is a "first post" + self.fields.insert(0, 'name', forms.CharField(label='Subject', + max_length=255, + widget=forms.TextInput(attrs={'size': 64}))) + self.initial['name'] = topic_name + + attachments = args[0].getlist('attachment') if len(args) else [] + self.attach_proc = AttachmentProcessor(attachments) + + # If this form is being used to edit an existing post, and that post + # has attachments, create a hidden post_id field. The client-side + # AJAX will use this as a cue to retrieve the HTML for the embedded + # media. + if 'instance' in kwargs: + post = kwargs['instance'] + if post.attachments.count(): + self.fields['post_id'] = forms.CharField(label='', + widget=forms.HiddenInput(attrs={'value': post.id})) + + def clean_body(self): + data = self.cleaned_data['body'] + if not data and not self.attach_proc.has_attachments(): + raise forms.ValidationError('This field is required.') + return data + + def save(self, *args, **kwargs): + commit = kwargs.get('commit', False) + post = super(PostForm, self).save(*args, **kwargs) + + # Are we saving a "first post"? + if 'name' in self.cleaned_data: + post.topic.name = self.cleaned_data['name'] + if commit: + post.topic.save() + return post + + +class MoveTopicForm(forms.Form): + """ + Form for a moderator to move a topic to a forum. 
+ """ + forums = forms.ModelChoiceField(label='Move to forum', + queryset=Forum.objects.none()) + + def __init__(self, user, *args, **kwargs): + hide_label = kwargs.pop('hide_label', False) + required = kwargs.pop('required', True) + super(MoveTopicForm, self).__init__(*args, **kwargs) + self.fields['forums'].queryset = \ + Forum.objects.forums_for_user(user).order_by('name') + if hide_label: + self.fields['forums'].label = '' + self.fields['forums'].required = required + + +class SplitTopicForm(forms.Form): + """ + Form for a moderator to split posts from a topic to a new topic. + """ + name = forms.CharField(label='New topic title', max_length=255, + widget=forms.TextInput(attrs={'size': 64})) + forums = forms.ModelChoiceField(label='Forum for new topic', + queryset=Forum.objects.none()) + post_ids = [] + split_at = False + + def __init__(self, user, *args, **kwargs): + super(SplitTopicForm, self).__init__(*args, **kwargs) + self.fields['forums'].queryset = \ + Forum.objects.forums_for_user(user).order_by('name') + + def clean(self): + self.post_ids = self.data.getlist('post_ids') + if len(self.post_ids) == 0: + raise forms.ValidationError('Please select some posts') + + self.split_at = 'split-at' in self.data + if self.split_at and len(self.post_ids) > 1: + raise forms.ValidationError('Please select only one post to split the topic at') + + return self.cleaned_data diff -r c525f3e0b5d0 -r ee87ea74d46b forums/latest.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/latest.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,342 @@ +""" +This module maintains the latest posts datastore. The latest posts are often +needed by RSS feeds, "latest posts" template tags, etc. This module listens for +the post_content_update signal, then bundles the post up and stores it by forum +ID in Redis. We also maintain a combined forums list. This allows quick +retrieval of the latest posts and avoids some slow SQL queries. + +We also do things like send topic notification emails, auto-favorite, and +auto-subscribe functions here rather than bog the user down in the request / +response cycle. + +""" +import datetime +import logging +import time + +from django.dispatch import receiver +from django.utils import simplejson +import redis + +from forums.signals import post_content_update, topic_content_update +from forums.models import Forum, Topic, Post +from forums.views.subscriptions import notify_topic_subscribers +from forums.tools import auto_favorite, auto_subscribe +from core.services import get_redis_connection + +# This constant controls how many latest posts per forum we store +MAX_POSTS = 50 + +# This controls how many updated topics we track +MAX_UPDATED_TOPICS = 50 + +# Redis key names: +POST_COUNT_KEY = "forums:public_post_count" +TOPIC_COUNT_KEY = "forums:public_topic_count" +UPDATED_TOPICS_SET_KEY = "forums:updated_topics:set" +UPDATED_TOPIC_KEY = "forums:updated_topics:%s" + +logger = logging.getLogger(__name__) + + +@receiver(post_content_update, dispatch_uid='forums.latest_posts') +def on_post_update(sender, **kwargs): + """ + This function is our signal handler, called when a post has been updated. + We only care about newly created posts, and ignore updates. + + We kick off a Celery task to perform work outside of the request/response + cycle. 
+ + """ + # ignore non-new posts + if not kwargs['created']: + return + + # Kick off a Celery task to process this new post + forums.tasks.new_post_task.delay(sender.id) + + +def process_new_post(post_id): + """ + This function is run on a Celery task. It performs all new-post processing. + + """ + try: + post = Post.objects.select_related().get(pk=post_id) + except Post.DoesNotExist: + logger.warning("process_new_post: post %d does not exist", post_id) + return + + # selectively process posts from non-public forums + public_forums = Forum.objects.public_forum_ids() + + if post.topic.forum.id in public_forums: + conn = get_redis_connection() + _update_post_feeds(conn, post) + _update_post_count(conn, public_forums) + _update_latest_topics(conn, post) + + # send out any email notifications + notify_topic_subscribers(post, defer=False) + + # perform any auto-favorite and auto-subscribe actions for the new post + auto_favorite(post) + auto_subscribe(post) + + +def _update_post_feeds(conn, post): + """ + Updates the forum feeds we keep in Redis so that our RSS feeds are quick. + + """ + # serialize post attributes + post_content = { + 'id': post.id, + 'title': post.topic.name, + 'content': post.html, + 'author': post.user.username, + 'pubdate': int(time.mktime(post.creation_date.timetuple())), + 'forum_name': post.topic.forum.name, + 'url': post.get_absolute_url() + } + + s = simplejson.dumps(post_content) + + # store in Redis + + pipeline = conn.pipeline() + + key = 'forums:latest:%d' % post.topic.forum.id + + pipeline.lpush(key, s) + pipeline.ltrim(key, 0, MAX_POSTS - 1) + + # store in the combined feed; yes this wastes some memory storing it twice, + # but it makes things much easier + + key = 'forums:latest:*' + + pipeline.lpush(key, s) + pipeline.ltrim(key, 0, MAX_POSTS - 1) + + pipeline.execute() + + +def _update_post_count(conn, public_forums): + """ + Updates the post count we cache in Redis. Doing a COUNT(*) on the post table + can be expensive in MySQL InnoDB. + + """ + result = conn.incr(POST_COUNT_KEY) + if result == 1: + # it is likely redis got trashed, so re-compute the correct value + + count = Post.objects.filter(topic__forum__in=public_forums).count() + conn.set(POST_COUNT_KEY, count) + + +def _update_latest_topics(conn, post): + """ + Updates the "latest topics with new posts" list we cache in Redis for speed. + There is a template tag and forum view that uses this information. 
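on_post_update() above, and the matching topic handler further below, defer the real work to forums.tasks, which is referenced here but not included in this changeset. Judging from the .delay(sender.id) calls and the process_new_post()/process_new_topic() entry points, the tasks are thin wrappers; the sketch below is an assumption about that module, using Celery's task decorator:

    # Hypothetical forums/tasks.py: thin Celery wrappers around the processing
    # functions in forums/latest.py, keeping the work off the request cycle.
    from celery import task

    import forums.latest

    @task
    def new_post_task(post_id):
        forums.latest.process_new_post(post_id)

    @task
    def new_topic_task(topic_id):
        forums.latest.process_new_topic(topic_id)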
+ + """ + # serialize topic attributes + topic_id = post.topic.id + topic_score = int(time.mktime(post.creation_date.timetuple())) + + topic_content = { + 'title': post.topic.name, + 'author': post.user.username, + 'date': topic_score, + 'url': post.topic.get_latest_post_url() + } + json = simplejson.dumps(topic_content) + key = UPDATED_TOPIC_KEY % topic_id + + pipeline = conn.pipeline() + pipeline.set(key, json) + pipeline.zadd(UPDATED_TOPICS_SET_KEY, topic_score, topic_id) + pipeline.zcard(UPDATED_TOPICS_SET_KEY) + results = pipeline.execute() + + # delete topics beyond our maximum count + num_topics = results[-1] + num_to_del = num_topics - MAX_UPDATED_TOPICS + if num_to_del > 0: + # get the IDs of the topics we need to delete first + start = 0 + stop = num_to_del - 1 # Redis indices are inclusive + old_ids = conn.zrange(UPDATED_TOPICS_SET_KEY, start, stop) + + keys = [UPDATED_TOPIC_KEY % n for n in old_ids] + conn.delete(*keys) + + # now delete the oldest num_to_del topics + conn.zremrangebyrank(UPDATED_TOPICS_SET_KEY, start, stop) + + +def get_latest_posts(num_posts=MAX_POSTS, forum_id=None): + """ + This function retrieves num_posts latest posts for the forum with the given + forum_id. If forum_id is None, the posts are retrieved from the combined + forums datastore. A list of dictionaries is returned. Each dictionary + contains information about a post. + + """ + key = 'forums:latest:%d' % forum_id if forum_id else 'forums:latest:*' + + num_posts = max(0, min(MAX_POSTS, num_posts)) + + if num_posts == 0: + return [] + + conn = get_redis_connection() + raw_posts = conn.lrange(key, 0, num_posts - 1) + + posts = [] + for raw_post in raw_posts: + post = simplejson.loads(raw_post) + + # fix up the pubdate; turn it back into a datetime object + post['pubdate'] = datetime.datetime.fromtimestamp(post['pubdate']) + + posts.append(post) + + return posts + + +@receiver(topic_content_update, dispatch_uid='forums.latest_posts') +def on_topic_update(sender, **kwargs): + """ + This function is our signal handler, called when a topic has been updated. + We only care about newly created topics, and ignore updates. + + We kick off a Celery task to perform work outside of the request/response + cycle. + + """ + # ignore non-new topics + if not kwargs['created']: + return + + # Kick off a Celery task to process this new post + forums.tasks.new_topic_task.delay(sender.id) + + +def process_new_topic(topic_id): + """ + This function contains new topic processing. Currently we only update the + topic count statistic. + + """ + try: + topic = Topic.objects.select_related().get(pk=topic_id) + except Topic.DoesNotExist: + logger.warning("process_new_topic: topic %d does not exist", topic_id) + return + + # selectively process topics from non-public forums + public_forums = Forum.objects.public_forum_ids() + + if topic.forum.id not in public_forums: + return + + # update the topic count statistic + conn = get_redis_connection() + + result = conn.incr(TOPIC_COUNT_KEY) + if result == 1: + # it is likely redis got trashed, so re-compute the correct value + + count = Topic.objects.filter(forum__in=public_forums).count() + conn.set(TOPIC_COUNT_KEY, count) + + +def get_stats(): + """ + This function returns the topic and post count statistics as a tuple, in + that order. If a statistic is not available, its position in the tuple will + be None. 
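Feed data lands in Redis as JSON strings in per-forum lists plus the combined 'forums:latest:*' list, so consumers never hit SQL. Retrieval goes through get_latest_posts(); a short usage sketch, assuming Redis is reachable through get_redis_connection() (the forum id value is hypothetical):

    from forums.latest import get_latest_posts

    # Five newest posts across all public forums (the combined feed)
    for post in get_latest_posts(5):
        print post['title'], post['author'], post['pubdate']

    # Or restrict to a single forum by id
    recent = get_latest_posts(10, forum_id=2)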
+ + """ + try: + conn = get_redis_connection() + result = conn.mget(TOPIC_COUNT_KEY, POST_COUNT_KEY) + except redis.RedisError, e: + logger.error(e) + return (None, None) + + topic_count = int(result[0]) if result[0] else None + post_count = int(result[1]) if result[1] else None + + return (topic_count, post_count) + + +def get_latest_topic_ids(num): + """ + Return a list of topic ids from the latest topics that have posts. The ids + will be sorted from newest to oldest. + + """ + try: + conn = get_redis_connection() + result = conn.zrevrange(UPDATED_TOPICS_SET_KEY, 0, num - 1) + except redis.RedisError, e: + logger.error(e) + return [] + + return [int(n) for n in result] + + +def get_latest_topics(num): + """ + Return a list of dictionaries with information about the latest topics that + have updated posts. The topics are sorted from newest to oldest. + + """ + try: + conn = get_redis_connection() + result = conn.zrevrange(UPDATED_TOPICS_SET_KEY, 0, num - 1) + + topic_keys = [UPDATED_TOPIC_KEY % n for n in result] + json_list = conn.mget(topic_keys) if topic_keys else [] + + except redis.RedisError, e: + logger.error(e) + return [] + + topics = [] + for s in json_list: + item = simplejson.loads(s) + item['date'] = datetime.datetime.fromtimestamp(item['date']) + topics.append(item) + + return topics + + +def notify_topic_delete(topic): + """ + This function should be called when a topic is deleted. It will remove the + topic from the updated topics set, if present, and delete any info we have + about the topic. + + Note we don't do anything like this for posts. Since they just populate RSS + feeds we'll let them 404. The updated topic list is seen in a prominent + template tag however, so it is a bit more important to get that cleaned up. + + """ + try: + conn = get_redis_connection() + pipeline = conn.pipeline() + pipeline.zrem(UPDATED_TOPICS_SET_KEY, topic.id) + pipeline.delete(UPDATED_TOPIC_KEY % topic.id) + pipeline.execute() + except redis.RedisError, e: + logger.error(e) + + +# Down here to avoid a circular import +import forums.tasks diff -r c525f3e0b5d0 -r ee87ea74d46b forums/management/commands/forum_cleanup.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/management/commands/forum_cleanup.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,26 @@ +""" +forum_cleanup.py - A management command to cleanup forum model objects. Right +now this entails deleting old forum and topic last visit records. + +""" +import datetime + +from django.core.management.base import NoArgsCommand, CommandError + +from forums.models import ForumLastVisit, TopicLastVisit +import forums.unread + + +class Command(NoArgsCommand): + help = "This command deletes old forum and topic last visit records." + + def handle_noargs(self, **opts): + + now = datetime.datetime.now() + threshold = now - forums.unread.THRESHOLD * 2 + + # delete old topic last visit records + TopicLastVisit.objects.filter(last_visit__lt=threshold).delete() + + # delete old forum visit records + ForumLastVisit.objects.filter(end_date__lt=threshold).delete() diff -r c525f3e0b5d0 -r ee87ea74d46b forums/management/commands/sync_forums.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/management/commands/sync_forums.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,42 @@ +""" +sync_forums.py - A management command to synchronize the forums by recomputing +the de-normalized fields in the forum and topic objects. 
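forum_cleanup is housekeeping, so it will normally be driven by cron or a scheduler; it can also be invoked programmatically, which is convenient in tests. A sketch, assuming a configured Django settings module:

    # Runs the same code path as `manage.py forum_cleanup`.
    from django.core.management import call_command

    call_command('forum_cleanup')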
+ +""" +import optparse + +from django.core.management.base import NoArgsCommand, CommandError + +from forums.models import Forum +from forums.models import Topic + + +class Command(NoArgsCommand): + help = """\ +This command synchronizes the forum application's forums and topic objects +by updating their de-normalized fields. +""" + option_list = NoArgsCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 50 topics to show progress"), + ) + + def handle_noargs(self, **opts): + + show_progress = opts.get('progress', False) or False + + n = 0 + for topic in Topic.objects.iterator(): + topic.post_count_update() + topic.save() + n += 1 + if n % 50 == 0: + self.stdout.write('.') + self.stdout.flush() + + for forum in Forum.objects.all(): + forum.sync() + forum.save() + + self.stdout.write('\n') + diff -r c525f3e0b5d0 -r ee87ea74d46b forums/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,420 @@ +""" +Models for the forums application. +""" +import datetime + +from django.db import models +from django.db.models import Q +from django.contrib.auth.models import User, Group +from django.core.cache import cache + +from core.markup import site_markup +from oembed.models import Oembed + + +class Category(models.Model): + """ + Forums belong to a category, whose access may be assigned to groups. + """ + name = models.CharField(max_length=80) + slug = models.SlugField(max_length=80) + position = models.IntegerField(blank=True, default=0) + groups = models.ManyToManyField(Group, blank=True, null=True, + help_text="If groups are assigned to this category, only members" \ + " of those groups can view this category.") + + class Meta: + ordering = ('position', ) + verbose_name_plural = 'Categories' + + def __unicode__(self): + return self.name + + +class ForumManager(models.Manager): + """ + The manager for the Forum model. Provides a centralized place to + put commonly used and useful queries. + """ + + def forums_for_user(self, user): + """ + Returns a queryset containing the forums that the given user can + "see" due to authenticated status, superuser status and group membership. + """ + qs = self._for_user(user) + return qs.select_related('category', 'last_post', 'last_post__user') + + def forum_ids_for_user(self, user): + """Returns a list of forum IDs that the given user can "see".""" + qs = self._for_user(user) + return qs.values_list('id', flat=True) + + def public_forums(self): + """Returns a queryset containing the public forums.""" + return self.filter(category__groups__isnull=True) + + def public_forum_ids(self): + """ + Returns a list of ids for the public forums; the list is cached for + performance. + """ + public_forums = cache.get('public_forum_ids') + if public_forums is None: + public_forums = list(self.filter( + category__groups__isnull=True).values_list('id', flat=True)) + cache.set('public_forum_ids', public_forums, 3600) + return public_forums + + def _for_user(self, user): + """Common code for the xxx_for_user() methods.""" + if user.is_superuser: + qs = self.all() + else: + user_groups = user.groups.all() if user.is_authenticated() else [] + qs = self.filter(Q(category__groups__isnull=True) | + Q(category__groups__in=user_groups)) + return qs + + +class Forum(models.Model): + """ + A forum is a collection of topics. 
+ """ + category = models.ForeignKey(Category, related_name='forums') + name = models.CharField(max_length=80) + slug = models.SlugField(max_length=80) + description = models.TextField(blank=True, default='') + position = models.IntegerField(blank=True, default=0) + moderators = models.ManyToManyField(Group, blank=True, null=True) + + # denormalized fields to reduce database hits + topic_count = models.IntegerField(blank=True, default=0) + post_count = models.IntegerField(blank=True, default=0) + last_post = models.OneToOneField('Post', blank=True, null=True, + related_name='parent_forum') + + objects = ForumManager() + + class Meta: + ordering = ('position', ) + + def __unicode__(self): + return self.name + + @models.permalink + def get_absolute_url(self): + return ('forums-forum_index', [self.slug]) + + def topic_count_update(self): + """Call to notify the forum that its topic count has been updated.""" + self.topic_count = Topic.objects.filter(forum=self).count() + + def post_count_update(self): + """Call to notify the forum that its post count has been updated.""" + my_posts = Post.objects.filter(topic__forum=self) + self.post_count = my_posts.count() + if self.post_count > 0: + self.last_post = my_posts[self.post_count - 1] + else: + self.last_post = None + + def sync(self): + """ + Call to notify the forum that it needs to recompute its + denormalized fields. + """ + self.topic_count_update() + self.post_count_update() + + def last_post_pre_delete(self, deleting_topic=False): + """ + Call this function prior to deleting the last post in the forum. + A new last post will be found, if one exists. + This is to avoid the Django cascading delete issue. + If deleting_topic is True, then the whole topic the last post is + part of is being deleted, so we can't pick a new last post from that + topic. + """ + try: + qs = Post.objects.filter(topic__forum=self) + if deleting_topic: + qs = qs.exclude(topic=self.last_post.topic) + else: + qs = qs.exclude(pk=self.last_post.pk) + + self.last_post = qs.latest() + + except Post.DoesNotExist: + self.last_post = None + + def catchup(self, user, flv=None): + """ + Call to mark this forum all caught up for the given user (i.e. mark all topics + read for this user). + """ + TopicLastVisit.objects.filter(user=user, topic__forum=self).delete() + if flv is None: + try: + flv = ForumLastVisit.objects.get(user=user, forum=self) + except ForumLastVisit.DoesNotExist: + flv = ForumLastVisit(user=user, forum=self) + + now = datetime.datetime.now() + flv.begin_date = now + flv.end_date = now + flv.save() + + +class Topic(models.Model): + """ + A topic is a thread of discussion, consisting of a series of posts. 
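Forum.last_post is a OneToOneField onto Post, so deleting the referenced post would cascade into the forum row; last_post_pre_delete() exists to repoint that reference first. The call order the docstring implies looks roughly like this; the wrapper function is hypothetical, and Topic has an analogous helper shown just below:

    def delete_post_safely(post):
        # Repoint the forum's denormalized last_post before deleting the post.
        forum = post.topic.forum
        if forum.last_post_id == post.id:
            forum.last_post_pre_delete()
            forum.save()
        post.delete()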
+ """ + forum = models.ForeignKey(Forum, related_name='topics') + name = models.CharField(max_length=255) + creation_date = models.DateTimeField(db_index=True) + user = models.ForeignKey(User) + view_count = models.IntegerField(blank=True, default=0) + sticky = models.BooleanField(blank=True, default=False) + locked = models.BooleanField(blank=True, default=False) + subscribers = models.ManyToManyField(User, related_name='subscriptions', + verbose_name='subscribers', blank=True) + bookmarkers = models.ManyToManyField(User, related_name='favorite_topics', + verbose_name='bookmarkers', blank=True) + + # denormalized fields to reduce database hits + post_count = models.IntegerField(blank=True, default=0) + update_date = models.DateTimeField(db_index=True) + last_post = models.OneToOneField('Post', blank=True, null=True, + related_name='parent_topic') + + class Meta: + ordering = ('-sticky', '-update_date', ) + + def __unicode__(self): + return self.name + + @models.permalink + def get_absolute_url(self): + return ('forums-topic_index', [self.pk]) + + @models.permalink + def get_latest_post_url(self): + return ('forums-topic_latest', [self.pk]) + + def post_count_update(self): + """ + Call this function to notify the topic instance that its post count + has changed. + """ + my_posts = Post.objects.filter(topic=self) + self.post_count = my_posts.count() + if self.post_count > 0: + self.last_post = my_posts[self.post_count - 1] + self.update_date = self.last_post.creation_date + else: + self.last_post = None + self.update_date = self.creation_date + + def reply_count(self): + """ + Returns the number of replies to a topic. The first post + doesn't count as a reply. + """ + if self.post_count > 1: + return self.post_count - 1 + return 0 + + def save(self, *args, **kwargs): + if not self.id: + now = datetime.datetime.now() + self.creation_date = now + self.update_date = now + + super(Topic, self).save(*args, **kwargs) + + def last_post_pre_delete(self): + """ + Call this function prior to deleting the last post in the topic. + A new last post will be found, if one exists. + This is to avoid the Django cascading delete issue. + """ + try: + self.last_post = \ + Post.objects.filter(topic=self).exclude(pk=self.last_post.pk).latest() + except Post.DoesNotExist: + self.last_post = None + + def search_title(self): + if self.post_count == 1: + post_text = "(1 post)" + else: + post_text = "(%d posts)" % self.post_count + + return u"%s by %s; %s" % (self.name, self.user.username, post_text) + + def search_summary(self): + return u'' + + def ogp_tags(self): + """ + Returns a dict of Open Graph Protocol meta tags. + + """ + desc = 'Forum topic created by %s on %s.' % ( + self.user.username, + self.creation_date.strftime('%B %d, %Y')) + + return { + 'og:title': self.name, + 'og:type': 'article', + 'og:url': self.get_absolute_url(), + 'og:description': desc, + } + + +class Post(models.Model): + """ + A post is an instance of a user's single contribution to a topic. 
+ """ + topic = models.ForeignKey(Topic, related_name='posts') + user = models.ForeignKey(User, related_name='posts') + creation_date = models.DateTimeField(db_index=True) + update_date = models.DateTimeField(db_index=True) + body = models.TextField() + html = models.TextField() + user_ip = models.IPAddressField(blank=True, default='', null=True) + attachments = models.ManyToManyField(Oembed, through='Attachment') + + class Meta: + ordering = ('creation_date', ) + get_latest_by = 'creation_date' + verbose_name = 'forum post' + verbose_name_plural = 'forum posts' + + @models.permalink + def get_absolute_url(self): + return ('forums-goto_post', [self.pk]) + + def summary(self): + limit = 65 + if len(self.body) < limit: + return self.body + return self.body[:limit] + '...' + + def __unicode__(self): + return self.summary() + + def save(self, *args, **kwargs): + if not self.id: + self.creation_date = datetime.datetime.now() + self.update_date = self.creation_date + + self.html = site_markup(self.body) + super(Post, self).save(*args, **kwargs) + + def delete(self, *args, **kwargs): + first_post_id = self.topic.posts.all()[0].id + super(Post, self).delete(*args, **kwargs) + if self.id == first_post_id: + self.topic.delete() + + def has_been_edited(self): + return self.update_date > self.creation_date + + def touch(self): + """Call this function to indicate the post has been edited.""" + self.update_date = datetime.datetime.now() + + def search_title(self): + return u"%s by %s" % (self.topic.name, self.user.username) + + def search_summary(self): + return self.body + + +class FlaggedPost(models.Model): + """This model represents a user flagging a post as inappropriate.""" + user = models.ForeignKey(User) + post = models.ForeignKey(Post) + flag_date = models.DateTimeField(auto_now_add=True) + + def __unicode__(self): + return u'Post ID %s flagged by %s' % (self.post.id, self.user.username) + + class Meta: + ordering = ('flag_date', ) + + def get_post_url(self): + return 'Post' % self.post.get_absolute_url() + get_post_url.allow_tags = True + + +class ForumLastVisit(models.Model): + """ + This model records the last time a user visited a forum. + It is used to compute if a user has unread topics in a forum. + We keep track of a window of time, delimited by begin_date and end_date. + Topics updated within this window are tracked, and may have TopicLastVisit + objects. + Marking a forum as all read sets the begin_date equal to the end_date. + """ + user = models.ForeignKey(User) + forum = models.ForeignKey(Forum) + begin_date = models.DateTimeField() + end_date = models.DateTimeField() + + class Meta: + unique_together = ('user', 'forum') + ordering = ('-end_date', ) + + def __unicode__(self): + return u'Forum: %d User: %d Date: %s' % (self.forum.id, self.user.id, + self.end_date.strftime('%Y-%m-%d %H:%M:%S')) + + def is_caught_up(self): + return self.begin_date == self.end_date + + +class TopicLastVisit(models.Model): + """ + This model records the last time a user read a topic. + Objects of this class exist for the window specified in the + corresponding ForumLastVisit object. 
+ """ + user = models.ForeignKey(User) + topic = models.ForeignKey(Topic) + last_visit = models.DateTimeField(db_index=True) + + class Meta: + unique_together = ('user', 'topic') + ordering = ('-last_visit', ) + + def __unicode__(self): + return u'Topic: %d User: %d Date: %s' % (self.topic.id, self.user.id, + self.last_visit.strftime('%Y-%m-%d %H:%M:%S')) + + def save(self, *args, **kwargs): + if self.last_visit is None: + self.touch() + super(TopicLastVisit, self).save(*args, **kwargs) + + def touch(self): + self.last_visit = datetime.datetime.now() + + +class Attachment(models.Model): + """ + This model is a "through" table for the M2M relationship between forum + posts and Oembed objects. + """ + post = models.ForeignKey(Post) + embed = models.ForeignKey(Oembed) + order = models.IntegerField() + + class Meta: + ordering = ('order', ) + + def __unicode__(self): + return u'Post %d, %s' % (self.post.pk, self.embed.title) + diff -r c525f3e0b5d0 -r ee87ea74d46b forums/permissions.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/permissions.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,114 @@ +""" +This module does permissions checking for the forums application. + +""" +from django.core.cache import cache + +# How long (in secs) to cache group information for various entities: +CATEGORY_TIMEOUT = 4 * 60 * 60 +FORUM_TIMEOUT = 4 * 60 * 60 +USER_TIMEOUT = 15 * 60 + + +def can_access(category, user): + """ + This function returns True if the given user can access the forum category + and False otherwise. + + """ + if user.is_superuser: + return True + + # If this category has no groups assigned to it, return True. Else, return + # True if the user belongs to a group that has been assigned to this + # category, and False otherwise. + + # Get the groups assigned to this category. + cat_groups = get_category_groups(category) + + if len(cat_groups) == 0: + return True # No groups => public category + + user_groups = get_user_groups(user) + return bool(user_groups & cat_groups) + + +def can_moderate(forum, user): + """ + Returns True if the user can moderate the forum. + + """ + # Get the simple cases out of the way first: + if not user.is_authenticated(): + return False + elif user.is_superuser: + return True + + # If we get here, we have to see if there is an intersection between the + # user's groups and the forum's moderator groups. + + forum_groups = get_forum_groups(forum) + user_groups = get_user_groups(user) + + return bool(user_groups & forum_groups) + + +def can_post(topic, user): + """ + Returns True if the user can post in the topic and False otherwise. + + """ + if not user.is_authenticated(): + return False + if user.is_superuser or can_moderate(topic.forum, user): + return True + + return not topic.locked and can_access(topic.forum.category, user) + + +def get_user_groups(user): + """ + Returns a set of group ID's that the user belongs to. + + """ + user_groups_key = '%s_groups' % user.username + return _get_groups(user_groups_key, user.groups.all(), USER_TIMEOUT) + + +def get_forum_groups(forum): + """ + Returns a set of group ID's of the forum's moderator groups. + + """ + forum_groups_key = 'forum_%d_mods' % forum.id + return _get_groups(forum_groups_key, forum.moderators.all(), FORUM_TIMEOUT) + + +def get_category_groups(category): + """ + Returns a set of group ID's of the groups that can access this forum + category. 
+ + """ + cat_groups_key = 'cat_%d_groups' % category.id + return _get_groups(cat_groups_key, category.groups.all(), CATEGORY_TIMEOUT) + + +def _get_groups(key, qs, timeout): + """ + This internal function contains the code common to the get_xxx_groups() + functions. Returns a set of group ID's from the cache. If the set is not + found in the cache, the set is generated from the queryset qs and cached + with the given timeout. + + key - the cache key for the set of group ID's + qs - the query set of groups to query if the set is not in the cache + timeout - the cache timeout to use + + """ + groups = cache.get(key) + if groups is None: + groups = set([g.id for g in qs]) + cache.set(key, groups, timeout) + + return groups diff -r c525f3e0b5d0 -r ee87ea74d46b forums/search_indexes.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/search_indexes.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,60 @@ +"""Haystack search index for the weblinks application.""" +from haystack.indexes import * +from haystack import site +from custom_search.indexes import CondQueuedSearchIndex + +from forums.models import Forum, Topic, Post +from forums.signals import topic_content_update, post_content_update + + +class TopicIndex(CondQueuedSearchIndex): + text = CharField(document=True, use_template=True) + author = CharField(model_attr='user') + pub_date = DateTimeField(model_attr='creation_date') + + def index_queryset(self): + return Topic.objects.filter(forum__in=Forum.objects.public_forum_ids()) + + def get_updated_field(self): + return 'update_date' + + def _setup_save(self, model): + topic_content_update.connect(self.enqueue_save) + + def _teardown_save(self, model): + topic_content_update.disconnect(self.enqueue_save) + + def enqueue_save(self, sender, **kwargs): + return self.enqueue('update', sender) + + def can_index(self, instance): + return instance.forum.id in Forum.objects.public_forum_ids() + + +class PostIndex(CondQueuedSearchIndex): + text = CharField(document=True, use_template=True) + author = CharField(model_attr='user') + pub_date = DateTimeField(model_attr='creation_date') + + def index_queryset(self): + return Post.objects.filter( + topic__forum__in=Forum.objects.public_forum_ids()) + + def get_updated_field(self): + return 'update_date' + + def _setup_save(self, model): + post_content_update.connect(self.enqueue_save) + + def _teardown_save(self, model): + post_content_update.disconnect(self.enqueue_save) + + def enqueue_save(self, sender, **kwargs): + return self.enqueue('update', sender) + + def can_index(self, instance): + return instance.topic.forum.id in Forum.objects.public_forum_ids() + + +site.register(Topic, TopicIndex) +site.register(Post, PostIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/signals.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/signals.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,114 @@ +""" +Signal handlers & signals for the forums application. 
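The permission helpers are meant to be cheap enough to call from views and forms (forums/forms.py above already uses perms.can_moderate this way), with the group sets cached per category, forum and user. A sketch of gating a reply view with them; the view itself is hypothetical:

    import forums.permissions as perms
    from django.http import HttpResponseForbidden

    def reply_view_sketch(request, topic):
        # can_post() folds in authentication, topic lock status and category access.
        if not perms.can_post(topic, request.user):
            return HttpResponseForbidden('You may not post in this topic.')
        # ... handle the reply here ...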
+ +""" +from django.db.models.signals import post_save +from django.db.models.signals import post_delete +import django.dispatch + +from forums.models import Forum, Topic, Post + + +def on_topic_save(sender, **kwargs): + if kwargs['created']: + topic = kwargs['instance'] + topic.forum.topic_count_update() + topic.forum.save() + + +def on_topic_delete(sender, **kwargs): + topic = kwargs['instance'] + topic.forum.topic_count_update() + topic.forum.save() + forums.latest.notify_topic_delete(topic) + + +def on_post_save(sender, **kwargs): + if kwargs['created']: + post = kwargs['instance'] + + # update the topic + post.topic.post_count_update() + post.topic.save() + + # update the forum + post.topic.forum.post_count_update() + post.topic.forum.save() + + +def on_post_delete(sender, **kwargs): + post = kwargs['instance'] + + # update the topic + try: + post.topic.post_count_update() + post.topic.save() + except Topic.DoesNotExist: + pass + else: + # update the forum + try: + post.topic.forum.post_count_update() + post.topic.forum.save() + except Forum.DoesNotExist: + pass + + +post_save.connect(on_topic_save, sender=Topic, dispatch_uid='forums.signals') +post_delete.connect(on_topic_delete, sender=Topic, dispatch_uid='forums.signals') + +post_save.connect(on_post_save, sender=Post, dispatch_uid='forums.signals') +post_delete.connect(on_post_delete, sender=Post, dispatch_uid='forums.signals') + + +# Signals for the forums application. +# +# This signal is sent when a topic has had its textual content (title) changed. +# The provided arguments are: +# sender - the topic model instance +# created - True if the topic is new, False if updated + +topic_content_update = django.dispatch.Signal(providing_args=['created']) + +# This signal is sent when a post has had its textual content (body) changed. +# The provided arguments are: +# sender - the post model instance +# created - True if the post is new, False if updated + +post_content_update = django.dispatch.Signal(providing_args=['created']) + + +def notify_new_topic(topic): + """ + Sends the topic_content_update signal for a new topic instance. + + """ + topic_content_update.send_robust(topic, created=True) + + +def notify_updated_topic(topic): + """ + Sends the topic_content_update signal for an updated topic instance. + + """ + topic_content_update.send_robust(topic, created=False) + + +def notify_new_post(post): + """ + Sends the post_content_update signal for a new post instance. + + """ + post_content_update.send_robust(post, created=True) + + +def notify_updated_post(post): + """ + Sends the post_content_update signal for an updated post instance. 
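For reference, this is roughly how another module hooks into the custom signals defined above; the receiver name is made up, and the connect/receive pattern mirrors what forums/search_indexes.py does:

    from forums.signals import post_content_update

    def on_post_content_update(sender, created, **kwargs):
        # 'sender' is the Post instance, per the convention documented above.
        action = 'created' if created else 'edited'
        print('post %s was %s' % (sender.pk, action))

    post_content_update.connect(on_post_content_update)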
+ + """ + post_content_update.send_robust(post, created=False) + + +# Avoid circular imports +import forums.latest diff -r c525f3e0b5d0 -r ee87ea74d46b forums/static/js/forums.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/static/js/forums.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,297 @@ +$(document).ready(function() { + var postText = $('#id_body'); + var postButton = $('#forums-reply-post'); + postButton.click(function () { + var text = $.trim(postText.val()); + $(this).attr('disabled', 'disabled').val('Posting reply...'); + + var attachments = new Array() + $('#attachment div input').each(function(index) { + attachments[index] = $(this).val(); + }); + + $.ajax({ + url: '/forums/quick-reply/', + type: 'POST', + data: { + body : postText.val(), + topic_id : $('#id_topic_id').val(), + attachment : attachments + }, + traditional: true, + dataType: 'html', + success: function (data, textStatus) { + postText.val(''); + var lastTr = $('#forum-topic tr:last'); + var newClass = lastTr.hasClass('odd') ? 'even' : 'odd'; + lastTr.after(data); + lastTr = $('#forum-topic tr:last'); + lastTr.addClass(newClass); + lastTr.hide(); + lastTr.fadeIn(3000); + postButton.removeAttr('disabled').val('Submit Reply'); + initAttachments(); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + postButton.removeAttr('disabled').val('Submit Reply'); + initAttachments(); + } + }); + return false; + }); + $('a.post-flag').click(function () { + var id = this.id; + if (id.match(/fp-(\d+)/)) { + id = RegExp.$1; + if (confirm('Only flag a post if you feel it is spam, abuse, violates site rules, ' + + 'or is not appropriate. ' + + 'A moderator will be notified and will review the post. ' + + 'Are you sure you want to flag this post?')) { + $.ajax({ + url: '/forums/flag-post/', + type: 'POST', + data: {id: id}, + dataType: 'text', + success: function (response, textStatus) { + alert(response); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); + } + }); + } + } + return false; + }); + $('a.post-delete').click(function () { + var id = this.id; + if (id.match(/dp-(\d+)/)) { + id = RegExp.$1; + if (confirm('Are you sure you want to delete this post?')) { + $.ajax({ + url: '/forums/delete-post/', + type: 'POST', + data: {id: id}, + dataType: 'text', + success: function (response, textStatus) { + alert(response); + $('#post-' + id).fadeOut(3000); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); + } + }); + } + } + return false; + }); + $('#forum-mod-del-topic').click(function () { + return confirm('Are you sure you want to delete this topic?\n' + + 'WARNING: all posts will be lost.'); + }); + + var vid = 0; + var vidDiv = $('#attachment'); + + function clearAttachments() + { + $('#attachment div').remove(); + $('#attach-another').remove(); + } + + function processEmbeds(data, textStatus) + { + vidDiv.find('img').remove(); + $.each(data, function(index, value) { + var html = '
' + value.html + + '' + + 'Remove ' + + 'Remove' + + ''; + '
'; + vidDiv.append(html); + $('#video-' + index + ' a').click(function() { + $('#video-' + index).remove(); + relabelAttachLink(); + return false; + }); + }); + vid = data.length; + $('#video-' + (vid-1)).after('Attach another video'); + $('#attach-another').click(function() { + addVideo(); + relabelAttachLink(); + return false; + }); + } + + function initAttachments() + { + clearAttachments(); + + var post_input = $('#id_post_id'); + var attachments = $("#forums_post_form input:hidden[name='attachment']"); + if (post_input.length == 1) + { + post_id = post_input.val(); + vidDiv.prepend('Busy'); + $.ajax({ + url: '/forums/fetch_attachments/', + type: 'GET', + data: { + pid : post_id + }, + dataType: 'json', + success: processEmbeds, + error: function (xhr, textStatus, ex) { + vidDiv.find('img').remove(); + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + } + else if (attachments.length > 0) + { + vidDiv.prepend('Busy'); + var embeds = new Array(); + attachments.each(function(index) { + embeds[index] = $(this).val(); + }); + attachments.remove(); + $.ajax({ + url: '/oembed/fetch_saved/', + type: 'GET', + data: { + embeds: embeds + }, + traditional: true, + dataType: 'json', + success: processEmbeds, + error: function (xhr, textStatus, ex) { + vidDiv.find('img').remove(); + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + } + else + { + vid = 0; + var s = '
' + + 'Add ' + + 'Attach Video
'; + vidDiv.prepend(s); + $('#attachment a').click(function () { + $('#init-add').remove(); + addVideo(); + return false; + }); + } + } + + function relabelAttachLink() + { + var another = $('#attach-another'); + var n = $('#attachment div').length; + if (n == 0) + { + another.html("Attach a video"); + } + else + { + another.html("Attach another video"); + } + } + + function addVideo() + { + var id = "video-" + vid; + + var fakeForm = '
' + + 'Attach ' + + ' ' + + 'Remove
'; + + var n = $('#attachment div').length; + + var another = $('#attach-another'); + if (n == 0) + { + if (another.length > 0) + { + another.before(fakeForm); + } + else + { + vidDiv.append(fakeForm); + } + } + else + { + $('#attachment div:last').after(fakeForm); + } + + $('#' + id + ' a').click(function() { + $('#' + id).remove(); + relabelAttachLink(); + return false; + }); + + var vidText = $('#' + id + ' input'); + + $('#' + id + ' button').click(function() { + var button = $(this); + button.attr('disabled', 'disabled'); + $.ajax({ + url: '/oembed/fetch/', + type: 'POST', + data: { + q : vidText.val() + }, + dataType: 'json', + success: function (data, textStatus) { + $('#' + id + " .r").remove(); + var myDiv = $('#' + id); + var html = '' + + 'Remove ' + + 'Remove' + + ''; + myDiv.prepend(html); + myDiv.prepend(data.embed); + $('#' + id + ' a').click(function() { + myDiv.remove(); + relabelAttachLink(); + return false; + }); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + button.removeAttr('disabled'); + } + }); + }); + + if (vid == 0) + { + $('#video-0').after('Attach another video'); + $('#attach-another').click(function() { + addVideo(); + relabelAttachLink(); + return false; + }); + } + ++vid; + } + + initAttachments(); + + $('div.forum-post-body img').fadeIn('fast', function() { + var pic = $(this); + if (pic.width() > 720) { + pic.css('width', '720px'); + } + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b forums/static/js/forums_mod.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/static/js/forums_mod.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +$(document).ready(function() { + var master = $('#forums-master-topic'); + var topics = $('.forums-topic_check'); + master.click(function() { + var state = this.checked; + topics.each(function() { + this.checked = state; + }); + }); + topics.click(function() { + if (master[0].checked && !this.checked) { + master[0].checked = false; + } + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b forums/tasks.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/tasks.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,25 @@ +""" +Celery tasks for the forums application. + +""" +from celery.task import task + +import forums.latest + + +@task +def new_post_task(post_id): + """ + This task performs new post processing on a Celery task. + + """ + forums.latest.process_new_post(post_id) + + +@task +def new_topic_task(topic_id): + """ + This task performs new topic processing on a Celery task. + + """ + forums.latest.process_new_topic(topic_id) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/templatetags/forum_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/templatetags/forum_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,203 @@ +""" +Template tags for the forums application. 
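The Celery tasks in forums/tasks.py above are meant to be queued, not called inline; with Celery that is typically done through delay(). A sketch (the post id is illustrative):

    from forums.tasks import new_post_task

    # Enqueue the work; a Celery worker then runs forums.latest.process_new_post(42).
    new_post_task.delay(42)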
+""" +import datetime + +from pytz import timezone +from django import template +from django.conf import settings +from django.core.cache import cache +from django.contrib.auth.models import User + +from forums.models import Forum +from forums.models import Topic +from forums.models import Post +from forums.models import Category +from forums.latest import get_stats, get_latest_topics + + +register = template.Library() + +TIME_FMT_24 = "%H:%M" +TIME_FMT_12 = "%I:%M %p" + +DATE_FMT = "%b %d %Y" +SHORT_DATE_FMT = "%b %d" + +DATE_FMT_24 = ( + "%s %s" % (DATE_FMT, TIME_FMT_24), # long format + "%s %s" % (TIME_FMT_24, SHORT_DATE_FMT), # short format +) +DATE_FMT_12 = ( + "%s %s" % (DATE_FMT, TIME_FMT_12), # long format + "%s %s" % (TIME_FMT_12, SHORT_DATE_FMT), # short format +) + +SERVER_TZ = timezone(settings.TIME_ZONE) + + +@register.inclusion_tag('forums/last_post_info.html', takes_context=True) +def last_post_info(context, post): + return { + 'post': post, + 'STATIC_URL': context['STATIC_URL'], + 'user': context['user'], + } + + +@register.inclusion_tag('forums/pagination.html') +def forum_page_navigation(page): + return {'page': page} + + +@register.inclusion_tag('forums/post_edit_button.html') +def post_edit_button(post, user, can_moderate): + show_button = post.user.id == user.id or can_moderate + return { + 'post': post, + 'show_button': show_button, + 'STATIC_URL': settings.STATIC_URL, + } + + +def get_time_prefs(user): + """ + Return the supplied user's time preferences in the form of a 2-tuple: + (use_24_time, time_zone_name) + + These preferences are cached to reduce database hits. + + """ + cache_key = '%s_tz_prefs' % user.username + tz_prefs = cache.get(cache_key) + if tz_prefs is None: + profile = user.get_profile() + tz_prefs = profile.use_24_time, profile.time_zone + cache.set(cache_key, tz_prefs) + + return tz_prefs + + +@register.simple_tag +def current_forum_time(user): + """ + This tag displays the current forum time, adjusted by the user's + time zone preferences. + """ + curr_time = SERVER_TZ.localize(datetime.datetime.now()) + + if user.is_authenticated(): + tz_prefs = get_time_prefs(user) + user_tz = timezone(tz_prefs[1]) + curr_time = curr_time.astimezone(user_tz) + fmt = TIME_FMT_24 if tz_prefs[0] else TIME_FMT_12 + else: + fmt = TIME_FMT_12 + + return '
The current time is %s. All times shown are %s.
' % ( + curr_time.strftime(fmt), curr_time.strftime('%Z%z')) + + +@register.simple_tag +def forum_date(date, user, long_format=True): + """ + This tag displays an arbitrary datetime, adjusted by the user's + time zone preferences. + """ + fmt_index = 0 if long_format else 1 + + date = SERVER_TZ.localize(date) + if user.is_authenticated(): + tz_prefs = get_time_prefs(user) + user_tz = timezone(tz_prefs[1]) + date = date.astimezone(user_tz) + fmt = DATE_FMT_24 if tz_prefs[0] else DATE_FMT_12 + else: + fmt = DATE_FMT_12 + + return date.strftime(fmt[fmt_index]) + + +@register.inclusion_tag('forums/show_form.html') +def show_form(legend_text, form, submit_value, is_ajax): + """ + This tag displays the common HTML for a forum form. + """ + return { + 'legend_text': legend_text, + 'form': form, + 'submit_value': submit_value, + 'is_ajax': is_ajax, + 'STATIC_URL': settings.STATIC_URL, + } + + +@register.inclusion_tag('forums/new_posts_tag.html') +def new_posts(): + """ + This tag displays the topics that have the newest posts. + Only the "public" forums are displayed. + """ + return { + 'topics': get_latest_topics(20), + } + + +@register.inclusion_tag('forums/forum_stats_tag.html') +def forum_stats(): + """ + Displays forum statistics. + """ + topic_count, post_count = get_stats() + + return { + 'topic_count': topic_count, + 'post_count': post_count, + } + + +@register.inclusion_tag('forums/topic_icons_tag.html') +def topic_icons(topic): + """Displays the "unread", "sticky", and "locked" icons for a given topic.""" + return { + 'topic': topic, + 'STATIC_URL': settings.STATIC_URL, + } + + +@register.inclusion_tag('forums/topic_page_range_tag.html') +def topic_page_range(topic): + """Displays the page range links for a topic.""" + return { + 'topic': topic, + } + + +@register.inclusion_tag('forums/navigation_tag.html') +def forum_navigation(obj, subtitle=None): + """ + Generates forum navigation links based upon the arguments passed. + If obj is: + * a string: Index >> String Text + * a forum: Index >> Forum Name + * a topic: Index >> Forum Name >> Topic Name + + If the optional subtitle argument is passed, it is assumed to be + a string, and is added as one more "level" in the navigation. + + """ + nav_list = [] + + if isinstance(obj, str) or isinstance(obj, unicode): + nav_list.append(dict(name=obj, url=None)) + elif isinstance(obj, Forum): + nav_list.append(dict(name=obj.name, url=obj.get_absolute_url())) + elif isinstance(obj, Topic): + forum = obj.forum + nav_list.append(dict(name=forum.name, url=forum.get_absolute_url())) + nav_list.append(dict(name=obj.name, url=obj.get_absolute_url())) + + if subtitle: + nav_list.append(dict(name=subtitle, url=None)) + + return dict(nav_list=nav_list, STATIC_URL=settings.STATIC_URL) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/tests/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/tests/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1 @@ +from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b forums/tests/view_tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/tests/view_tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,135 @@ +""" +Tests for the views in the forums application. 
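A standalone sketch of the pytz handling that current_forum_time and forum_date in forum_tags.py above perform: localize the naive server time, then shift it into the user's zone. The zone names and timestamp are illustrative:

    import datetime
    from pytz import timezone

    server_tz = timezone('US/Central')
    user_tz = timezone('Europe/Amsterdam')

    naive_now = datetime.datetime(2012, 5, 5, 17, 10)
    server_time = server_tz.localize(naive_now)    # attach the server zone
    user_time = server_time.astimezone(user_tz)    # convert for display

    print(user_time.strftime('%b %d %Y %H:%M %Z%z'))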
+ +""" +from django.test import TestCase +from django.contrib.auth.models import User +from django.core.urlresolvers import reverse + +from forums.models import Forum, Topic, Post + + +class ForumPostTestCase(TestCase): + fixtures = ['forums.json'] + + def setUp(self): + self.username = 'test_user' + self.pw = 'password' + self.user = User.objects.create_user(self.username, '', self.pw) + self.user.save() + self.assertTrue(self.client.login(username=self.username, + password=self.pw)) + + def tearDown(self): + self.client.logout() + + def testBasicForumsTest(self): + + forum_slug = 'shallow-end' + topic_name = 'A test topic' + topic_body = 'testing 1, 2, 3...' + + response = self.client.post( + reverse('forums-new_topic', kwargs={'slug': forum_slug}), + {'name': topic_name, 'body': topic_body}, + follow=True) + + self.assertEqual(len(response.redirect_chain), 1) + + if response.redirect_chain: + self.assertEqual(response.redirect_chain[0][0], + 'http://testserver' + reverse('forums-new_topic_thanks', + kwargs={'tid': '1'})) + self.assertEqual(response.redirect_chain[0][1], 302) + + self.assertEqual(response.status_code, 200) + + forum = Forum.objects.get(slug=forum_slug) + try: + topic = Topic.objects.get(pk=1) + except Topic.DoesNotExist: + self.fail("topic doesn't exist") + + self.assertEqual(topic.forum.pk, forum.pk) + self.assertEqual(topic.user.pk, self.user.pk) + self.assertEqual(topic.name, topic_name) + self.assertEqual(topic.post_count, 1) + + post = topic.last_post + self.failIf(post is None) + + if post: + self.assertEqual(post.body, topic_body) + self.assertEqual(post.user.pk, self.user.pk) + + # post to the thread + response = self.client.get( + reverse('forums-topic_index', kwargs={'id': '1'})) + self.assertEqual(response.status_code, 200) + + post2_body = 'test quick post' + response = self.client.post( + reverse('forums-quick_reply'), + {'body': post2_body, 'topic_id': 1}) + self.assertEqual(response.status_code, 200) + try: + topic = Topic.objects.get(pk=1) + except Topic.DoesNotExist: + self.fail("topic doesn't exist") + + post = topic.last_post + self.failIf(post is None) + if post: + self.assertEqual(post.body, post2_body) + self.assertEqual(post.user.pk, self.user.pk) + self.assertEqual(topic.post_count, 2) + + # quote last post + response = self.client.get( + reverse('forums-new_post', kwargs={'topic_id': 1}), + {'quote_id': 2}) + self.assertEqual(response.status_code, 200) + + post3_body = 'new post 3 content' + response = self.client.post( + reverse('forums-new_post', kwargs={'topic_id': 1}), + {'body': post3_body, 'post_id': 2}, + follow=True) + self.assertEqual(response.status_code, 200) + try: + topic = Topic.objects.get(pk=1) + except Topic.DoesNotExist: + self.fail("topic doesn't exist") + + post = topic.last_post + self.failIf(post is None) + if post: + self.assertEqual(post.body, post3_body) + self.assertEqual(post.user.pk, self.user.pk) + self.assertEqual(topic.post_count, 3) + + # edit last post + response = self.client.get( + reverse('forums-edit_post', kwargs={'id': 3})) + self.assertEqual(response.status_code, 200) + + post3_body = 'edited post 3 content' + response = self.client.post( + reverse('forums-edit_post', kwargs={'id': 3}), + {'body': post3_body}, + follow=True) + self.assertEqual(response.status_code, 200) + try: + topic = Topic.objects.get(pk=1) + except Topic.DoesNotExist: + self.fail("topic doesn't exist") + + post = topic.last_post + self.failIf(post is None) + if post: + self.assertEqual(post.body, post3_body) + 
self.assertEqual(post.user.pk, self.user.pk) + self.assertEqual(topic.post_count, 3) + + profile = self.user.get_profile() + self.assertEqual(profile.forum_post_count, 3) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/tools.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/tools.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,130 @@ +""" +This module contains misc. utility functions for forum management. +""" +import logging + +from forums.models import Post, Topic, Forum, ForumLastVisit, TopicLastVisit + + +def delete_user_posts(user): + """ + This function deletes all the posts for a given user. + It also cleans up any last visit database records for the user. + This function adjusts the last post foreign keys before deleting + the posts to avoid the cascading delete behavior. + """ + posts = Post.objects.filter(user=user).select_related() + + # delete attachments + for post in posts: + post.attachments.clear() + + # build a set of topics and forums affected by the post deletions + + topics = set(post.topic for post in posts) + forums = set(topic.forum for topic in topics) + + post_ids = [post.pk for post in posts] + pending_delete = [] + + for topic in topics: + if topic.last_post.pk in post_ids: + topic_posts = Post.objects.filter(topic=topic).exclude( + pk__in=post_ids) + topic.post_count = topic_posts.count() + if topic.post_count > 0: + topic.last_post = topic_posts.latest() + topic.update_date = topic.last_post.creation_date + topic.save() + else: + # Topic should be deleted, it has no posts; + # We can't delete it now as it could cascade and take out a + # forum. Remember it for later deletion. + pending_delete.append(topic) + + for forum in forums: + if forum.last_post.pk in post_ids: + forum_posts = Post.objects.filter(topic__forum=forum).exclude( + pk__in=post_ids) + forum.post_count = forum_posts.count() + if forum.post_count > 0: + forum.last_post = forum_posts.latest() + else: + forum.last_post = None + forum.save() + + # Delete pending topics now because forums have just adjusted their + # foreign keys into Post + if pending_delete: + topic_ids = [topic.pk for topic in pending_delete] + Topic.objects.filter(pk__in=topic_ids).delete() + + # Topics have been deleted, re-compute topic counts for forums + for forum in forums: + forum.topic_count = Topic.objects.filter(forum=forum).count() + forum.save() + + # All foreign keys are accounted for, we can now delete the posts in bulk. + # Since some posts in our original queryset may have been deleted already, + # run a new query (although it may be ok) + Post.objects.filter(pk__in=post_ids).delete() + + # delete all the last visit records for this user + TopicLastVisit.objects.filter(user=user).delete() + ForumLastVisit.objects.filter(user=user).delete() + + +def create_topic(forum_slug, user, topic_name, post_body, ip='', sticky=False, + locked=False): + """Programmatically create a topic & first post in a given forum. + + This function creates a new topic in the forum that has the slug + specified by the 'forum_slug' argument. 
Other arguments are as follows: + 'user' - create the topic and post with this user as the owner + 'topic_name' - topic name (title) + 'post_body' - topic post body (as markup, not HTML) + 'ip' - IP address for the post (as a string) + 'sticky' - if True, the post will be stickied + 'locked' - if True, the post will be locked + + """ + try: + forum = Forum.objects.get(slug=forum_slug) + except Forum.DoesNotExist: + logging.error('could not create_topic for forum_slug=%s', forum_slug) + raise + + topic = Topic(forum=forum, + name=topic_name, + user=user, + sticky=sticky, + locked=locked) + topic.save() + + post = Post(topic=topic, + user=user, + body=post_body, + user_ip=ip) + post.save() + + +def auto_favorite(post): + """ + Given a newly created post, perform an auto-favorite action if the post + creator has that option set in their profile. + + """ + profile = post.user.get_profile() + if profile.auto_favorite: + post.topic.bookmarkers.add(post.user) + + +def auto_subscribe(post): + """ + Given a newly created post, perform an auto-subscribe action if the post + creator has that option set in their profile. + + """ + profile = post.user.get_profile() + if profile.auto_subscribe: + post.topic.subscribers.add(post.user) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/unread.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/unread.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,257 @@ +""" +This file contains routines for implementing the "has unread" feature. +Forums, topics, and posts are displayed with a visual indication if they have +been read or not. +""" +import datetime +import logging + +from django.db import IntegrityError + +from forums.models import ForumLastVisit, TopicLastVisit, Topic, Forum + + +THRESHOLD = datetime.timedelta(days=14) + +####################################################################### + +def get_forum_unread_status(qs, user): + if not user.is_authenticated(): + for forum in qs: + forum.has_unread = False + return + + now = datetime.datetime.now() + min_date = now - THRESHOLD + + # retrieve ForumLastVisit records in one SQL query + forum_ids = [forum.id for forum in qs] + flvs = ForumLastVisit.objects.filter(user=user, + forum__in=forum_ids).select_related() + flvs = dict([(flv.forum.id, flv) for flv in flvs]) + + for forum in qs: + # Edge case: forum has no posts + if forum.last_post is None: + forum.has_unread = False + continue + + # Get the ForumLastVisit record + if forum.id in flvs: + flv = flvs[forum.id] + else: + # One doesn't exist, create a default one for next time, + # mark it as having no unread topics, and bail. + flv = ForumLastVisit(user=user, forum=forum) + flv.begin_date = now + flv.end_date = now + + # There is a race condition and sometimes another thread + # saves a record before we do; just log this if it happens. + try: + flv.save() + except IntegrityError: + logging.exception('get_forum_unread_status') + + forum.has_unread = False + continue + + # If the last visit record was too far in the past, + # catch that user up and mark as no unreads. + if now - flv.end_date > THRESHOLD: + forum.catchup(user, flv) + forum.has_unread = False + continue + + # Check the easy cases first. Check the last_post in the + # forum. If created after the end_date in our window, there + # are new posts. Likewise, if before the begin_date in our window, + # there are no new posts. 
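A plain-datetime sketch of the window test described in the comments above: anything newer than end_date is unread, anything older than begin_date is read, and only dates inside the window require per-topic checks. The dates are illustrative:

    import datetime

    begin_date = datetime.datetime(2012, 5, 1, 12, 0)
    end_date = datetime.datetime(2012, 5, 4, 12, 0)

    def classify(last_post_date):
        if last_post_date > end_date:
            return 'unread'
        if last_post_date < begin_date:
            return 'read'
        return 'check topics inside the window'

    print(classify(datetime.datetime(2012, 5, 5, 9, 0)))   # unread
    print(classify(datetime.datetime(2012, 4, 20, 9, 0)))  # read
    print(classify(datetime.datetime(2012, 5, 2, 9, 0)))   # check topics inside the window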
+ if forum.last_post.creation_date > flv.end_date: + forum.has_unread = True + elif forum.last_post.creation_date < flv.begin_date: + if not flv.is_caught_up(): + forum.catchup(user, flv) + forum.has_unread = False + else: + # Going to have to examine the topics in our window. + # First adjust our window if it is too old. + if now - flv.begin_date > THRESHOLD: + flv.begin_date = min_date + flv.save() + TopicLastVisit.objects.filter(user=user, topic__forum=forum, + last_visit__lt=min_date).delete() + + topics = Topic.objects.filter(forum=forum, + update_date__gt=flv.begin_date) + tracked_topics = TopicLastVisit.objects.filter( + user=user, + topic__forum=forum, + last_visit__gt=flv.begin_date).select_related('topic') + + # If the number of topics created since our window was started + # is greater than the tracked topic records, then there are new + # posts. + if topics.count() > tracked_topics.count(): + forum.has_unread = True + continue + + tracked_dict = dict((t.topic.id, t) for t in tracked_topics) + + for topic in topics: + if topic.id in tracked_dict: + if topic.update_date > tracked_dict[topic.id].last_visit: + forum.has_unread = True + break + else: + forum.has_unread = True + break + else: + # If we made it through the above loop without breaking out, + # then we are all caught up. + forum.catchup(user, flv) + forum.has_unread = False + +####################################################################### + +def get_topic_unread_status(forum, topics, user): + + # Edge case: no topics + if forum.last_post is None: + return + + # This service isn't provided to unauthenticated users + if not user.is_authenticated(): + for topic in topics: + topic.has_unread = False + return + + now = datetime.datetime.now() + + # Get the ForumLastVisit record + try: + flv = ForumLastVisit.objects.get(forum=forum, user=user) + except ForumLastVisit.DoesNotExist: + # One doesn't exist, create a default one for next time, + # mark it as having no unread topics, and bail. + flv = ForumLastVisit(user=user, forum=forum) + flv.begin_date = now + flv.end_date = now + + # There is a race condition and sometimes another thread + # saves a record before we do; just log this if it happens. + try: + flv.save() + except IntegrityError: + logging.exception('get_topic_unread_status') + + for topic in topics: + topic.has_unread = False + return + + # Are all the posts before our window? If so, all have been read. + if forum.last_post.creation_date < flv.begin_date: + for topic in topics: + topic.has_unread = False + return + + topic_ids = [topic.id for topic in topics] + tlvs = TopicLastVisit.objects.filter(user=user, topic__id__in=topic_ids) + tlvs = dict([(tlv.topic.id, tlv) for tlv in tlvs]) + + # Otherwise we have to go through the topics one by one: + for topic in topics: + if topic.update_date < flv.begin_date: + topic.has_unread = False + elif topic.update_date > flv.end_date: + topic.has_unread = True + elif topic.id in tlvs: + topic.has_unread = topic.update_date > tlvs[topic.id].last_visit + else: + topic.has_unread = True + +####################################################################### + +def get_post_unread_status(topic, posts, user): + # This service isn't provided to unauthenticated users + if not user.is_authenticated(): + for post in posts: + post.unread = False + return + + # Get the ForumLastVisit record + try: + flv = ForumLastVisit.objects.get(forum=topic.forum, user=user) + except ForumLastVisit.DoesNotExist: + # One doesn't exist, all posts are old. 
+ for post in posts: + post.unread = False + return + + # Are all the posts before our window? If so, all have been read. + if topic.last_post.creation_date < flv.begin_date: + for post in posts: + post.unread = False + return + + # Do we have a topic last visit record for this topic? + + try: + tlv = TopicLastVisit.objects.get(user=user, topic=topic) + except TopicLastVisit.DoesNotExist: + # No we don't, we could be all caught up, or all are new + for post in posts: + post.unread = post.creation_date > flv.end_date + else: + for post in posts: + post.unread = post.creation_date > tlv.last_visit + +####################################################################### + +def get_unread_topics(user): + """Returns a list of topics the user hasn't read yet.""" + + # This is only available to authenticated users + if not user.is_authenticated(): + return [] + + now = datetime.datetime.now() + + # Obtain list of forums the user can view + forums = Forum.objects.forums_for_user(user) + + # Get forum last visit records for the forum ids + flvs = ForumLastVisit.objects.filter(user=user, + forum__in=forums).select_related() + flvs = dict([(flv.forum.id, flv) for flv in flvs]) + + unread_topics = [] + topics = Topic.objects.none() + for forum in forums: + # if the user hasn't visited the forum, create a last + # visit record set to "now" + if not forum.id in flvs: + flv = ForumLastVisit(user=user, forum=forum, begin_date=now, + end_date=now) + flv.save() + else: + flv = flvs[forum.id] + topics |= Topic.objects.filter(forum=forum, + update_date__gt=flv.begin_date).order_by('-update_date').select_related( + 'forum', 'user', 'last_post', 'last_post__user') + + if topics is not None: + # get all topic last visit records for the topics of interest + + tlvs = TopicLastVisit.objects.filter(user=user, topic__in=topics) + tlvs = dict([(tlv.topic.id, tlv) for tlv in tlvs]) + + for topic in topics: + if topic.id in tlvs: + tlv = tlvs[topic.id] + if topic.update_date > tlv.last_visit: + unread_topics.append(topic) + else: + unread_topics.append(topic) + + return unread_topics diff -r c525f3e0b5d0 -r ee87ea74d46b forums/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,58 @@ +""" +URLs for the forums application. 
+""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('forums.views.main', + url(r'^$', 'index', name='forums-index'), + url(r'^catchup/$', 'catchup_all', name='forums-catchup_all'), + url(r'^new-topic-success/(?P\d+)$', 'new_topic_thanks', name='forums-new_topic_thanks'), + url(r'^topic/(?P\d+)/$', 'topic_index', name='forums-topic_index'), + url(r'^topic/(?P\d+)/unread/$', 'topic_unread', name='forums-topic_unread'), + url(r'^topic/(?P\d+)/latest/$', 'topic_latest', name='forums-topic_latest'), + url(r'^topic/active/(\d+)/$', 'active_topics', name='forums-active_topics'), + url(r'^delete-post/$', 'delete_post', name='forums-delete_post'), + url(r'^edit/(?P\d+)/$', 'edit_post', name='forums-edit_post'), + url(r'^flag-post/$', 'flag_post', name='forums-flag_post'), + url(r'^forum/(?P[\w\d-]+)/$', 'forum_index', name='forums-forum_index'), + url(r'^forum/(?P[\w\d-]+)/catchup/$', 'forum_catchup', name='forums-catchup'), + url(r'^forum/(?P[\w\d-]+)/new-topic/$', 'new_topic', name='forums-new_topic'), + url(r'^mod/forum/(?P[\w\d-]+)/$', 'mod_forum', name='forums-mod_forum'), + url(r'^mod/topic/delete/(\d+)/$', 'mod_topic_delete', name='forums-mod_topic_delete'), + url(r'^mod/topic/lock/(\d+)/$', 'mod_topic_lock', name='forums-mod_topic_lock'), + url(r'^mod/topic/move/(\d+)/$', 'mod_topic_move', name='forums-mod_topic_move'), + url(r'^mod/topic/split/(\d+)/$', 'mod_topic_split', name='forums-mod_topic_split'), + url(r'^mod/topic/stick/(\d+)/$', 'mod_topic_stick', name='forums-mod_topic_stick'), + url(r'^my-posts/$', 'my_posts', name='forums-my_posts'), + url(r'^post/(\d+)/$', 'goto_post', name='forums-goto_post'), + url(r'^post/ip/(\d+)/$', 'post_ip_info', name='forums-post_ip_info'), + url(r'^post/new/(?P\d+)/$', 'new_post', name='forums-new_post'), + url(r'^posts/(?P[\w.@+-]{1,30})/$', 'posts_for_user', name='forums-posts_for_user'), + url(r'^quick-reply/$', 'quick_reply_ajax', name='forums-quick_reply'), + url(r'^unanswered/$', 'unanswered_topics', name='forums-unanswered_topics'), + url(r'^unread/$', 'unread_topics', name='forums-unread_topics'), +) + +urlpatterns += patterns('forums.views.favorites', + url(r'^favorite/(\d+)/$', 'favorite_topic', name='forums-favorite_topic'), + url(r'^favorites/$', 'manage_favorites', name='forums-manage_favorites'), + url(r'^favorites/(\d+)/$', 'favorites_status', name='forums-favorites_status'), + url(r'^unfavorite/(\d+)/$', 'unfavorite_topic', name='forums-unfavorite_topic'), +) + +urlpatterns += patterns('forums.views.subscriptions', + url(r'^subscribe/(\d+)/$', 'subscribe_topic', name='forums-subscribe_topic'), + url(r'^subscriptions/$', 'manage_subscriptions', name='forums-manage_subscriptions'), + url(r'^subscriptions/(\d+)/$', 'subscription_status', name='forums-subscription_status'), + url(r'^unsubscribe/(\d+)/$', 'unsubscribe_topic', name='forums-unsubscribe_topic'), +) + +urlpatterns += patterns('forums.views.spam', + url(r'^spammer/(\d+)/$', 'spammer', name='forums-spammer'), + url(r'^spammer/nailed/(\d+)/$', 'spammer_nailed', name='forums-spammer_nailed'), + url(r'^stranger/(\d+)/$', 'stranger', name='forums-stranger'), +) + +urlpatterns += patterns('forums.views.attachments', + url(r'^fetch_attachments/$', 'fetch_attachments', name='forums-fetch_attachments'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/views/attachments.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/views/attachments.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,35 @@ +""" +This module contains views for working with post attachments. 
+""" +from django.http import HttpResponse +from django.http import HttpResponseForbidden +from django.http import HttpResponseBadRequest +from django.http import HttpResponseNotFound +import django.utils.simplejson as json + +from forums.models import Post + + +def fetch_attachments(request): + """ + This view is the target of an AJAX GET request to retrieve the + attachment embed data for a given forum post. + + """ + if not request.user.is_authenticated(): + return HttpResponseForbidden('Please login or register.') + + post_id = request.GET.get('pid') + if post_id is None: + return HttpResponseBadRequest('Missing post ID.') + + try: + post = Post.objects.get(pk=post_id) + except Post.DoesNotExist: + return HttpResponseNotFound("That post doesn't exist.") + + embeds = post.attachments.all().select_related('embed') + data = [{'id': embed.id, 'html': embed.html} for embed in embeds] + + return HttpResponse(json.dumps(data), content_type='application/json') + diff -r c525f3e0b5d0 -r ee87ea74d46b forums/views/favorites.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/views/favorites.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,92 @@ +""" +This module contains view functions related to forum favorites (bookmarks). +""" +from django.contrib.auth.decorators import login_required +from django.core.urlresolvers import reverse +from django.views.decorators.http import require_POST +from django.shortcuts import get_object_or_404 +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.http import HttpResponseRedirect +from django.http import HttpResponseForbidden +from django.http import Http404 + +from core.paginator import DiggPaginator +from forums.models import Topic +import forums.permissions as perms + + +@login_required +@require_POST +def favorite_topic(request, topic_id): + """ + This function handles the "favoriting" (bookmarking) of a forum topic by a + user. 
+ """ + topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) + if perms.can_access(topic.forum.category, request.user): + topic.bookmarkers.add(request.user) + return HttpResponseRedirect( + reverse("forums-favorites_status", args=[topic.id])) + return HttpResponseForbidden() + + +@login_required +def manage_favorites(request): + """Display a user's favorite topics and allow them to be deleted.""" + + user = request.user + if request.method == "POST": + if request.POST.get('delete_all'): + user.favorite_topics.clear() + else: + delete_ids = request.POST.getlist('delete_ids') + try: + delete_ids = [int(id) for id in delete_ids] + except ValueError: + raise Http404 + for topic in user.favorite_topics.filter(id__in=delete_ids): + user.favorite_topics.remove(topic) + + return HttpResponseRedirect(reverse("forums-manage_favorites")) + + page_num = request.GET.get('page', 1) + topics = user.favorite_topics.select_related().order_by('-update_date') + paginator = DiggPaginator(topics, 20, body=5, tail=2, margin=3, padding=2) + try: + page_num = int(page_num) + except ValueError: + page_num = 1 + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + return render_to_response('forums/manage_topics.html', { + 'page_title': 'Favorite Topics', + 'description': 'Your favorite topics are listed below.', + 'page': page, + }, + context_instance=RequestContext(request)) + +@login_required +def favorites_status(request, topic_id): + """Display the favorite status for the given topic.""" + topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) + is_favorite = request.user in topic.bookmarkers.all() + return render_to_response('forums/favorite_status.html', { + 'topic': topic, + 'is_favorite': is_favorite, + }, + context_instance=RequestContext(request)) + +@login_required +@require_POST +def unfavorite_topic(request, topic_id): + """ + Un-favorite the user from the requested topic. + """ + topic = get_object_or_404(Topic, id=topic_id) + topic.bookmarkers.remove(request.user) + return HttpResponseRedirect( + reverse("forums-favorites_status", args=[topic.id])) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/views/main.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/views/main.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1126 @@ +""" +Views for the forums application. 
+""" +import collections +import datetime + +from django.contrib.auth.decorators import login_required +from django.contrib.auth.models import User +from django.http import Http404 +from django.http import HttpResponse +from django.http import HttpResponseBadRequest +from django.http import HttpResponseForbidden +from django.http import HttpResponseRedirect +from django.core.urlresolvers import reverse +from django.core.paginator import InvalidPage +from django.shortcuts import get_object_or_404 +from django.shortcuts import render_to_response +from django.template.loader import render_to_string +from django.template import RequestContext +from django.views.decorators.http import require_POST +from django.db.models import F + +import antispam +import antispam.utils +from bio.models import UserProfile, BadgeOwnership +from core.paginator import DiggPaginator +from core.functions import email_admins, quote_message + +from forums.models import (Forum, Topic, Post, FlaggedPost, TopicLastVisit, + ForumLastVisit, Attachment) +from forums.forms import (NewTopicForm, NewPostForm, PostForm, MoveTopicForm, + SplitTopicForm) +from forums.unread import (get_forum_unread_status, get_topic_unread_status, + get_post_unread_status, get_unread_topics) + +import forums.permissions as perms +from forums.signals import (notify_new_topic, notify_updated_topic, + notify_new_post, notify_updated_post) +from forums.latest import get_latest_topic_ids + +####################################################################### + +TOPICS_PER_PAGE = 50 +POSTS_PER_PAGE = 20 +FEED_BASE = '/feeds/forums/' +FORUM_FEED = FEED_BASE + '%s/' + + +def get_page_num(request): + """Returns the value of the 'page' variable in GET if it exists, or 1 + if it does not.""" + + try: + page_num = int(request.GET.get('page', 1)) + except ValueError: + page_num = 1 + + return page_num + + +def create_topic_paginator(topics): + return DiggPaginator(topics, TOPICS_PER_PAGE, body=5, tail=2, margin=3, padding=2) + +def create_post_paginator(posts): + return DiggPaginator(posts, POSTS_PER_PAGE, body=5, tail=2, margin=3, padding=2) + + +def attach_topic_page_ranges(topics): + """Attaches a page_range attribute to each topic in the supplied list. + This attribute will be None if it is a single page topic. This is used + by the templates to generate "goto page x" links. + """ + for topic in topics: + if topic.post_count > POSTS_PER_PAGE: + pp = DiggPaginator(range(topic.post_count), POSTS_PER_PAGE, + body=2, tail=3, margin=1) + topic.page_range = pp.page(1).page_range + else: + topic.page_range = None + +####################################################################### + +def index(request): + """ + This view displays all the forums available, ordered in each category. 
+ """ + public_forums = Forum.objects.public_forums() + feeds = [{'name': 'All Forums', 'feed': FEED_BASE}] + + forums = Forum.objects.forums_for_user(request.user) + get_forum_unread_status(forums, request.user) + cats = {} + for forum in forums: + forum.has_feed = forum in public_forums + if forum.has_feed: + feeds.append({ + 'name': '%s Forum' % forum.name, + 'feed': FORUM_FEED % forum.slug, + }) + + cat = cats.setdefault(forum.category.id, { + 'cat': forum.category, + 'forums': [], + }) + cat['forums'].append(forum) + + cmpdef = lambda a, b: cmp(a['cat'].position, b['cat'].position) + cats = sorted(cats.values(), cmpdef) + + return render_to_response('forums/index.html', { + 'cats': cats, + 'feeds': feeds, + }, + context_instance=RequestContext(request)) + + +def forum_index(request, slug): + """ + Displays all the topics in a forum. + """ + forum = get_object_or_404(Forum.objects.select_related(), slug=slug) + + if not perms.can_access(forum.category, request.user): + return HttpResponseForbidden() + + feed = None + if not forum.category.groups.all(): + feed = { + 'name': '%s Forum' % forum.name, + 'feed': FORUM_FEED % forum.slug, + } + + topics = forum.topics.select_related('user', 'last_post', 'last_post__user') + paginator = create_topic_paginator(topics) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + get_topic_unread_status(forum, page.object_list, request.user) + attach_topic_page_ranges(page.object_list) + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + + can_moderate = perms.can_moderate(forum, request.user) + + return render_to_response('forums/forum_index.html', { + 'forum': forum, + 'feed': feed, + 'page': page, + 'page_nav': page_nav, + 'can_moderate': can_moderate, + }, + context_instance=RequestContext(request)) + + +def topic_index(request, id): + """ + Displays all the posts in a topic. + """ + topic = get_object_or_404(Topic.objects.select_related( + 'forum', 'forum__category', 'last_post'), pk=id) + + if not perms.can_access(topic.forum.category, request.user): + return HttpResponseForbidden() + + topic.view_count = F('view_count') + 1 + topic.save(force_update=True) + + posts = topic.posts.select_related(depth=1) + + paginator = create_post_paginator(posts) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + get_post_unread_status(topic, page.object_list, request.user) + + # Attach user profiles to each post's user to avoid using + # get_user_profile() in the template. 
+ users = set(post.user.id for post in page.object_list) + + profiles = UserProfile.objects.filter(user__id__in=users).select_related() + profile_keys = [profile.id for profile in profiles] + user_profiles = dict((profile.user.id, profile) for profile in profiles) + + last_post_on_page = None + for post in page.object_list: + post.user.user_profile = user_profiles[post.user.id] + post.attach_list = [] + last_post_on_page = post + + # Attach badge ownership info to the user profiles to avoid lots + # of database hits in the template: + bos_qs = BadgeOwnership.objects.filter( + profile__id__in=profile_keys).select_related() + bos = collections.defaultdict(list) + for bo in bos_qs: + bos[bo.profile.id].append(bo) + + for user_id, profile in user_profiles.iteritems(): + profile.badge_ownership = bos[profile.id] + + # Attach any attachments + post_ids = [post.pk for post in page.object_list] + attachments = Attachment.objects.filter(post__in=post_ids).select_related( + 'embed').order_by('order') + + post_dict = dict((post.pk, post) for post in page.object_list) + for item in attachments: + post_dict[item.post.id].attach_list.append(item.embed) + + last_page = page_num == paginator.num_pages + + if request.user.is_authenticated(): + if last_page or last_post_on_page is None: + visit_time = datetime.datetime.now() + else: + visit_time = last_post_on_page.creation_date + _update_last_visit(request.user, topic, visit_time) + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + + can_moderate = perms.can_moderate(topic.forum, request.user) + + can_reply = request.user.is_authenticated() and ( + not topic.locked or can_moderate) + + is_favorite = request.user.is_authenticated() and ( + topic in request.user.favorite_topics.all()) + + is_subscribed = request.user.is_authenticated() and ( + topic in request.user.subscriptions.all()) + + return render_to_response('forums/topic.html', { + 'forum': topic.forum, + 'topic': topic, + 'page': page, + 'page_nav': page_nav, + 'last_page': last_page, + 'can_moderate': can_moderate, + 'can_reply': can_reply, + 'form': NewPostForm(initial={'topic_id': topic.id}), + 'is_favorite': is_favorite, + 'is_subscribed': is_subscribed, + }, + context_instance=RequestContext(request)) + + +def topic_unread(request, id): + """ + This view redirects to the first post the user hasn't read, if we can + figure that out. Otherwise we redirect to the topic. + + """ + topic_url = reverse('forums-topic_index', kwargs={'id': id}) + + if request.user.is_authenticated(): + topic = get_object_or_404(Topic.objects.select_related(depth=1), pk=id) + try: + tlv = TopicLastVisit.objects.get(user=request.user, topic=topic) + except TopicLastVisit.DoesNotExist: + try: + flv = ForumLastVisit.objects.get(user=request.user, + forum=topic.forum) + except ForumLastVisit.DoesNotExist: + return HttpResponseRedirect(topic_url) + else: + last_visit = flv.begin_date + else: + last_visit = tlv.last_visit + + posts = Post.objects.filter(topic=topic, creation_date__gt=last_visit) + if posts: + return _goto_post(posts[0]) + else: + # just go to the last post in the topic + return _goto_post(topic.last_post) + + # user isn't authenticated, just go to the topic + return HttpResponseRedirect(topic_url) + + +def topic_latest(request, id): + """ + This view shows the latest (last) post in a given topic. 
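The badge-ownership grouping above is a defaultdict bucket pattern; a standalone sketch with made-up data:

    import collections

    rows = [(1, 'Gold Badge'), (1, 'Helper'), (2, 'Gold Badge')]  # (profile id, badge)
    by_profile = collections.defaultdict(list)
    for profile_id, badge in rows:
        by_profile[profile_id].append(badge)

    print(dict(by_profile))   # {1: ['Gold Badge', 'Helper'], 2: ['Gold Badge']}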
+ + """ + topic = get_object_or_404(Topic.objects.select_related(depth=1), pk=id) + + if topic.last_post: + return _goto_post(topic.last_post) + + raise Http404 + + +@login_required +def new_topic(request, slug): + """ + This view handles the creation of new topics. + """ + forum = get_object_or_404(Forum.objects.select_related(), slug=slug) + + if not perms.can_access(forum.category, request.user): + return HttpResponseForbidden() + + if request.method == 'POST': + form = NewTopicForm(request.user, forum, request.POST) + if form.is_valid(): + if antispam.utils.spam_check(request, form.cleaned_data['body']): + return HttpResponseRedirect(reverse('antispam-suspended')) + + topic = form.save(request.META.get("REMOTE_ADDR")) + _bump_post_count(request.user) + return HttpResponseRedirect(reverse('forums-new_topic_thanks', + kwargs={'tid': topic.pk})) + else: + form = NewTopicForm(request.user, forum) + + return render_to_response('forums/new_topic.html', { + 'forum': forum, + 'form': form, + }, + context_instance=RequestContext(request)) + + +@login_required +def new_topic_thanks(request, tid): + """ + This view displays the success page for a newly created topic. + """ + topic = get_object_or_404(Topic.objects.select_related(), pk=tid) + return render_to_response('forums/new_topic_thanks.html', { + 'forum': topic.forum, + 'topic': topic, + }, + context_instance=RequestContext(request)) + + +@require_POST +def quick_reply_ajax(request): + """ + This function handles the quick reply to a thread function. This + function is meant to be the target of an AJAX post, and returns + the HTML for the new post, which the client-side script appends + to the document. + """ + if not request.user.is_authenticated(): + return HttpResponseForbidden('Please login or register to post.') + + form = NewPostForm(request.POST) + if form.is_valid(): + if not perms.can_post(form.topic, request.user): + return HttpResponseForbidden("You don't have permission to post in this topic.") + if antispam.utils.spam_check(request, form.cleaned_data['body']): + return HttpResponseForbidden(antispam.BUSTED_MESSAGE) + + post = form.save(request.user, request.META.get("REMOTE_ADDR", "")) + post.unread = True + post.user.user_profile = request.user.get_profile() + post.attach_list = post.attachments.all() + _bump_post_count(request.user) + _update_last_visit(request.user, form.topic, datetime.datetime.now()) + + return render_to_response('forums/display_post.html', { + 'post': post, + 'can_moderate': perms.can_moderate(form.topic.forum, request.user), + 'can_reply': True, + }, + context_instance=RequestContext(request)) + + return HttpResponseBadRequest("Oops, did you forget some text?"); + + +def _goto_post(post): + """ + Calculate what page the given post is on in its parent topic, then + return a redirect to it. + + """ + count = post.topic.posts.filter(creation_date__lt=post.creation_date).count() + page = count / POSTS_PER_PAGE + 1 + url = (reverse('forums-topic_index', kwargs={'id': post.topic.id}) + + '?page=%s#p%s' % (page, post.id)) + return HttpResponseRedirect(url) + + +def goto_post(request, post_id): + """ + This function calculates what page a given post is on, then redirects + to that URL. This function is the target of get_absolute_url() for + Post objects. + """ + post = get_object_or_404(Post.objects.select_related(), pk=post_id) + return _goto_post(post) + + +@require_POST +def flag_post(request): + """ + This function handles the flagging of posts by users. This function should + be the target of an AJAX post. 
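A worked example of the page arithmetic in _goto_post() above, where // stands in for Python 2's integer division used in the view: with POSTS_PER_PAGE = 20, a post preceded by 45 earlier posts lands on page 3, and the redirect URL gets '?page=3#p<post id>' appended.

    POSTS_PER_PAGE = 20

    def page_for(earlier_posts):
        return earlier_posts // POSTS_PER_PAGE + 1

    assert page_for(0) == 1    # first post in the topic
    assert page_for(19) == 1   # last post on page 1
    assert page_for(20) == 2   # first post on page 2
    assert page_for(45) == 3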
+ """ + if not request.user.is_authenticated(): + return HttpResponseForbidden('Please login or register to flag a post.') + + id = request.POST.get('id') + if id is None: + return HttpResponseBadRequest('No post id') + + try: + post = Post.objects.get(pk=id) + except Post.DoesNotExist: + return HttpResponseBadRequest('No post with id %s' % id) + + flag = FlaggedPost(user=request.user, post=post) + flag.save() + email_admins('A Post Has Been Flagged', """Hello, + +A user has flagged a forum post for review. +""") + return HttpResponse('The post was flagged. A moderator will review the post shortly. ' \ + 'Thanks for helping to improve the discussions on this site.') + + +@login_required +def edit_post(request, id): + """ + This view function allows authorized users to edit posts. + The superuser, forum moderators, and original author can edit posts. + """ + post = get_object_or_404(Post.objects.select_related(), pk=id) + + can_moderate = perms.can_moderate(post.topic.forum, request.user) + can_edit = can_moderate or request.user == post.user + + if not can_edit: + return HttpResponseForbidden("You don't have permission to edit that post.") + + topic_name = None + first_post = Post.objects.filter(topic=post.topic).order_by('creation_date')[0] + if first_post.id == post.id: + topic_name = post.topic.name + + if request.method == "POST": + form = PostForm(request.POST, instance=post, topic_name=topic_name) + if form.is_valid(): + if antispam.utils.spam_check(request, form.cleaned_data['body']): + return HttpResponseRedirect(reverse('antispam-suspended')) + post = form.save(commit=False) + post.touch() + post.save() + notify_updated_post(post) + + # if we are editing a first post, save the parent topic as well + if topic_name: + post.topic.save() + notify_updated_topic(post.topic) + + # Save any attachments + form.attach_proc.save_attachments(post) + + return HttpResponseRedirect(post.get_absolute_url()) + else: + form = PostForm(instance=post, topic_name=topic_name) + + post.user.user_profile = post.user.get_profile() + + return render_to_response('forums/edit_post.html', { + 'forum': post.topic.forum, + 'topic': post.topic, + 'post': post, + 'form': form, + 'can_moderate': can_moderate, + }, + context_instance=RequestContext(request)) + + +@require_POST +def delete_post(request): + """ + This view function allows superusers and forum moderators to delete posts. + This function is the target of AJAX calls from the client. + """ + if not request.user.is_authenticated(): + return HttpResponseForbidden('Please login to delete a post.') + + id = request.POST.get('id') + if id is None: + return HttpResponseBadRequest('No post id') + + post = get_object_or_404(Post.objects.select_related(), pk=id) + + if not perms.can_moderate(post.topic.forum, request.user): + return HttpResponseForbidden("You don't have permission to delete that post.") + + delete_single_post(post) + return HttpResponse("The post has been deleted.") + + +def delete_single_post(post): + """ + This function deletes a single post. It handles the case of where + a post is the sole post in a topic by deleting the topic also. It + adjusts any foreign keys in Topic or Forum objects that might be pointing + to this post before deleting the post to avoid a cascading delete. + """ + if post.topic.post_count == 1 and post == post.topic.last_post: + _delete_topic(post.topic) + else: + _delete_post(post) + + +def _delete_post(post): + """ + Internal function to delete a single post object. + Decrements the post author's post count. 
+ Adjusts the parent topic and forum's last_post as needed. + """ + # Adjust post creator's post count + profile = post.user.get_profile() + if profile.forum_post_count > 0: + profile.forum_post_count -= 1 + profile.save(content_update=False) + + # If this post is the last_post in a topic, we need to update + # both the topic and parent forum's last post fields. If we don't + # the cascading delete will delete them also! + + topic = post.topic + if topic.last_post == post: + topic.last_post_pre_delete() + topic.save() + + forum = topic.forum + if forum.last_post == post: + forum.last_post_pre_delete() + forum.save() + + # delete any attachments + post.attachments.clear() + + # Should be safe to delete the post now: + post.delete() + + +def _delete_topic(topic): + """ + Internal function to delete an entire topic. + Deletes the topic and all posts contained within. + Adjusts the parent forum's last_post as needed. + Note that we don't bother adjusting all the users' + post counts as that doesn't seem to be worth the effort. + """ + parent_forum = topic.forum + if parent_forum.last_post and parent_forum.last_post.topic == topic: + parent_forum.last_post_pre_delete(deleting_topic=True) + parent_forum.save() + + # delete subscriptions to this topic + topic.subscribers.clear() + topic.bookmarkers.clear() + + # delete all attachments + posts = Post.objects.filter(topic=topic) + for post in posts: + post.attachments.clear() + + # Null out the topic's last post so we don't have a foreign key pointing + # to a post when we delete posts. + topic.last_post = None + topic.save() + + # delete all posts in bulk + posts.delete() + + # It should be safe to just delete the topic now. + topic.delete() + + # Resync parent forum's post and topic counts + parent_forum.sync() + parent_forum.save() + + +@login_required +def new_post(request, topic_id): + """ + This function is the view for creating a normal, non-quick reply + to a topic. + """ + topic = get_object_or_404(Topic.objects.select_related(), pk=topic_id) + can_post = perms.can_post(topic, request.user) + + if can_post: + if request.method == 'POST': + form = PostForm(request.POST) + if form.is_valid(): + if antispam.utils.spam_check(request, form.cleaned_data['body']): + return HttpResponseRedirect(reverse('antispam-suspended')) + post = form.save(commit=False) + post.topic = topic + post.user = request.user + post.user_ip = request.META.get("REMOTE_ADDR", "") + post.save() + notify_new_post(post) + + # Save any attachments + form.attach_proc.save_attachments(post) + + _bump_post_count(request.user) + _update_last_visit(request.user, topic, datetime.datetime.now()) + return HttpResponseRedirect(post.get_absolute_url()) + else: + quote_id = request.GET.get('quote') + if quote_id: + quote_post = get_object_or_404(Post.objects.select_related(), + pk=quote_id) + form = PostForm(initial={'body': quote_message(quote_post.user.username, + quote_post.body)}) + else: + form = PostForm() + else: + form = None + + return render_to_response('forums/new_post.html', { + 'forum': topic.forum, + 'topic': topic, + 'form': form, + 'can_post': can_post, + }, + context_instance=RequestContext(request)) + + +@login_required +def mod_topic_stick(request, id): + """ + This view function is for moderators to toggle the sticky status of a topic. 
+ """ + topic = get_object_or_404(Topic.objects.select_related(), pk=id) + if perms.can_moderate(topic.forum, request.user): + topic.sticky = not topic.sticky + topic.save() + return HttpResponseRedirect(topic.get_absolute_url()) + + return HttpResponseForbidden() + + +@login_required +def mod_topic_lock(request, id): + """ + This view function is for moderators to toggle the locked status of a topic. + """ + topic = get_object_or_404(Topic.objects.select_related(), pk=id) + if perms.can_moderate(topic.forum, request.user): + topic.locked = not topic.locked + topic.save() + return HttpResponseRedirect(topic.get_absolute_url()) + + return HttpResponseForbidden() + + +@login_required +def mod_topic_delete(request, id): + """ + This view function is for moderators to delete an entire topic. + """ + topic = get_object_or_404(Topic.objects.select_related(), pk=id) + if perms.can_moderate(topic.forum, request.user): + forum_url = topic.forum.get_absolute_url() + _delete_topic(topic) + return HttpResponseRedirect(forum_url) + + return HttpResponseForbidden() + + +@login_required +def mod_topic_move(request, id): + """ + This view function is for moderators to move a topic to a different forum. + """ + topic = get_object_or_404(Topic.objects.select_related(), pk=id) + if not perms.can_moderate(topic.forum, request.user): + return HttpResponseForbidden() + + if request.method == 'POST': + form = MoveTopicForm(request.user, request.POST) + if form.is_valid(): + new_forum = form.cleaned_data['forums'] + old_forum = topic.forum + _move_topic(topic, old_forum, new_forum) + return HttpResponseRedirect(topic.get_absolute_url()) + else: + form = MoveTopicForm(request.user) + + return render_to_response('forums/move_topic.html', { + 'forum': topic.forum, + 'topic': topic, + 'form': form, + }, + context_instance=RequestContext(request)) + + +@login_required +def mod_forum(request, slug): + """ + Displays a view to allow moderators to perform various operations + on topics in a forum in bulk. We currently support mass locking/unlocking, + stickying and unstickying, moving, and deleting topics. 
+ """ + forum = get_object_or_404(Forum.objects.select_related(), slug=slug) + if not perms.can_moderate(forum, request.user): + return HttpResponseForbidden() + + topics = forum.topics.select_related('user', 'last_post', 'last_post__user') + paginator = create_topic_paginator(topics) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + form = None + + if request.method == 'POST': + topic_ids = request.POST.getlist('topic_ids') + url = reverse('forums-mod_forum', kwargs={'slug':forum.slug}) + url += '?page=%s' % page_num + + if len(topic_ids): + if request.POST.get('sticky'): + _bulk_sticky(forum, topic_ids) + return HttpResponseRedirect(url) + elif request.POST.get('lock'): + _bulk_lock(forum, topic_ids) + return HttpResponseRedirect(url) + elif request.POST.get('delete'): + _bulk_delete(forum, topic_ids) + return HttpResponseRedirect(url) + elif request.POST.get('move'): + form = MoveTopicForm(request.user, request.POST, hide_label=True) + if form.is_valid(): + _bulk_move(topic_ids, forum, form.cleaned_data['forums']) + return HttpResponseRedirect(url) + + if form is None: + form = MoveTopicForm(request.user, hide_label=True) + + return render_to_response('forums/mod_forum.html', { + 'forum': forum, + 'page': page, + 'page_nav': page_nav, + 'form': form, + }, + context_instance=RequestContext(request)) + + +@login_required +@require_POST +def catchup_all(request): + """ + This view marks all forums as being read. + """ + forum_ids = Forum.objects.forum_ids_for_user(request.user) + + tlvs = TopicLastVisit.objects.filter(user=request.user, + topic__forum__id__in=forum_ids).delete() + + now = datetime.datetime.now() + ForumLastVisit.objects.filter(user=request.user, + forum__in=forum_ids).update(begin_date=now, end_date=now) + + return HttpResponseRedirect(reverse('forums-index')) + + +@login_required +@require_POST +def forum_catchup(request, slug): + """ + This view marks all the topics in the forum as being read. + """ + forum = get_object_or_404(Forum.objects.select_related(), slug=slug) + + if not perms.can_access(forum.category, request.user): + return HttpResponseForbidden() + + forum.catchup(request.user) + return HttpResponseRedirect(forum.get_absolute_url()) + + +@login_required +def mod_topic_split(request, id): + """ + This view function allows moderators to split posts off to a new topic. 
+ """ + topic = get_object_or_404(Topic.objects.select_related(), pk=id) + if not perms.can_moderate(topic.forum, request.user): + return HttpResponseRedirect(topic.get_absolute_url()) + + if request.method == "POST": + form = SplitTopicForm(request.user, request.POST) + if form.is_valid(): + if form.split_at: + _split_topic_at(topic, form.post_ids[0], + form.cleaned_data['forums'], + form.cleaned_data['name']) + else: + _split_topic(topic, form.post_ids, + form.cleaned_data['forums'], + form.cleaned_data['name']) + + return HttpResponseRedirect(topic.get_absolute_url()) + else: + form = SplitTopicForm(request.user) + + posts = topic.posts.select_related() + + return render_to_response('forums/mod_split_topic.html', { + 'forum': topic.forum, + 'topic': topic, + 'posts': posts, + 'form': form, + }, + context_instance=RequestContext(request)) + + +@login_required +def unread_topics(request): + """Displays the topics with unread posts for a given user.""" + + topics = get_unread_topics(request.user) + + paginator = create_topic_paginator(topics) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + attach_topic_page_ranges(page.object_list) + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + + return render_to_response('forums/topic_list.html', { + 'title': 'Topics With Unread Posts', + 'page': page, + 'page_nav': page_nav, + 'unread': True, + }, + context_instance=RequestContext(request)) + + +def unanswered_topics(request): + """Displays the topics with no replies.""" + + forum_ids = Forum.objects.forum_ids_for_user(request.user) + topics = Topic.objects.filter(forum__id__in=forum_ids, + post_count=1).select_related( + 'forum', 'user', 'last_post', 'last_post__user') + + paginator = create_topic_paginator(topics) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + attach_topic_page_ranges(page.object_list) + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + + return render_to_response('forums/topic_list.html', { + 'title': 'Unanswered Topics', + 'page': page, + 'page_nav': page_nav, + 'unread': False, + }, + context_instance=RequestContext(request)) + + +def active_topics(request, num): + """Displays the last num topics that have been posted to.""" + + # sanity check num + num = min(50, max(10, int(num))) + + # MySQL didn't do this query very well unfortunately... + # + #public_forum_ids = Forum.objects.public_forum_ids() + #topics = Topic.objects.filter(forum__in=public_forum_ids).select_related( + # 'forum', 'user', 'last_post', 'last_post__user').order_by( + # '-update_date')[:num] + + # Save 1 query by using forums.latest to give us a list of the most recent + # topics; forums.latest doesn't save enough info to give us everything we + # need so we hit the database for the rest. 
+ + topic_ids = get_latest_topic_ids(num) + topics = Topic.objects.filter(id__in=topic_ids).select_related( + 'forum', 'user', 'last_post', 'last_post__user').order_by( + '-update_date') + + paginator = create_topic_paginator(topics) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + attach_topic_page_ranges(page.object_list) + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + + title = 'Last %d Active Topics' % num + + return render_to_response('forums/topic_list.html', { + 'title': title, + 'page': page, + 'page_nav': page_nav, + 'unread': False, + }, + context_instance=RequestContext(request)) + + +@login_required +def my_posts(request): + """Displays a list of posts the requesting user made.""" + return _user_posts(request, request.user, request.user, 'My Posts') + + +@login_required +def posts_for_user(request, username): + """Displays a list of posts by the given user. + Only the forums that the requesting user can see are examined. + """ + target_user = get_object_or_404(User, username=username) + return _user_posts(request, target_user, request.user, 'Posts by %s' % username) + + +@login_required +def post_ip_info(request, post_id): + """Displays information about the IP address the post was made from.""" + post = get_object_or_404(Post.objects.select_related(), pk=post_id) + + if not perms.can_moderate(post.topic.forum, request.user): + return HttpResponseForbidden("You don't have permission for this post.") + + ip_users = sorted(set(Post.objects.filter( + user_ip=post.user_ip).values_list('user__username', flat=True))) + + return render_to_response('forums/post_ip.html', { + 'post': post, + 'ip_users': ip_users, + }, + context_instance=RequestContext(request)) + + +def _user_posts(request, target_user, req_user, page_title): + """Displays a list of posts made by the target user. + req_user is the user trying to view the posts. Only the forums + req_user can see are searched. + """ + forum_ids = Forum.objects.forum_ids_for_user(req_user) + posts = Post.objects.filter(user=target_user, + topic__forum__id__in=forum_ids).order_by( + '-creation_date').select_related() + + paginator = create_post_paginator(posts) + page_num = get_page_num(request) + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + # we do this for the template since it is rendered twice + page_nav = render_to_string('forums/pagination.html', {'page': page}) + + return render_to_response('forums/post_list.html', { + 'title': page_title, + 'page': page, + 'page_nav': page_nav, + }, + context_instance=RequestContext(request)) + + +def _bump_post_count(user): + """ + Increments the forum_post_count for the given user. + """ + profile = user.get_profile() + profile.forum_post_count += 1 + profile.save(content_update=False) + + +def _move_topic(topic, old_forum, new_forum): + if new_forum != old_forum: + topic.forum = new_forum + topic.save() + # Have to adjust foreign keys to last_post, denormalized counts, etc.: + old_forum.sync() + old_forum.save() + new_forum.sync() + new_forum.save() + + +def _bulk_sticky(forum, topic_ids): + """ + Performs a toggle on the sticky status for a given list of topic ids. 
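The _bulk_* helpers that follow share one guard: a topic id submitted from the moderation form is only acted on if the topic actually belongs to the forum being moderated. Stripped of the ORM, the sticky/lock toggle reduces to something like this (plain dicts stand in for Topic objects):

def bulk_toggle(topics, forum, attr):
    """Flip the named boolean on every topic that belongs to the given forum."""
    for topic in topics:
        if topic['forum'] == forum:
            topic[attr] = not topic[attr]


topics = [{'forum': 'gigs', 'sticky': False}, {'forum': 'off-topic', 'sticky': False}]
bulk_toggle(topics, 'gigs', 'sticky')
assert topics[0]['sticky'] is True
assert topics[1]['sticky'] is False   # ids from other forums are ignored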
+ """ + topics = Topic.objects.filter(pk__in=topic_ids) + for topic in topics: + if topic.forum == forum: + topic.sticky = not topic.sticky + topic.save() + + +def _bulk_lock(forum, topic_ids): + """ + Performs a toggle on the locked status for a given list of topic ids. + """ + topics = Topic.objects.filter(pk__in=topic_ids) + for topic in topics: + if topic.forum == forum: + topic.locked = not topic.locked + topic.save() + + +def _bulk_delete(forum, topic_ids): + """ + Deletes the list of topics. + """ + # Because we are deleting stuff, retrieve each topic one at a + # time since we are going to be adjusting de-normalized fields + # during deletes. In particular, we can't do this: + # topics = Topic.objects.filter(pk__in=topic_ids).select_related() + # for topic in topics: + # since topic.forum.last_post can go stale after a delete. + + for id in topic_ids: + try: + topic = Topic.objects.select_related().get(pk=id) + except Topic.DoesNotExist: + continue + _delete_topic(topic) + + +def _bulk_move(topic_ids, old_forum, new_forum): + """ + Moves the list of topics to a new forum. + """ + topics = Topic.objects.filter(pk__in=topic_ids).select_related() + for topic in topics: + if topic.forum == old_forum: + _move_topic(topic, old_forum, new_forum) + + +def _update_last_visit(user, topic, visit_time): + """ + Does the bookkeeping for the last visit status for the user to the + topic/forum. + """ + now = datetime.datetime.now() + try: + flv = ForumLastVisit.objects.get(user=user, forum=topic.forum) + except ForumLastVisit.DoesNotExist: + flv = ForumLastVisit(user=user, forum=topic.forum) + flv.begin_date = now + + flv.end_date = now + flv.save() + + if topic.update_date > flv.begin_date: + try: + tlv = TopicLastVisit.objects.get(user=user, topic=topic) + except TopicLastVisit.DoesNotExist: + tlv = TopicLastVisit(user=user, topic=topic, last_visit=datetime.datetime.min) + + if visit_time > tlv.last_visit: + tlv.last_visit = visit_time + tlv.save() + + +def _split_topic_at(topic, post_id, new_forum, new_name): + """ + This function splits the post given by post_id and all posts that come + after it in the given topic to a new topic in a new forum. + It is assumed the caller has been checked for moderator rights. + """ + post = get_object_or_404(Post, id=post_id) + if post.topic == topic: + post_ids = Post.objects.filter(topic=topic, + creation_date__gte=post.creation_date).values_list('id', flat=True) + _split_topic(topic, post_ids, new_forum, new_name) + + +def _split_topic(topic, post_ids, new_forum, new_name): + """ + This function splits the posts given by the post_ids list in the + given topic to a new topic in a new forum. + It is assumed the caller has been checked for moderator rights. + """ + posts = Post.objects.filter(topic=topic, id__in=post_ids) + if len(posts) > 0: + new_topic = Topic(forum=new_forum, name=new_name, user=posts[0].user) + new_topic.save() + notify_new_topic(new_topic) + for post in posts: + post.topic = new_topic + post.save() + + topic.post_count_update() + topic.save() + new_topic.post_count_update() + new_topic.save() + topic.forum.sync() + topic.forum.save() + new_forum.sync() + new_forum.save() diff -r c525f3e0b5d0 -r ee87ea74d46b forums/views/spam.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/views/spam.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,119 @@ +""" +This module contains views for dealing with spam and spammers. 
+""" +import datetime +import logging +import textwrap + +from django.contrib.auth.decorators import login_required +from django.core.urlresolvers import reverse +from django.http import HttpResponseRedirect +from django.shortcuts import get_object_or_404 +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.contrib.auth.models import User + +from forums.models import Post +import forums.permissions as perms +import bio.models +from core.functions import email_admins +from antispam.utils import deactivate_spammer + + +SPAMMER_NAILED_SUBJECT = "Spammer Nailed: %s" +SPAMMER_NAILED_MSG_BODY = """ +The admin/moderator user %s has just deactivated the account of %s for spam. +""" + + +def promote_stranger(user): + """This function upgrades the user from stranger status to a regular user. + """ + profile = user.get_profile() + if user.is_active and profile.status == bio.models.STA_STRANGER: + profile.status = bio.models.STA_ACTIVE + profile.status_date = datetime.datetime.now() + profile.save(content_update=False) + + +@login_required +def spammer(request, post_id): + """This view allows moderators to deactivate spammer accounts.""" + + post = get_object_or_404(Post.objects.select_related(), pk=post_id) + poster = post.user + poster_profile = poster.get_profile() + + can_moderate = perms.can_moderate(post.topic.forum, request.user) + can_deactivate = (poster_profile.status == bio.models.STA_STRANGER and not + poster.is_superuser) + + if request.method == "POST" and can_moderate and can_deactivate: + deactivate_spammer(poster) + + email_admins(SPAMMER_NAILED_SUBJECT % poster.username, + SPAMMER_NAILED_MSG_BODY % ( + request.user.username, poster.username)) + + logging.info(textwrap.dedent("""\ + SPAMMER DEACTIVATED: %s nailed %s for spam. + IP: %s + Message: + %s + """), + request.user.username, poster.username, post.user_ip, post.body) + + return HttpResponseRedirect(reverse('forums-spammer_nailed', args=[ + poster.id])) + + return render_to_response('forums/spammer.html', { + 'can_moderate': can_moderate, + 'can_deactivate': can_deactivate, + 'post': post, + }, + context_instance=RequestContext(request)) + + +@login_required +def spammer_nailed(request, spammer_id): + """This view presents a confirmation screen that the spammer has been + deactivated. + """ + user = get_object_or_404(User, pk=spammer_id) + profile = user.get_profile() + + success = not user.is_active and profile.status == bio.models.STA_SPAMMER + + return render_to_response('forums/spammer_nailed.html', { + 'spammer': user, + 'success': success, + }, + context_instance=RequestContext(request)) + + +@login_required +def stranger(request, post_id): + """This view allows a forum moderator or super user to promote a user from + stranger status to regular user. 
+ """ + post = get_object_or_404(Post.objects.select_related(), pk=post_id) + poster = post.user + poster_profile = poster.get_profile() + + can_moderate = perms.can_moderate(post.topic.forum, request.user) + can_promote = poster_profile.status == bio.models.STA_STRANGER + + if request.method == "POST" and can_moderate and can_promote: + promote_stranger(poster) + + logging.info("STRANGER PROMOTED: %s promoted %s.", + request.user.username, poster.username) + + return HttpResponseRedirect(post.get_absolute_url()) + + return render_to_response('forums/stranger.html', { + 'can_moderate': can_moderate, + 'can_promote': can_promote, + 'post': post, + }, + context_instance=RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b forums/views/subscriptions.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/forums/views/subscriptions.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,122 @@ +"""This module handles the subscriptions of users to forum topics.""" +from django.conf import settings +from django.contrib.auth.decorators import login_required +from django.contrib.sites.models import Site +from django.core.paginator import InvalidPage +from django.core.urlresolvers import reverse +from django.http import HttpResponseRedirect +from django.http import Http404 +from django.template.loader import render_to_string +from django.shortcuts import get_object_or_404 +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.views.decorators.http import require_POST + +from forums.models import Topic +import forums.permissions as perms +from core.functions import send_mail +from core.paginator import DiggPaginator + + +def notify_topic_subscribers(post, defer=True): + """ + The argument post is a newly created post. Send out an email + notification to all subscribers of the post's parent Topic. + + The defer flag is passed to core.functions.send_mail. If True, the mail is + sent on a Celery task. If False, the mail is sent on the caller's thread. 
+ """ + topic = post.topic + recipients = topic.subscribers.exclude(id=post.user.id).values_list( + 'email', flat=True) + + if recipients: + site = Site.objects.get_current() + subject = "[%s] Topic Reply: %s" % (site.name, topic.name) + url_prefix = "http://%s" % site.domain + post_url = url_prefix + post.get_absolute_url() + unsubscribe_url = url_prefix + reverse("forums-manage_subscriptions") + msg = render_to_string("forums/topic_notify_email.txt", { + 'poster': post.user.username, + 'topic_name': topic.name, + 'message': post.body, + 'post_url': post_url, + 'unsubscribe_url': unsubscribe_url, + }) + for recipient in recipients: + send_mail(subject, msg, settings.DEFAULT_FROM_EMAIL, [recipient], + defer=defer) + + +@login_required +@require_POST +def subscribe_topic(request, topic_id): + """Subscribe the user to the requested topic.""" + topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) + if perms.can_access(topic.forum.category, request.user): + topic.subscribers.add(request.user) + return HttpResponseRedirect( + reverse("forums-subscription_status", args=[topic.id])) + raise Http404 + + +@login_required +@require_POST +def unsubscribe_topic(request, topic_id): + """Unsubscribe the user to the requested topic.""" + topic = get_object_or_404(Topic, id=topic_id) + topic.subscribers.remove(request.user) + return HttpResponseRedirect( + reverse("forums-subscription_status", args=[topic.id])) + + +@login_required +def subscription_status(request, topic_id): + """Display the subscription status for the given topic.""" + topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) + is_subscribed = request.user in topic.subscribers.all() + return render_to_response('forums/subscription_status.html', { + 'topic': topic, + 'is_subscribed': is_subscribed, + }, + context_instance=RequestContext(request)) + + +@login_required +def manage_subscriptions(request): + """Display a user's topic subscriptions, and allow them to be deleted.""" + + user = request.user + if request.method == "POST": + if request.POST.get('delete_all'): + user.subscriptions.clear() + else: + delete_ids = request.POST.getlist('delete_ids') + try: + delete_ids = [int(id) for id in delete_ids] + except ValueError: + raise Http404 + + for topic in user.subscriptions.filter(id__in=delete_ids): + user.subscriptions.remove(topic) + + return HttpResponseRedirect(reverse("forums-manage_subscriptions")) + + page_num = request.GET.get('page', 1) + topics = user.subscriptions.select_related().order_by('-update_date') + paginator = DiggPaginator(topics, 20, body=5, tail=2, margin=3, padding=2) + try: + page_num = int(page_num) + except ValueError: + page_num = 1 + try: + page = paginator.page(page_num) + except InvalidPage: + raise Http404 + + return render_to_response('forums/manage_topics.html', { + 'page_title': 'Topic Subscriptions', + 'description': 'The forum topics you are currently subscribed to are listed below.', + 'page': page, + }, + context_instance=RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,152 @@ +""" +This file contains the automatic admin site definitions for the gcalendar application. 
+ +""" +from django.conf import settings +from django.conf.urls import patterns, url +from django.contrib import admin +from django.contrib import messages +from django.contrib.sites.models import Site +from django.core.urlresolvers import reverse +from django.http import HttpResponseRedirect +from django.shortcuts import render + +import gdata.client + +from gcalendar.models import Event, AccessToken +from gcalendar.calendar import Calendar, CalendarError +from gcalendar import oauth + +import bio.badges + + +SCOPES = ['https://www.google.com/calendar/feeds/'] + + +class EventAdmin(admin.ModelAdmin): + list_display = ('what', 'user', 'start_date', 'where', 'date_submitted', + 'status', 'is_approved', 'google_html') + list_filter = ('start_date', 'status') + date_hierarchy = 'start_date' + search_fields = ('what', 'where', 'description') + raw_id_fields = ('user', ) + exclude = ('html', 'google_id', 'google_url') + save_on_top = True + actions = ('approve_events', ) + + pending_states = { + Event.NEW: Event.NEW_APRV, + Event.EDIT_REQ: Event.EDIT_APRV, + Event.DEL_REQ: Event.DEL_APRV, + } + + def get_urls(self): + urls = super(EventAdmin, self).get_urls() + my_urls = patterns('', + url(r'^google_sync/$', + self.admin_site.admin_view(self.google_sync), + name="gcalendar-google_sync"), + url(r'^fetch_auth/$', + self.admin_site.admin_view(self.fetch_auth), + name="gcalendar-fetch_auth"), + url(r'^get_access_token/$', + self.admin_site.admin_view(self.get_access_token), + name="gcalendar-get_access_token"), + ) + return my_urls + urls + + def approve_events(self, request, qs): + """ + Ratchets the selected events forward to the approved state. + Ignores events that aren't in the proper state. + """ + count = 0 + for event in qs: + if event.status in self.pending_states: + event.status = self.pending_states[event.status] + event.save() + count += 1 + + if event.status == Event.NEW_APRV: + bio.badges.award_badge(bio.badges.CALENDAR_PIN, event.user) + + msg = "1 event was" if count == 1 else "%d events were" % count + msg += " approved." + self.message_user(request, msg) + + approve_events.short_description = "Approve selected events" + + def google_sync(self, request): + """ + View to synchronize approved event changes with Google calendar. + + """ + # Get pending events + events = Event.pending_events.all() + + # Attempt to get saved access token to the Google calendar + access_token = AccessToken.objects.get_token().access_token() + + messages = [] + err_msg = '' + if request.method == 'POST': + if access_token: + try: + cal = Calendar(source=oauth.USER_AGENT, + calendar_id=settings.GCAL_CALENDAR_ID, + access_token=access_token) + cal.sync_events(events) + except CalendarError, e: + err_msg = e.msg + events = Event.pending_events.all() + else: + messages.append('All events processed successfully.') + events = Event.objects.none() + + return render(request, 'gcalendar/google_sync.html', { + 'current_app': self.admin_site.name, + 'access_token': access_token, + 'messages': messages, + 'err_msg': err_msg, + 'events': events, + }) + + def fetch_auth(self, request): + """ + This view fetches a request token and then redirects the user to + authorize it. 
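approve_events() above only ratchets an event forward when its status is one of the three pending states; anything else is left untouched. The transition table, condensed to plain Python:

# Status codes mirror gcalendar.models.Event
NEW, NEW_APRV, EDIT_REQ, EDIT_APRV, DEL_REQ, DEL_APRV, ON_CAL = range(7)

PENDING_STATES = {NEW: NEW_APRV, EDIT_REQ: EDIT_APRV, DEL_REQ: DEL_APRV}


def approve(status):
    """Advance an approvable status; leave anything else unchanged."""
    return PENDING_STATES.get(status, status)


assert approve(NEW) == NEW_APRV
assert approve(DEL_REQ) == DEL_APRV
assert approve(ON_CAL) == ON_CAL   # already on the calendar: no change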
+ + """ + site = Site.objects.get_current() + callback_url = 'http://%s%s' % (site.domain, + reverse('admin:gcalendar-get_access_token')) + try: + auth_url = oauth.fetch_auth(request, SCOPES, callback_url) + except gdata.client.Error, e: + messages.error(request, str(e)) + return HttpResponseRedirect(reverse('admin:gcalendar-google_sync')) + else: + return HttpResponseRedirect(auth_url) + + def get_access_token(self, request): + """ + This view is called by Google after the user has authorized us access to + their data. We call into the oauth module to upgrade the oauth token to + an access token. We then save the access token in the database and + redirect back to our admin Google sync view. + + """ + try: + access_token = oauth.get_access_token(request) + except gdata.client.Error, e: + messages.error(request, str(e)) + else: + token = AccessToken.objects.get_token() + token.update(access_token) + token.save() + + return HttpResponseRedirect(reverse('admin:gcalendar-google_sync')) + + +admin.site.register(Event, EventAdmin) +admin.site.register(AccessToken) diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/calendar.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/calendar.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,148 @@ +""" +This file contains the calendar class wich abstracts the Google gdata API for working with +Google Calendars. + +""" +import datetime +import pytz + +from django.utils.tzinfo import FixedOffset +from gdata.calendar.client import CalendarClient +from gdata.calendar.data import (CalendarEventEntry, CalendarEventFeed, + CalendarWhere, When, EventWho) +import atom.data + +from gcalendar.models import Event + + +class CalendarError(Exception): + def __init__(self, msg): + self.msg = msg + + def __str__(self): + return repr(self.msg) + + +class Calendar(object): + DATE_FMT = '%Y-%m-%d' + DATE_TIME_FMT = DATE_FMT + 'T%H:%M:%S' + DATE_TIME_TZ_FMT = DATE_TIME_FMT + '.000Z' + + def __init__(self, source=None, calendar_id='default', access_token=None): + self.client = CalendarClient(source=source, auth_token=access_token) + + self.insert_feed = ('https://www.google.com/calendar/feeds/' + '%s/private/full' % calendar_id) + self.batch_feed = '%s/batch' % self.insert_feed + + def sync_events(self, qs): + request_feed = CalendarEventFeed() + for model in qs: + if model.status == Event.NEW_APRV: + event = CalendarEventEntry() + request_feed.AddInsert(entry=self._populate_event(model, event)) + elif model.status == Event.EDIT_APRV: + event = self._retrieve_event(model) + request_feed.AddUpdate(entry=self._populate_event(model, event)) + elif model.status == Event.DEL_APRV: + event = self._retrieve_event(model) + request_feed.AddDelete(entry=event) + else: + assert False, 'unexpected status in sync_events' + + try: + response_feed = self.client.ExecuteBatch(request_feed, self.batch_feed) + except Exception, e: + raise CalendarError('ExecuteBatch exception: %s' % e) + + err_msgs = [] + for entry in response_feed.entry: + i = int(entry.batch_id.text) + code = int(entry.batch_status.code) + + error = False + if qs[i].status == Event.NEW_APRV: + if code == 201: + qs[i].status = Event.ON_CAL + qs[i].google_id = entry.GetEditLink().href + qs[i].google_url = entry.GetHtmlLink().href + qs[i].save() + qs[i].notify_on_calendar() + else: + error = True + + elif qs[i].status == Event.EDIT_APRV: + if code == 200: + qs[i].status = Event.ON_CAL + qs[i].save() + else: + error = True + + elif qs[i].status == Event.DEL_APRV: + if code == 200: + qs[i].delete() + else: + error = True + + 
if error:
+                err_msgs.append('%s - (%d) %s' % (
+                    qs[i].what, code, entry.batch_status.reason))
+
+        if len(err_msgs) > 0:
+            raise CalendarError(', '.join(err_msgs))
+
+    def _retrieve_event(self, model):
+        try:
+            event = self.client.GetEventEntry(model.google_id)
+        except Exception, e:
+            raise CalendarError('Could not retrieve event from Google: %s, %s' \
+                % (model.what, e))
+        return event
+
+    def _populate_event(self, model, event):
+        """Populates a gdata event from an Event model object."""
+        event.title = atom.data.Title(text=model.what)
+        event.content = atom.data.Content(text=model.html)
+        event.where = [CalendarWhere(value=model.where)]
+        event.who = [EventWho(email=model.user.email)]
+
+        if model.all_day:
+            start_time = self._make_time(model.start_date)
+            if model.start_date == model.end_date:
+                end_time = None
+            else:
+                end_time = self._make_time(model.end_date)
+        else:
+            start_time = self._make_time(model.start_date, model.start_time, model.time_zone)
+            end_time = self._make_time(model.end_date, model.end_time, model.time_zone)
+
+        event.when = [When(start=start_time, end=end_time)]
+        return event
+
+    def _make_time(self, date, time=None, tz_name=None):
+        """
+        Returns the gdata formatted date/time string given a date, optional time,
+        and optional time zone name (e.g. 'US/Pacific'). If the time zone name is None,
+        no time zone info will be added to the string.
+        """
+
+        if time is not None:
+            d = datetime.datetime.combine(date, time)
+        else:
+            d = datetime.datetime(date.year, date.month, date.day)
+
+        if time is None:
+            s = d.strftime(self.DATE_FMT)
+        elif tz_name is None:
+            s = d.strftime(self.DATE_TIME_FMT)
+        else:
+            try:
+                tz = pytz.timezone(tz_name)
+            except pytz.UnknownTimeZoneError:
+                raise CalendarError('Invalid time zone: %s' % (tz_name,))
+            local = tz.localize(d)
+            zulu = local.astimezone(FixedOffset(0))
+            s = zulu.strftime(self.DATE_TIME_TZ_FMT)
+
+        return s
+
diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/forms.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/gcalendar/forms.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,157 @@
+"""
+Forms for the gcalendar application.
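The hand-written half-hour TIME_CHOICES table that opens forms.py below could equally be generated; a rough equivalent (note that noon comes out as '12:00 pm' here, while the hand-written table labels it 'am'):

import datetime


def make_time_choices(step_minutes=30):
    """Generate (value, label) pairs like ('19:00', '7:00 pm (19:00)') for a full day."""
    choices = []
    t = datetime.datetime(2000, 1, 1)
    end = t + datetime.timedelta(days=1)
    while t < end:
        value = t.strftime('%H:%M')
        label = '%s (%s)' % (t.strftime('%I:%M %p').lstrip('0').lower(), value)
        choices.append((value, label))
        t += datetime.timedelta(minutes=step_minutes)
    return tuple(choices)


assert make_time_choices()[0] == ('00:00', '12:00 am (00:00)')
assert make_time_choices()[38] == ('19:00', '7:00 pm (19:00)')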
+""" +import datetime +import pytz +from django import forms +from django.conf import settings + +from gcalendar.models import Event + + +TIME_CHOICES = ( + ('00:00', '12:00 am (00:00)'), + ('00:30', '12:30 am (00:30)'), + ('01:00', '1:00 am (01:00)'), + ('01:30', '1:30 am (01:30)'), + ('02:00', '2:00 am (02:00)'), + ('02:30', '2:30 am (02:30)'), + ('03:00', '3:00 am (03:00)'), + ('03:30', '3:30 am (03:30)'), + ('04:00', '4:00 am (04:00)'), + ('04:30', '4:30 am (04:30)'), + ('05:00', '5:00 am (05:00)'), + ('05:30', '5:30 am (05:30)'), + ('06:00', '6:00 am (06:00)'), + ('06:30', '6:30 am (06:30)'), + ('07:00', '7:00 am (07:00)'), + ('07:30', '7:30 am (07:30)'), + ('08:00', '8:00 am (08:00)'), + ('08:30', '8:30 am (08:30)'), + ('09:00', '9:00 am (09:00)'), + ('09:30', '9:30 am (09:30)'), + ('10:00', '10:00 am (10:00)'), + ('10:30', '10:30 am (10:30)'), + ('11:00', '11:00 am (11:00)'), + ('11:30', '11:30 am (11:30)'), + ('12:00', '12:00 am (12:00)'), + ('12:30', '12:30 am (12:30)'), + ('13:00', '1:00 pm (13:00)'), + ('13:30', '1:30 pm (13:30)'), + ('14:00', '2:00 pm (14:00)'), + ('14:30', '2:30 pm (14:30)'), + ('15:00', '3:00 pm (15:00)'), + ('15:30', '3:30 pm (15:30)'), + ('16:00', '4:00 pm (16:00)'), + ('16:30', '4:30 pm (16:30)'), + ('17:00', '5:00 pm (17:00)'), + ('17:30', '5:30 pm (17:30)'), + ('18:00', '6:00 pm (18:00)'), + ('18:30', '6:30 pm (18:30)'), + ('19:00', '7:00 pm (19:00)'), + ('19:30', '7:30 pm (19:30)'), + ('20:00', '8:00 pm (20:00)'), + ('20:30', '8:30 pm (20:30)'), + ('21:00', '9:00 pm (21:00)'), + ('21:30', '9:30 pm (21:30)'), + ('22:00', '10:00 pm (22:00)'), + ('22:30', '10:30 pm (22:30)'), + ('23:00', '11:00 pm (23:00)'), + ('23:30', '11:30 pm (23:30)'), +) + + +class EventEntryForm(forms.ModelForm): + what = forms.CharField(widget=forms.TextInput(attrs={'size': 60})) + start_date = forms.DateField(widget=forms.TextInput(attrs={'size': 10})) + start_time = forms.TimeField(required=False, widget=forms.Select(choices=TIME_CHOICES)) + end_date = forms.DateField(widget=forms.TextInput(attrs={'size': 10})) + end_time = forms.TimeField(required=False, widget=forms.Select(choices=TIME_CHOICES)) + time_zone = forms.CharField(required=False, widget=forms.HiddenInput()) + where = forms.CharField(required=False, widget=forms.TextInput(attrs={'size': 60})) + description = forms.CharField(required=False, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + + DATE_FORMAT = '%m/%d/%Y' # must match the jQuery UI datepicker config + TIME_FORMAT = '%H:%M' + DEFAULT_START_TIME = '19:00' + DEFAULT_END_TIME = '20:00' + + class Meta: + model = Event + fields = ('what', 'start_date', 'start_time', 'end_date', 'end_time', + 'all_day', 'time_zone', 'where', 'description', 'create_forum_thread') + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui'] + + ['css/gcalendar.css']) + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui'] + + ['js/timezone.js', 'js/gcalendar.js']) + + def __init__(self, *args, **kwargs): + initial = kwargs.get('initial', {}) + instance = kwargs.get('instance', None) + + if len(args) == 0: # no POST arguments + if instance is None: + init_day = datetime.date.today().strftime(self.DATE_FORMAT) + if 'start_date' not in initial: + initial['start_date'] = init_day + if 'end_date' not in initial: + initial['end_date'] = init_day + if 'start_time' not in initial: + initial['start_time'] = self.DEFAULT_START_TIME + if 'end_time' not in initial: + 
initial['end_time'] = self.DEFAULT_END_TIME + else: + initial['start_date'] = instance.start_date.strftime(self.DATE_FORMAT) + initial['end_date'] = instance.end_date.strftime(self.DATE_FORMAT) + if instance.all_day: + initial['start_time'] = self.DEFAULT_START_TIME + initial['end_time'] = self.DEFAULT_END_TIME + else: + if 'start_time' not in initial: + initial['start_time'] = instance.start_time.strftime(self.TIME_FORMAT) + if 'end_time' not in initial: + initial['end_time'] = instance.end_time.strftime(self.TIME_FORMAT) + + kwargs['initial'] = initial + + super(EventEntryForm, self).__init__(*args, **kwargs) + + # We don't want the user to create a forum thread on an existing event + if instance is not None: + del self.fields['create_forum_thread'] + + def clean(self): + start_date = self.cleaned_data.get('start_date') + start_time = self.cleaned_data.get('start_time') + all_day = self.cleaned_data.get('all_day') + end_date = self.cleaned_data.get('end_date') + end_time = self.cleaned_data.get('end_time') + + if start_date and start_time and (all_day or (end_date and end_time)): + if all_day: + start = start_date + end = end_date + else: + start = datetime.datetime.combine(start_date, start_time) + end = datetime.datetime.combine(end_date, end_time) + if start > end: + raise forms.ValidationError("The start date of the event " + "is after the ending time!") + + return self.cleaned_data + + def clean_time_zone(self): + tz = self.cleaned_data['time_zone'] + try: + pytz.timezone(tz) + except pytz.UnknownTimeZoneError: + raise forms.ValidationError("Invalid timezone.") + return tz + diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,159 @@ +""" +Models for the gcalendar application. 
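clean_time_zone() above only checks that the name is known to pytz; Calendar._make_time() later uses the same name to localize the start and end and emit the UTC ('.000Z') timestamp Google expects. A standalone sketch of that conversion, using pytz.utc in place of Django's FixedOffset(0):

import datetime

import pytz


def to_zulu(date, time, tz_name):
    """Interpret date+time in tz_name and format it as a gdata-style UTC string."""
    tz = pytz.timezone(tz_name)                      # raises UnknownTimeZoneError if bad
    local = tz.localize(datetime.datetime.combine(date, time))
    return local.astimezone(pytz.utc).strftime('%Y-%m-%dT%H:%M:%S.000Z')


# 7:00 pm Pacific on May 5, 2012 is 02:00 UTC the next day (PDT is UTC-7).
assert to_zulu(datetime.date(2012, 5, 5), datetime.time(19, 0),
               'US/Pacific') == '2012-05-06T02:00:00.000Z'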
+
+"""
+import datetime
+
+from django.db import models
+from django.db.models import Q
+from django.contrib.auth.models import User
+
+from core.markup import site_markup
+import forums.tools
+from gcalendar.oauth import serialize_token, deserialize_token
+
+
+GIG_FORUM_SLUG = "gigs"
+
+class PendingEventManager(models.Manager):
+    """A manager for pending events."""
+
+    def get_query_set(self):
+        """Returns a queryset of events that have been approved to update
+        the Google calendar."""
+        return super(PendingEventManager, self).get_query_set().filter(
+            Q(status=Event.NEW_APRV) |
+            Q(status=Event.EDIT_APRV) |
+            Q(status=Event.DEL_APRV)
+        )
+
+
+class Event(models.Model):
+    """Model to represent calendar events."""
+
+    # Event status codes:
+    (NEW, NEW_APRV, EDIT_REQ, EDIT_APRV, DEL_REQ, DEL_APRV, ON_CAL) = range(7)
+
+    STATUS_CHOICES = (
+        (NEW, 'New'),
+        (NEW_APRV, 'New Approved'),
+        (EDIT_REQ, 'Edit Request'),
+        (EDIT_APRV, 'Edit Approved'),
+        (DEL_REQ, 'Delete Request'),
+        (DEL_APRV, 'Delete Approved'),
+        (ON_CAL, 'On Calendar'),
+    )
+
+    user = models.ForeignKey(User)
+    what = models.CharField(max_length=255)
+    start_date = models.DateField()
+    start_time = models.TimeField(null=True, blank=True)
+    end_date = models.DateField()
+    end_time = models.TimeField(null=True, blank=True)
+    time_zone = models.CharField(max_length=64, blank=True)
+    all_day = models.BooleanField(default=False)
+    where = models.CharField(max_length=255, blank=True)
+    description = models.TextField(blank=True)
+    html = models.TextField(blank=True)
+    date_submitted = models.DateTimeField(auto_now_add=True)
+    google_id = models.CharField(max_length=255, blank=True)
+    google_url = models.URLField(max_length=255, blank=True)
+    status = models.SmallIntegerField(choices=STATUS_CHOICES, default=NEW,
+            db_index=True)
+    create_forum_thread = models.BooleanField(default=False)
+
+    objects = models.Manager()
+    pending_events = PendingEventManager()
+
+    def __unicode__(self):
+        return self.what
+
+    class Meta:
+        ordering = ('-date_submitted', )
+
+    def save(self, *args, **kwargs):
+        self.html = site_markup(self.description)
+        super(Event, self).save(*args, **kwargs)
+
+    def is_approved(self):
+        return self.status not in (self.NEW, self.EDIT_REQ, self.DEL_REQ)
+    is_approved.boolean = True
+
+    def google_html(self):
+        """Returns an HTML link to the event if it exists."""
+        if self.google_url:
+            return u'<a href="%s">On Google</a>' % self.google_url
+        return u''
+    google_html.allow_tags = True
+    google_html.short_description = 'Google Link'
+
+    def notify_on_calendar(self):
+        """
+        This function should be called when the event has been added to the
+        Google calendar for the first time. This gives us a chance to perform
+        any first-time processing, like creating a forum thread.
+        """
+        if self.create_forum_thread:
+            topic_name = '%s: %s' % (self.start_date.strftime('%m/%d/%Y'),
+                self.what)
+            post_body = "%s\n\n[Link to event on Google Calendar](%s)" % (
+                self.description, self.google_url)
+
+            forums.tools.create_topic(
+                forum_slug=GIG_FORUM_SLUG,
+                user=self.user,
+                topic_name=topic_name,
+                post_body=post_body)
+
+            self.create_forum_thread = False
+            self.save()
+
+
+class AccessTokenManager(models.Manager):
+    """
+    A manager for the AccessToken table. Only one access token is saved in the
+    database. This manager provides a convenience method to either return that
+    access token or a brand new one.
+ + """ + def get_token(self): + try: + token = self.get(pk=1) + except AccessToken.DoesNotExist: + token = AccessToken() + + return token + + +class AccessToken(models.Model): + """ + This model represents serialized OAuth access tokens for reading and + updating the Google Calendar. + + """ + auth_date = models.DateTimeField() + token = models.TextField() + + objects = AccessTokenManager() + + def __unicode__(self): + return u'Access token created on ' + unicode(self.auth_date) + + def update(self, access_token, auth_date=None): + """ + This function updates the AccessToken object with the input parameters: + access_token - an access token from Google's OAuth dance + auth_date - a datetime or None. If None, now() is used. + + """ + self.auth_date = auth_date if auth_date else datetime.datetime.now() + self.token = serialize_token(access_token) + + def access_token(self): + """ + This function returns a Google OAuth access token by deserializing the + token field from the database. + If the token attribute is empty, None is returned. + + """ + return deserialize_token(self.token) if self.token else None diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/oauth.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/oauth.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,99 @@ +""" +This module handles the OAuth integration with Google. + +""" +from __future__ import with_statement +import logging + +import gdata.gauth +from gdata.calendar_resource.client import CalendarResourceClient + +from django.conf import settings + + +logger = logging.getLogger(__name__) +USER_AGENT = 'surfguitar101-gcalendar-v1' +REQ_TOKEN_SESSION_KEY = 'gcalendar oauth request token' + + +def fetch_auth(request, scopes, callback_url): + """ + This function fetches a request token from Google and stores it in the + session. It then returns the authorization URL as a string. + + request - the HttpRequest object for the user requesting the token. The + token is stored in the session object attached to this request. + + scopes - a list of scope strings that the request token is for. See + http://code.google.com/apis/gdata/faq.html#AuthScopes + + callback_url - a string that is the URL that Google should redirect the user + to after the user has authorized our application access to their data. + + This function only supports RSA-SHA1 authentication. Settings in the Django + settings module determine the consumer key and path to the RSA private key. + """ + logger.info("fetch_auth started; callback url='%s'", callback_url) + client = CalendarResourceClient(None, source=USER_AGENT) + + with open(settings.GOOGLE_OAUTH_PRIVATE_KEY_PATH, 'r') as f: + rsa_key = f.read() + logger.info("read RSA key; now getting request token") + + request_token = client.GetOAuthToken( + scopes, + callback_url, + settings.GOOGLE_OAUTH_CONSUMER_KEY, + rsa_private_key=rsa_key) + + logger.info("received token") + request.session[REQ_TOKEN_SESSION_KEY] = request_token + + auth_url = request_token.generate_authorization_url() + logger.info("generated auth url '%s'", str(auth_url)) + + return str(auth_url) + + +def get_access_token(request): + """ + This function should be called after Google has sent the user back to us + after the user authorized us. We retrieve the oauth token from the request + URL and then upgrade it to an access token. We then return the access token. 
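The two views above form a round trip: fetch_auth() stashes the request token in the session before redirecting to Google, and get_access_token() pulls it back out when Google redirects to us. With the gdata specifics stripped away, the session handling reduces to:

REQ_TOKEN_SESSION_KEY = 'gcalendar oauth request token'


def remember_request_token(session, request_token):
    session[REQ_TOKEN_SESSION_KEY] = request_token


def recall_request_token(session):
    """Return the saved request token, or None if the session lost it
    (the 'saved request token not found' branch above)."""
    return session.get(REQ_TOKEN_SESSION_KEY)


session = {}                       # a plain dict stands in for request.session
remember_request_token(session, 'request-token-blob')
assert recall_request_token(session) == 'request-token-blob'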
+ + """ + logger.info("get_access_token called as '%s'", request.get_full_path()) + + saved_token = request.session.get(REQ_TOKEN_SESSION_KEY) + if saved_token is None: + logger.error("saved request token not found in session!") + return None + + logger.info("extracting token...") + request_token = gdata.gauth.AuthorizeRequestToken(saved_token, + request.build_absolute_uri()) + + logger.info("upgrading to access token...") + + client = CalendarResourceClient(None, source=USER_AGENT) + access_token = client.GetAccessToken(request_token) + + logger.info("upgraded to access token...") + return access_token + + +def serialize_token(token): + """ + This function turns a token into a string and returns it. + + """ + return gdata.gauth.TokenToBlob(token) + + +def deserialize_token(s): + """ + This function turns a string into a token returns it. The string must have + previously been created with serialize_token(). + + """ + return gdata.gauth.TokenFromBlob(s) diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/static/css/gcalendar.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/static/css/gcalendar.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,7 @@ +.markItUp { + width: 600px; +} +.markItUpEditor { + width:543px; + height:200px; +} diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/static/js/gcalendar.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/static/js/gcalendar.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,33 @@ +$(document).ready(function() { + $('#id_start_date').datepicker({constrainInput: true, + dateFormat: 'mm/dd/yy', + onClose: function () { + var end = $('#id_end_date'); + if (this.value > end.val()) + { + end.val(this.value); + } + } + }); + $('#id_end_date').datepicker({constrainInput: true, + dateFormat: 'mm/dd/yy', + onClose: function () { + var start = $('#id_start_date'); + if (this.value < start.val()) + { + start.val(this.value); + } + } + }); + if ($('#id_all_day:checked').length) + { + $('#id_start_time').hide(); + $('#id_end_time').hide(); + $('#id_tz_stuff').hide(); + } + $('#id_all_day').click(function () { + $('#id_start_time').toggle(); + $('#id_end_time').toggle(); + $('#id_tz_stuff').toggle(); + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/static/js/gcalendar_edit.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/static/js/gcalendar_edit.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,24 @@ +$(document).ready(function() { + $('.gcal-del').click(function () { + if (confirm('Really delete this event?')) { + var id = this.id; + if (id.match(/gcal-(\d+)/)) { + $.ajax({ + url: '/calendar/delete/', + type: 'POST', + data: { id : RegExp.$1 }, + dataType: 'text', + success: function (id) { + var id = '#gcal-' + id; + $(id).parents('li').hide('normal'); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + } + } + return false; + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,14 @@ +""" +URLs for the gcalendar application. 
+""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('gcalendar.views', + url(r'^$', 'index', name='gcalendar-index'), + url(r'^add/$', 'add_event', name='gcalendar-add'), + url(r'^change/$', 'edit_events', name='gcalendar-edit_events'), + url(r'^change/(\d+)/$', 'edit_event', name='gcalendar-edit_event'), + url(r'^delete/$', 'delete_event', name='gcalendar-delete'), + url(r'^thanks/add/$', 'add_thanks', name='gcalendar-add_thanks'), + url(r'^thanks/change/$', 'edit_thanks', name='gcalendar-edit_thanks'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b gcalendar/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/gcalendar/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,122 @@ +""" +Views for the gcalendar application. +""" + +from django.contrib.auth.decorators import login_required +from django.core.urlresolvers import reverse +from django.http import HttpResponse +from django.http import HttpResponseBadRequest +from django.http import HttpResponseForbidden +from django.http import HttpResponseRedirect +from django.http import Http404 +from django.shortcuts import render_to_response +from django.shortcuts import get_object_or_404 +from django.template import RequestContext + +from gcalendar.forms import EventEntryForm +from gcalendar.models import Event + + +def index(request): + user = request.user + if user.is_authenticated(): + profile = user.get_profile() + tz = profile.time_zone + else: + tz = 'US/Pacific' + + return render_to_response('gcalendar/index.html', { + 'tz': tz, + }, + context_instance = RequestContext(request)) + + +@login_required +def add_event(request): + if request.method == 'POST': + form = EventEntryForm(request.POST) + if form.is_valid(): + event = form.save(commit=False) + event.user = request.user + event.repeat = 'none' + event.save() + return HttpResponseRedirect(reverse('gcalendar-add_thanks')) + else: + form = EventEntryForm() + + return render_to_response('gcalendar/event.html', { + 'title': 'Add Calendar Event', + 'form': form, + }, + context_instance = RequestContext(request)) + + +@login_required +def add_thanks(request): + return render_to_response('gcalendar/thanks_add.html', { + }, + context_instance = RequestContext(request)) + + +@login_required +def edit_events(request): + events = Event.objects.filter(user=request.user, status=Event.ON_CAL).order_by('start_date') + return render_to_response('gcalendar/edit.html', { + 'events': events, + }, + context_instance = RequestContext(request)) + + +@login_required +def edit_event(request, event_id): + event = get_object_or_404(Event, pk=event_id) + if event.user != request.user: + raise Http404 + + if request.method == 'POST': + form = EventEntryForm(request.POST, instance=event) + if form.is_valid(): + event = form.save(commit=False) + event.user = request.user + event.repeat = 'none' + event.status = Event.EDIT_REQ + event.save() + return HttpResponseRedirect(reverse('gcalendar-edit_thanks')) + else: + form = EventEntryForm(instance=event) + + return render_to_response('gcalendar/event.html', { + 'title': 'Change Calendar Event', + 'form': form, + }, + context_instance = RequestContext(request)) + + +@login_required +def edit_thanks(request): + return render_to_response('gcalendar/thanks_edit.html', { + }, + context_instance = RequestContext(request)) + + +def delete_event(request): + """This view marks an event for deletion. 
It is called via AJAX.""" + if request.user.is_authenticated(): + id = request.POST.get('id', None) + if id is None or not id.isdigit(): + return HttpResponseBadRequest() + try: + event = Event.objects.get(pk=id) + except Event.DoesNotExist: + return HttpResponseBadRequest() + if request.user != event.user: + return HttpResponseForbidden() + + event.status = Event.DEL_REQ + event.save() + return HttpResponse(id) + + return HttpResponseForbidden() + + +# vim: ts=4 sw=4 diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/__init__.py --- a/gpp/accounts/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,37 +0,0 @@ -import datetime -import logging - -from django.contrib.auth.models import User - - -def create_new_user(pending_user, ip=None, admin_activation=False): - """ - This function contains the code to create a new user from a - pending user. The pending user is deleted and the new user - is saved. A log message is produced. If admin_activation is false, - then ip should be the user's IP they confirmed from, if available. - - """ - new_user = User() - - new_user.username = pending_user.username - new_user.first_name = '' - new_user.last_name = '' - new_user.email = pending_user.email - new_user.password = pending_user.password # already been hashed - new_user.is_staff = False - new_user.is_active = True - new_user.is_superuser = False - new_user.last_login = datetime.datetime.now() - new_user.date_joined = new_user.last_login - - new_user.save() - pending_user.delete() - - if admin_activation: - msg = 'Accounts registration confirmed by ADMIN for %s' % new_user.username - else: - msg = 'Accounts registration confirmed by USER for %s from %s' % ( - new_user.username, ip) - - logging.info(msg) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/admin.py --- a/gpp/accounts/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,29 +0,0 @@ -"""This file contains the automatic admin site definitions for the accounts Models""" - -from django.contrib import admin -from accounts.models import IllegalUsername -from accounts.models import IllegalEmail -from accounts.models import PendingUser -from accounts import create_new_user - - -class PendingUserAdmin(admin.ModelAdmin): - list_display = ('username', 'email', 'date_joined') - actions = ('activate_account', ) - - def activate_account(self, request, qs): - """ - Activate the accounts of the selected pending users. 
- - """ - for pending_user in qs: - create_new_user(pending_user, admin_activation=True) - - self.message_user(request, "%s accounts activated" % qs.count()) - - activate_account.short_description = "Activate accounts for selected users" - - -admin.site.register(IllegalUsername) -admin.site.register(IllegalEmail) -admin.site.register(PendingUser, PendingUserAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/fixtures/accounts.json --- a/gpp/accounts/fixtures/accounts.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,30 +0,0 @@ -[ - { - "pk": 1, - "model": "accounts.illegalusername", - "fields": { - "username": "root" - } - }, - { - "pk": 2, - "model": "accounts.illegalusername", - "fields": { - "username": "sg101" - } - }, - { - "pk": 3, - "model": "accounts.illegalusername", - "fields": { - "username": "surfguitar101" - } - }, - { - "pk": 4, - "model": "accounts.illegalusername", - "fields": { - "username": "webmaster" - } - } -] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/forms.py --- a/gpp/accounts/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,152 +0,0 @@ -"""forms for the accounts application""" - -import logging - -from django import forms -from django.contrib.auth.models import User -from django.core.urlresolvers import reverse -from django.template.loader import render_to_string -from django.contrib.sites.models import Site -from django.conf import settings - -from core.functions import send_mail -from accounts.models import PendingUser -from accounts.models import IllegalUsername -from accounts.models import IllegalEmail -from antispam.rate_limit import block_ip - - -class RegisterForm(forms.Form): - """Form used to register with the website""" - username = forms.RegexField( - max_length=30, - regex=r'^\w+$', - error_messages={'invalid': ('Your username must be 30 characters or' - ' less and contain only letters, numbers and underscores.')}, - widget=forms.TextInput(attrs={'class': 'text'}), - ) - email = forms.EmailField(widget=forms.TextInput(attrs={'class': 'text'})) - password1 = forms.CharField(label="Password", - widget=forms.PasswordInput(attrs={'class': 'text'})) - password2 = forms.CharField(label="Password confirmation", - widget=forms.PasswordInput(attrs={'class': 'text'})) - agree_age = forms.BooleanField(required=True, - label='I certify that I am over the age of 13', - error_messages={ - 'required': 'Sorry, but you must be over the age of 13 to ' - 'register at our site.', - }) - agree_tos = forms.BooleanField(required=True, - label='I agree to the Terms of Service', - error_messages={ - 'required': 'You have not agreed to our Terms of Service.', - }) - agree_privacy = forms.BooleanField(required=True, - label='I agree to the Privacy Policy', - error_messages={ - 'required': 'You have not agreed to our Privacy Policy.', - }) - question1 = forms.CharField(label="What number appears in the site name?", - widget=forms.TextInput(attrs={'class': 'text'})) - question2 = forms.CharField(label='', required=False, - widget=forms.TextInput(attrs={'style': 'display: none;'})) - - def __init__(self, *args, **kwargs): - self.ip = kwargs.pop('ip', '?') - super(RegisterForm, self).__init__(*args, **kwargs) - - def clean_username(self): - username = self.cleaned_data['username'] - try: - User.objects.get(username=username) - except User.DoesNotExist: - try: - PendingUser.objects.get(username=username) - except PendingUser.DoesNotExist: - try: - 
IllegalUsername.objects.get(username=username) - except IllegalUsername.DoesNotExist: - return username - self._validation_error("That username is not allowed.", username) - self._validation_error("A pending user with that username already exists.", username) - self._validation_error("A user with that username already exists.", username) - - def clean_email(self): - email = self.cleaned_data['email'] - - if User.objects.filter(email=email).count(): - self._validation_error("A user with that email address already exists.", email) - elif PendingUser.objects.filter(email=email).count(): - self._validation_error("A pending user with that email address already exists.", email) - elif IllegalEmail.objects.filter(email=email).count(): - self._validation_error("That email address is not allowed.", email) - - # email is ok - return email - - def clean_password2(self): - password1 = self.cleaned_data.get("password1", "") - password2 = self.cleaned_data["password2"] - if password1 != password2: - self._validation_error("The two password fields didn't match.") - if len(password1) < 6: - self._validation_error("Please choose a password of 6 characters or more.") - return password2 - - def clean_question1(self): - answer = self.cleaned_data.get('question1') - success = False - if answer: - try: - val = int(answer) - except ValueError: - pass - else: - success = val == 101 - if not success: - self._validation_error("Incorrect answer to our anti-spam question.", answer) - return answer - - def clean_question2(self): - """ - Honeypot field should be empty. - """ - answer = self.cleaned_data.get('question2') - if answer: - block_ip(self.ip) - self._validation_error('Wrong answer #2: %s' % answer) - return answer - - def save(self): - pending_user = PendingUser.objects.create_pending_user(self.cleaned_data['username'], - self.cleaned_data['email'], - self.cleaned_data['password1']) - - # Send the confirmation email - - site = Site.objects.get_current() - admin_email = settings.ADMINS[0][1] - - activation_link = 'http://%s%s' % (site.domain, reverse('accounts.views.register_confirm', - kwargs = {'username' : pending_user.username, 'key' : pending_user.key})) - - msg = render_to_string('accounts/registration_email.txt', - { - 'site_name' : site.name, - 'site_domain' : site.domain, - 'user_email' : pending_user.email, - 'activation_link' : activation_link, - 'username' : pending_user.username, - 'admin_email' : admin_email, - }) - - subject = 'Registration Confirmation for ' + site.name - send_mail(subject, msg, admin_email, [self.cleaned_data['email']]) - logging.info('Accounts/registration conf. email sent to %s for user %s; IP = %s', - self.cleaned_data['email'], pending_user.username, self.ip) - - return pending_user - - def _validation_error(self, msg, param=None): - logging.error('Accounts/registration [%s]: %s (%s)', self.ip, msg, param) - raise forms.ValidationError(msg) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/management/commands/rate_limit_clear.py --- a/gpp/accounts/management/commands/rate_limit_clear.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,54 +0,0 @@ -""" -The rate_limit_clear command is used to clear IP addresses out from our rate -limit protection database. 
- -""" -from optparse import make_option -import re - -from django.core.management.base import BaseCommand -import redis - -from core.services import get_redis_connection - - -IP_RE = re.compile(r'^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$') - - -class Command(BaseCommand): - help = """Remove IP addresses from the rate limit protection datastore.""" - option_list = list(BaseCommand.option_list) + [ - make_option("--purge", action="store_true", - help="Purge all IP addresses"), - ] - - def handle(self, *args, **kwargs): - try: - con = get_redis_connection() - - # get all rate-limit keys - keys = con.keys('rate-limit-*') - - # if purging, delete them all... - if kwargs['purge']: - if keys: - con.delete(*keys) - return - - # otherwise delete the ones the user asked for - ips = [] - for ip in args: - if IP_RE.match(ip): - key = 'rate-limit-%s' % ip - if key in keys: - ips.append(key) - else: - self.stdout.write('%s not found\n' % ip) - else: - self.stderr.write('invalid IP address %s\n' % ip) - - if ips: - con.delete(*ips) - - except redis.RedisError, e: - self.stderr.write('%s\n' % e) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/models.py --- a/gpp/accounts/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,90 +0,0 @@ -"""Contains models for the accounts application""" - -import datetime -import random -import string -import hashlib -import base64 - -from django.db import models -from django.contrib.auth.models import User -from django.conf import settings - - -class IllegalUsername(models.Model): - """model to represent the list of illegal usernames""" - username = models.CharField(max_length=30, db_index=True) - - def __unicode__(self): - return self.username - - class Meta: - ordering = ('username', ) - - -class IllegalEmail(models.Model): - """model to represent the list of illegal/restricted email addresses""" - email = models.EmailField(db_index=True) - - def __unicode__(self): - return self.email - - class Meta: - ordering = ('email', ) - - -class PendingUserManager(models.Manager): - """user manager for PendingUser model""" - - create_count = 0 - - def create_pending_user(self, username, email, password): - '''creates a new pending user and saves it to the database''' - - temp_user = User() - temp_user.set_password(password) - - now = datetime.datetime.now() - pending_user = self.model(None, - username, - email, - temp_user.password, - now, - self._make_key()) - - pending_user.save() - self.create_count += 1 - return pending_user - - - def purge_expired(self): - expire_time = datetime.datetime.now() - datetime.timedelta(days=1) - expired_pending_users = self.filter(date_joined__lt=expire_time) - expired_pending_users.delete() - - - def _make_key(self): - s = ''.join(random.sample(string.printable, 8)) - delta = datetime.date.today() - datetime.date(1846, 12, 28) - days = base64.urlsafe_b64encode(str(delta * 10)) - key = hashlib.sha1(settings.SECRET_KEY + - unicode(self.create_count) + - unicode(s) + - unicode(days)).hexdigest()[::2] - return key - - -class PendingUser(models.Model): - """model for holding users while they go through the email registration cycle""" - - username = models.CharField(max_length=30, db_index=True) - email = models.EmailField() - password = models.CharField(max_length=128) - date_joined = models.DateTimeField(default=datetime.datetime.now, db_index=True) - key = models.CharField(max_length=20, editable=True) - - objects = PendingUserManager() - - def __unicode__(self): - return self.username - diff -r c525f3e0b5d0 -r 
ee87ea74d46b gpp/accounts/static/js/ajax_login.js --- a/gpp/accounts/static/js/ajax_login.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,68 +0,0 @@ -$(function() { - var loginError = $('#login-error'); - var userBox = $('#ajax-login-username'); - var passBox = $('#ajax-login-password'); - var loginDialog = $('#login-dialog').dialog({ - autoOpen: false, - height: 375, - width: 380, - modal: true, - buttons: { - "Login": function() { - loginError.text('').hide(); - $.ajax({ - url: '/accounts/login/ajax/', - type: 'POST', - data: { - username: userBox.val(), - password: passBox.val(), - csrfmiddlewaretoken: csrf_token - }, - dataType: 'json', - success: function(data, textStatus) { - if (data.success) { - loginDialog.dialog("close"); - if (window.location.pathname == "/accounts/logout/") { - window.location.replace("/"); - } - else { - $('#header-nav').html(data.navbar_html); - } - } - else { - loginError.text(data.error).show(); - userBox.val(''); - passBox.val(''); - userBox.focus(); - } - }, - error: function (xhr, textStatus, ex) { - if (xhr.status == 403) { - loginDialog.dialog("close"); - alert("Oops, we are detecting some strange behavior and are blocking this action. If you feel this is an error, please feel free to contact us. Thank you."); - window.location.href = "/"; - } - else { - loginError.text('Oops, an error occurred. If this problem persists, please contact us.').show(); - } - } - }); - }, - "Cancel": function() { - loginDialog.dialog("close"); - } - }, - focus: function() { - $(':input', this).keyup(function(event) { - if (event.keyCode == 13) { - $('.ui-dialog-buttonpane button:first').click(); - } - }); - } - }); - $('#login-link').click(function() { - loginError.text('').hide(); - loginDialog.dialog("open"); - return false; - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/stats.py --- a/gpp/accounts/stats.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,97 +0,0 @@ -""" -This module performs user account related statistics. - -""" -import logging - -from django.db.models.signals import post_save -from django.contrib.auth.models import User - -from core.services import get_redis_connection - - -# Redis key names -USER_COUNT_KEY = "accounts:user_count" -NEW_USERS_KEY = "accounts:new_users" - - -logger = logging.getLogger(__name__) - - -def on_user_save(sender, **kwargs): - """ - This function is our signal handler for when a User object is saved. - - """ - from accounts.tasks import user_stats_task - - if kwargs['created']: - user = kwargs['instance'] - - # kick off a task to update user stats - - user_stats_task.delay(user.id) - - -def update_user_stats(user_id): - """ - This function is given a new user id and is responsible for updating various - user account statistics. - - """ - try: - user = User.objects.get(pk=user_id) - except User.DoesNotExist: - logger.warning("update_user_stats: user id %d doesn't exist", user_id) - return - - redis = get_redis_connection() - - # update the count of registered users - - count = redis.incr(USER_COUNT_KEY) - if count == 1: - # it is likely redis got wiped out; update it now - count = User.objects.all().count() - redis.set(USER_COUNT_KEY, count) - - # update the list of new users - - pipeline = redis.pipeline() - pipeline.lpush(NEW_USERS_KEY, user.username) - pipeline.ltrim(NEW_USERS_KEY, 0, 9) - pipeline.execute() - - -def get_user_count(redis=None): - """ - This function returns the current count of users. 
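update_user_stats() above keeps two Redis structures: a plain counter of registered users (backfilled from the ORM if Redis was wiped) and a capped list of the newest usernames. A stand-alone sketch of that idiom with redis-py; the key names mirror the module above and a local Redis server is assumed:

import redis

r = redis.Redis()

def record_new_user(username):
    # Bump the total-user counter.
    r.incr('accounts:user_count')
    # Newest usernames go to the front; the trim keeps only the 10 most recent.
    pipe = r.pipeline()
    pipe.lpush('accounts:new_users', username)
    pipe.ltrim('accounts:new_users', 0, 9)
    pipe.execute()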
- - """ - if redis is None: - redis = get_redis_connection() - return redis.get(USER_COUNT_KEY) - - -def get_new_users(redis=None): - """ - This function returns a list of new usernames. - - """ - if redis is None: - redis = get_redis_connection() - return redis.lrange(NEW_USERS_KEY, 0, -1) - - -def get_user_stats(redis=None): - """ - This function returns a tuple of the user stats. Element 0 is the user count - and element 1 is the list of new users. - - """ - if redis is None: - redis = get_redis_connection() - return get_user_count(redis), get_new_users(redis) - - -post_save.connect(on_user_save, sender=User, dispatch_uid='accounts.stats') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/tasks.py --- a/gpp/accounts/tasks.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,16 +0,0 @@ -""" -Celery tasks for the accounts application. - -""" -from celery.task import task - -from accounts.stats import update_user_stats - - -@task -def user_stats_task(user_id): - """ - Run the update_user_stats() function on a new task. - - """ - update_user_stats(user_id) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/templatetags/accounts_tags.py --- a/gpp/accounts/templatetags/accounts_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,20 +0,0 @@ -""" -Template tags for the accounts applications. - -""" -from django import template - -from accounts.stats import get_user_stats - - -register = template.Library() - - -@register.inclusion_tag('accounts/user_stats_tag.html') -def user_stats(): - """ - This tag renders the total number of site users and a list of new users. - - """ - num_users, new_users = get_user_stats() - return {'num_users': num_users, 'new_users': new_users} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/tests/__init__.py --- a/gpp/accounts/tests/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/tests/view_tests.py --- a/gpp/accounts/tests/view_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,254 +0,0 @@ -""" -View tests for the accounts application. - -""" -import datetime - -from django.test import TestCase -from django.core.urlresolvers import reverse -from django.contrib.auth.models import User -from django.contrib.auth.hashers import check_password - -from antispam.rate_limit import unblock_ip -from accounts.models import PendingUser -from accounts.models import IllegalUsername -from accounts.models import IllegalEmail - - -class RegistrationTest(TestCase): - - def setUp(self): - u = User.objects.create_user('existing_user', 'existing_user@example.com', 'pw') - u.save() - - # a 2nd user has the same email as another - u = User.objects.create_user('existing_user2', 'existing_user@example.com', 'pw') - u.save() - - PendingUser.objects.create(username='pending_user', - email='pending_user@example.com', - password='pw', - date_joined=datetime.datetime.now(), - key='key') - - IllegalUsername.objects.create(username='illegalusername') - IllegalEmail.objects.create(email='illegal@example.com') - - def tearDown(self): - unblock_ip('127.0.0.1') - - def test_get_view(self): - """ - Test a simple get of the registration view - - """ - response = self.client.get(reverse('accounts-register')) - self.assertEqual(response.status_code, 200) - - def test_existing_user(self): - """ - Ensure we can't register with an existing username. 
- - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'existing_user', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, 'A user with that username already exists') - - def test_pending_user(self): - """ - Ensure we can't register with a pending username. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'pending_user', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, 'A pending user with that username already exists') - - def test_illegal_username(self): - """ - Ensure we can't register with a banned username. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'illegalusername', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, 'That username is not allowed') - - def test_duplicate_existing_email(self): - """ - Ensure we can't register with a duplicate email address. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'existing_user@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, 'A user with that email address already exists') - - def test_duplicate_pending_email(self): - """ - Ensure we can't register with a duplicate email address. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'pending_user@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, 'A pending user with that email address already exists') - - def test_illegal_email(self): - """ - Ensure we can't register with a banned email address. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'illegal@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, 'That email address is not allowed') - - def test_password_match(self): - """ - Ensure the passwords match. 
- - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password_doesnt match', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, "The two password fields didn't match") - - def test_question1(self): - """ - Ensure our anti-spam question is answered. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password_doesnt match', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': 'huh', - 'question2': '', - }) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, "Incorrect answer to our anti-spam question") - - def test_question2(self): - """ - Ensure our honeypot question check works. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password_doesnt match', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': 'non blank', - }) - - self.assertEqual(response.status_code, 403) - - def test_success(self): - """ - Ensure we can successfully register. - - """ - response = self.client.post(reverse('accounts-register'), { - 'username': 'a_new_user', - 'email': 'test@example.com', - 'password1': 'my_password', - 'password2': 'my_password', - 'agree_age': 'on', - 'agree_tos': 'on', - 'agree_privacy': 'on', - 'question1': '101', - 'question2': '', - }) - - self.assertEqual(response.status_code, 302) - - try: - pending = PendingUser.objects.get(username='a_new_user') - except PendingUser.DoesNotExist: - self.fail("PendingUser was not created") - - self.assertEqual(pending.email, 'test@example.com') - self.assertTrue(datetime.datetime.now() - pending.date_joined < - datetime.timedelta(minutes=1)) - self.assertTrue(check_password('my_password', pending.password)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/urls.py --- a/gpp/accounts/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,47 +0,0 @@ -"""urls for the accounts application""" -from django.conf.urls import patterns, url -from django.conf import settings - -urlpatterns = patterns('accounts.views', - url(r'^login/ajax/$', 'login_ajax', name='accounts-login_ajax'), - url(r'^register/$', 'register', name='accounts-register'), - (r'^register/thanks/$', 'register_thanks'), - (r'^register/confirm/(?P[\w.@+-]{1,30})/(?P[a-zA-Z0-9]{20})/$', 'register_confirm'), -) - -urlpatterns += patterns('', - url(r'^login/$', - 'django.contrib.auth.views.login', - kwargs={'template_name': 'accounts/login.html'}, - name='accounts-login'), - url(r'^logout/$', - 'django.contrib.auth.views.logout', - kwargs={'template_name': 'accounts/logout.html'}, - name='accounts-logout'), - (r'^password/$', - 'django.contrib.auth.views.password_change', - {'template_name': 'accounts/password_change.html', - 'post_change_redirect': settings.LOGIN_REDIRECT_URL}), - url(r'^password/reset/$', - 'django.contrib.auth.views.password_reset', - kwargs={'template_name': 'accounts/password_reset.html', - 'email_template_name': 'accounts/password_reset_email.txt', - 'post_reset_redirect': '/accounts/password/reset/sent/'}, - 
name='accounts-password_reset'), - url(r'^password/reset/sent/$', - 'django.contrib.auth.views.password_reset_done', - kwargs={'template_name': 'accounts/password_reset_sent.html'}, - name='accounts-password_reset_sent'), - url(r'^password/reset/confirm/(?P<uidb36>[0-9a-z]+)/(?P<token>[0-9a-z]+-\w+)/$', - 'django.contrib.auth.views.password_reset_confirm', - kwargs={ - 'template_name': 'accounts/password_reset_confirm.html', - 'post_reset_redirect': '/accounts/password/reset/success/', - }, - name='accounts-password_reset_confirm'), - url(r'^password/reset/success/$', - 'django.contrib.auth.views.password_reset_complete', - kwargs={'template_name': 'accounts/password_reset_complete.html'}, - name='accounts-password_reset_success'), -) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/accounts/views.py --- a/gpp/accounts/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,117 +0,0 @@ -""" -Views for the accounts application. - -""" -import datetime -import logging - -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.template.loader import render_to_string -from django.contrib.auth.models import User -from django.http import HttpResponse, HttpResponseRedirect -from django.core.urlresolvers import reverse -from django.conf import settings -from django.contrib.auth.forms import AuthenticationForm -from django.contrib.auth import login -from django.utils import simplejson - -from accounts.models import PendingUser -from accounts.forms import RegisterForm -from accounts import create_new_user -from antispam.decorators import rate_limit - - -####################################################################### - -@rate_limit(count=10, interval=datetime.timedelta(minutes=1)) -def register(request): - if request.user.is_authenticated(): - return HttpResponseRedirect(settings.LOGIN_REDIRECT_URL) - - if request.method == 'POST': - form = RegisterForm(request.POST, ip=request.META.get('REMOTE_ADDR', '?')) - if form.is_valid(): - form.save() - return HttpResponseRedirect(reverse('accounts.views.register_thanks')) - else: - form = RegisterForm() - - return render_to_response('accounts/register.html', { - 'form': form, - }, - context_instance = RequestContext(request)) - -####################################################################### - -def register_thanks(request): - if request.user.is_authenticated(): - return HttpResponseRedirect(settings.LOGIN_REDIRECT_URL) - - return render_to_response('accounts/register_thanks.html', - context_instance = RequestContext(request)) - -####################################################################### - -def register_confirm(request, username, key): - if request.user.is_authenticated(): - return HttpResponseRedirect(settings.LOGIN_REDIRECT_URL) - - # purge expired users - - PendingUser.objects.purge_expired() - - ip = request.META.get('REMOTE_ADDR', '?') - try: - pending_user = PendingUser.objects.get(username = username) - except PendingUser.DoesNotExist: - logging.error('Accounts register_confirm [%s]: user does not exist: %s', ip, username) - return render_to_response('accounts/register_failure.html', { - 'username': username, - }, - context_instance = RequestContext(request)) - - if pending_user.key != key: - logging.error('Accounts register_confirm [%s]: key error: %s', ip, username) - return render_to_response('accounts/register_failure.html', { - 'username': username, - }, - context_instance = RequestContext(request)) - - create_new_user(pending_user, ip) - - return 
render_to_response('accounts/register_success.html', { - 'username': username, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@rate_limit(count=10, interval=datetime.timedelta(minutes=1), - lockout=datetime.timedelta(minutes=2)) -def login_ajax(request): - """ - This view function handles a login via AJAX. - - """ - if not request.is_ajax(): - return HttpResponseRedirect(reverse('accounts-login')) - - response = { - 'success': False, - 'error': '', - 'navbar_html': '' - } - - if request.method == "POST": - form = AuthenticationForm(data=request.POST) - if form.is_valid(): - login(request, form.get_user()) - response['success'] = True - response['navbar_html'] = render_to_string('navbar.html', - {'user': request.user}, RequestContext(request)) - else: - response['error'] = 'Invalid username or password' - - return HttpResponse(simplejson.dumps(response), - content_type='application/json') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/__init__.py --- a/gpp/antispam/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,13 +0,0 @@ -import datetime - -from django.contrib.auth import views as auth_views - -from antispam.decorators import rate_limit - -SPAM_PHRASE_KEY = "antispam.spam_phrases" -BUSTED_MESSAGE = ("Your post has tripped our spam filter. Your account has " - "been suspended pending a review of your post. If this was a mistake " - "then we apologize; your account will be restored shortly.") - -# Install rate limiting on auth login -auth_views.login = rate_limit(lockout=datetime.timedelta(minutes=2))(auth_views.login) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/admin.py --- a/gpp/antispam/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,12 +0,0 @@ -"""Admin definitions for the antispam application.""" - -from django.contrib import admin - -from antispam.models import SpamPhrase - - -class SpamPhraseAdmin(admin.ModelAdmin): - search_fields = ('phrase', ) - - -admin.site.register(SpamPhrase, SpamPhraseAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/decorators.py --- a/gpp/antispam/decorators.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,59 +0,0 @@ -""" -This module contains decorators for the antispam application. - -""" -from datetime import timedelta -from functools import wraps - -from django.shortcuts import render -from django.utils import simplejson - -from antispam.rate_limit import RateLimiter, RateLimiterUnavailable - - -def rate_limit(count=10, interval=timedelta(minutes=1), - lockout=timedelta(hours=8)): - - def decorator(fn): - - @wraps(fn) - def wrapped(request, *args, **kwargs): - - ip = request.META.get('REMOTE_ADDR') - try: - rate_limiter = RateLimiter(ip, count, interval, lockout) - if rate_limiter.is_blocked(): - return render(request, 'antispam/blocked.html', status=403) - - except RateLimiterUnavailable: - # just call the function and return the result - return fn(request, *args, **kwargs) - - response = fn(request, *args, **kwargs) - - if request.method == 'POST': - - # Figure out if the view succeeded; if it is a non-ajax view, - # then success means a redirect is about to occur. If it is - # an ajax view, we have to decode the json response. 
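For reference, a hedged sketch of applying the rate_limit decorator above to a hypothetical view; the view name and thresholds are illustrative:

import datetime

from django.http import HttpResponse, HttpResponseRedirect

from antispam.decorators import rate_limit


@rate_limit(count=5, interval=datetime.timedelta(minutes=1),
            lockout=datetime.timedelta(hours=1))
def contact(request):
    # For a non-ajax view the decorator treats a 302 redirect as success;
    # any other response to a POST increments the per-IP failure counter.
    if request.method == 'POST':
        return HttpResponseRedirect('/')
    return HttpResponse('contact form goes here')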
- success = False - if not request.is_ajax(): - success = (response and response.has_header('location') and - response.status_code == 302) - elif response: - json_resp = simplejson.loads(response.content) - success = json_resp['success'] - - if not success: - try: - blocked = rate_limiter.incr() - except RateLimiterUnavailable: - blocked = False - - if blocked: - return render(request, 'antispam/blocked.html', status=403) - - return response - - return wrapped - return decorator diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/models.py --- a/gpp/antispam/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,23 +0,0 @@ -"""Models for the antispam application.""" -from django.db import models -from django.core.cache import cache - -from antispam import SPAM_PHRASE_KEY - - -class SpamPhrase(models.Model): - """A SpamPhrase is a string that is checked for in user input. User input - containing a SpamPhrase should be blocked and flagged. - """ - phrase = models.CharField(max_length=64) - - class Meta: - ordering = ('phrase', ) - - def __unicode__(self): - return self.phrase - - def save(self, *args, **kwargs): - cache.delete(SPAM_PHRASE_KEY) - self.phrase = self.phrase.lower() - super(SpamPhrase, self).save(*args, **kwargs) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/rate_limit.py --- a/gpp/antispam/rate_limit.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,152 +0,0 @@ -""" -This module contains the rate limiting functionality. - -""" -import datetime -import logging - -import redis - -from core.services import get_redis_connection - - -logger = logging.getLogger(__name__) - - -# This exception is thrown upon any Redis error. This insulates client code from -# knowing that we are using Redis and will allow us to use something else in the -# future. -class RateLimiterUnavailable(Exception): - pass - - -def _make_key(ip): - """ - Creates and returns a key string from a given IP address. - - """ - return 'rate-limit-' + ip - - -def _get_connection(): - """ - Create and return a Redis connection. Returns None on failure. - """ - try: - conn = get_redis_connection() - except redis.RedisError, e: - logger.error("rate limit: %s" % e) - raise RateLimiterUnavailable - - return conn - - -def _to_seconds(interval): - """ - Converts the timedelta interval object into a count of seconds. - - """ - return interval.days * 24 * 3600 + interval.seconds - - -def block_ip(ip, count=1000000, interval=datetime.timedelta(weeks=2)): - """ - This function jams the rate limit record for the given IP so that the IP is - blocked for the given interval. If the record doesn't exist, it is created. - This is useful for manually blocking an IP after detecting suspicious - behavior. - This function may throw RateLimiterUnavailable. - - """ - key = _make_key(ip) - conn = _get_connection() - - try: - conn.setex(key, time=_to_seconds(interval), value=count) - except redis.RedisError, e: - logger.error("rate limit (block_ip): %s" % e) - raise RateLimiterUnavailable - - logger.info("Rate limiter blocked IP %s; %d / %s", ip, count, interval) - - -def unblock_ip(ip): - """ - This function removes the block for the given IP address. 
- - """ - key = _make_key(ip) - conn = _get_connection() - try: - conn.delete(key) - except redis.RedisError, e: - logger.error("rate limit (unblock_ip): %s" % e) - raise RateLimiterUnavailable - - logger.info("Rate limiter unblocked IP %s", ip) - - -class RateLimiter(object): - """ - This class encapsulates the rate limiting logic for a given IP address. - - """ - def __init__(self, ip, set_point, interval, lockout): - self.ip = ip - self.set_point = set_point - self.interval = interval - self.lockout = lockout - self.key = _make_key(ip) - self.conn = _get_connection() - - def is_blocked(self): - """ - Return True if the IP is blocked, and false otherwise. - - """ - try: - val = self.conn.get(self.key) - except redis.RedisError, e: - logger.error("RateLimiter (is_blocked): %s" % e) - raise RateLimiterUnavailable - - try: - val = int(val) if val else 0 - except ValueError: - return False - - blocked = val >= self.set_point - if blocked: - logger.info("Rate limiter blocking %s", self.ip) - - return blocked - - def incr(self): - """ - One is added to a counter associated with the IP address. If the - counter exceeds set_point per interval, True is returned, and False - otherwise. If the set_point is exceeded for the first time, the counter - associated with the IP is set to expire according to the lockout - parameter. - - """ - try: - val = self.conn.incr(self.key) - - # Set expire time, if necessary. - # If this is the first time, set it according to interval. - # If the set_point has just been exceeded, set it according to lockout. - if val == 1: - self.conn.expire(self.key, _to_seconds(self.interval)) - elif val == self.set_point: - self.conn.expire(self.key, _to_seconds(self.lockout)) - - tripped = val >= self.set_point - - if tripped: - logger.info("Rate limiter tripped for %s; counter = %d", self.ip, val) - return tripped - - except redis.RedisError, e: - logger.error("RateLimiter (incr): %s" % e) - raise RateLimiterUnavailable diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/tests/__init__.py --- a/gpp/antispam/tests/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,2 +0,0 @@ -from rate_limit_tests import * -from utils_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/tests/rate_limit_tests.py --- a/gpp/antispam/tests/rate_limit_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,77 +0,0 @@ -""" -Tests for the rate limiting function in the antispam application. 
- -""" -from django.test import TestCase -from django.core.urlresolvers import reverse - -from antispam.rate_limit import _make_key -from core.services import get_redis_connection - - -class RateLimitTestCase(TestCase): - KEY = _make_key('127.0.0.1') - - def setUp(self): - self.conn = get_redis_connection() - self.conn.delete(self.KEY) - - def tearDown(self): - self.conn.delete(self.KEY) - - def testRegistrationLockout(self): - - for i in range(1, 11): - response = self.client.post( - reverse('accounts-register'), - {}, - follow=True) - - if i < 10: - self.assertEqual(response.status_code, 200) - self.assertTemplateUsed(response, 'accounts/register.html') - elif i >= 10: - self.assertEqual(response.status_code, 403) - self.assertTemplateUsed(response, 'antispam/blocked.html') - - def testLoginLockout(self): - - for i in range(1, 11): - response = self.client.post( - reverse('accounts-login'), - {}, - follow=True) - - if i < 10: - self.assertEqual(response.status_code, 200) - self.assertTemplateUsed(response, 'accounts/login.html') - elif i >= 10: - self.assertEqual(response.status_code, 403) - self.assertTemplateUsed(response, 'antispam/blocked.html') - - def testHoneypotLockout(self): - - response = self.client.post( - reverse('accounts-register'), { - 'username': u'test_user', - 'email': u'test_user@example.com', - 'password1': u'password', - 'password2': u'password', - 'agree_age': u'on', - 'agree_tos': u'on', - 'agree_privacy': u'on', - 'question1': u'101', - 'question2': u'DsjkdE$', - }, - follow=True) - - val = self.conn.get(self.KEY) - self.assertEqual(val, '1000001') - - response = self.client.post( - reverse('accounts-login'), - {}, - follow=True) - - self.assertEqual(response.status_code, 403) - self.assertTemplateUsed(response, 'antispam/blocked.html') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/antispam/tests/utils_tests.py --- a/gpp/antispam/tests/utils_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,37 +0,0 @@ -""" -Tests for the antispam application. -""" -from django.test import TestCase -from django.core.cache import cache - -from antispam import SPAM_PHRASE_KEY -from antispam.models import SpamPhrase -from antispam.utils import contains_spam - - -class AntispamCase(TestCase): - - def test_no_phrases(self): - """ - Tests that an empty spam phrase table works. - """ - cache.delete(SPAM_PHRASE_KEY) - self.assertFalse(contains_spam("Here is some random text.")) - - def test_phrases(self): - """ - Simple test of some phrases. - """ - SpamPhrase.objects.create(phrase="grytner") - SpamPhrase.objects.create(phrase="allday.ru") - SpamPhrase.objects.create(phrase="stefa.pl") - - self.assert_(contains_spam("grytner")) - self.assert_(contains_spam("11grytner")) - self.assert_(contains_spam("11grytner>")) - self.assert_(contains_spam("1djkl jsd stefa.pl")) - self.assert_(contains_spam("1djkl jsd ' % (obj.image.url, obj.description) - image_tag.allow_tags = True - -admin.site.register(Campaign, CampaignAdmin) -admin.site.register(Banner, BannerAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/banners/models.py --- a/gpp/banners/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,60 +0,0 @@ -""" -Models for the banners application. - -""" -import datetime - -from django.db import models - - -class Campaign(models.Model): - """ - A model to represent an ad or banner campaign. 
- - """ - name = models.CharField(max_length=128) - slug = models.SlugField() - creation_date = models.DateTimeField(blank=True) - - def __unicode__(self): - return self.name - - class Meta: - ordering = ['name'] - - def save(self, *args, **kwargs): - if not self.pk and not self.creation_date: - self.creation_date = datetime.datetime.now() - - super(Campaign, self).save(*args, **kwargs) - - -def banner_upload_to(instance, filename): - """ - An "upload_to" function for the Banner model. - - """ - return "banners/%s/%s" % (instance.campaign.slug, filename) - - -class Banner(models.Model): - """ - A model to represent a banner. - - """ - campaign = models.ForeignKey(Campaign) - image = models.ImageField(upload_to=banner_upload_to) - description = models.CharField(max_length=128) - creation_date = models.DateTimeField(blank=True) - - def __unicode__(self): - return self.description - - class Meta: - ordering = ['-creation_date'] - - def save(self, *args, **kwargs): - if not self.pk and not self.creation_date: - self.creation_date = datetime.datetime.now() - - super(Banner, self).save(*args, **kwargs) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/banners/templatetags/banner_tags.py --- a/gpp/banners/templatetags/banner_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,60 +0,0 @@ -""" -Template tags for the banners application. - -""" -import logging - -from django import template -import redis - -from core.services import get_redis_connection -from banners.models import Banner - - -register = template.Library() -logger = logging.getLogger(__name__) - -BANNER_URL_KEY = 'banners:url:%s' - - -@register.simple_tag -def banner_url(slug): - """ - Returns the URL for the next banner in the campaign whose slug is 'slug'. - - For each campaign, a list of banner URLs are kept in Redis. Each time this - tag is called, the front banner is popped off the list. When the list is - empty, we refresh the list from the database. In this way the banners for a - campaign are cycled through. - """ - key = BANNER_URL_KEY % slug - - try: - conn = get_redis_connection() - url = conn.lpop(key) - except redis.RedisError, e: - logger.error("banner_url: '%s' on lpop", e) - return u'' - - if url: - return url - - # list not found or empty, rebuild it from the database - - qs = Banner.objects.filter(campaign__slug=slug) - urls = [banner.image.url for banner in qs] - if not urls: - logger.warning("banner_url: no banners for campaign '%s'", slug) - return u'' - - url = urls[0] - urls = urls[1:] - - if urls: - try: - conn.rpush(key, *urls) - except redis.RedisError, e: - logger.error("banner_url: '%s' on rpush", e) - pass - - return url diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/__init__.py --- a/gpp/bio/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -import signals diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/admin.py --- a/gpp/bio/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,151 +0,0 @@ -""" -This file contains the admin definitions for the bio application. 
-""" -import datetime - -from django.contrib import admin - -import django.contrib.auth.models -import django.contrib.auth.admin - -import bio.models -import bio.badges -from antispam.utils import deactivate_spammer - - -class BadgeOwnerInline(admin.TabularInline): - model = bio.models.BadgeOwnership - extra = 1 - - -class UserProfileAdmin(admin.ModelAdmin): - search_fields = ('user__username', 'user__first_name', 'user__last_name', - 'user__email') - exclude = ('profile_html', 'signature_html') - list_display = ('__unicode__', 'user_is_active', 'get_status_display', 'status_date') - readonly_fields = ('status', 'status_date', 'update_date') - list_filter = ('status', ) - date_hierarchy = 'status_date' - inlines = (BadgeOwnerInline, ) - actions = ( - 'mark_active', - 'mark_resigned', - 'mark_removed', - 'mark_suspended', - 'mark_spammer', - 'mark_stranger', - ) - - def get_status_display(self, obj): - return obj.get_status_display() - get_status_display.short_description = 'Status' - - def mark_user_status(self, request, qs, status): - """ - Common code for the admin actions. Updates the status field in the - profiles to 'status'. Updates the status_date. Sets the is_active - field to True if the status is STA_ACTIVE and False otherwise. - """ - now = datetime.datetime.now() - for profile in qs: - profile.user.is_active = (status == bio.models.STA_ACTIVE or - status == bio.models.STA_STRANGER) - profile.user.save() - profile.status = status - profile.status_date = now - profile.save(content_update=False) - - count = len(qs) - msg = "1 user" if count == 1 else "%d users" % count - self.message_user(request, "%s successfully marked as %s." % (msg, - bio.models.USER_STATUS_CHOICES[status][1])) - - def mark_active(self, request, qs): - """ - Marks users as active. Updates their profile status to STA_ACTIVE. - """ - self.mark_user_status(request, qs, bio.models.STA_ACTIVE) - mark_active.short_description = "Mark selected users as active" - - def mark_resigned(self, request, qs): - """ - Marks users as inactive. Updates their profile status to STA_RESIGNED. - """ - self.mark_user_status(request, qs, bio.models.STA_RESIGNED) - mark_resigned.short_description = "Mark selected users as resigned" - - def mark_removed(self, request, qs): - """ - Marks users as inactive. Updates their profile status to STA_REMOVED. - """ - self.mark_user_status(request, qs, bio.models.STA_REMOVED) - mark_removed.short_description = "Mark selected users as removed" - - def mark_suspended(self, request, qs): - """ - Marks users as inactive. Updates their profile status to STA_SUSPENDED. - """ - self.mark_user_status(request, qs, bio.models.STA_SUSPENDED) - mark_suspended.short_description = "Mark selected users as suspended" - - def mark_spammer(self, request, qs): - """ - Calls deactivate_spammer() on each user in the profile queryset. - - """ - count = qs.count() - for profile in qs: - deactivate_spammer(profile.user) - - self.message_user(request, - "%s profile(s) successfully marked as spammers." % count) - - mark_spammer.short_description = "Mark selected users as spammers" - - def mark_stranger(self, request, qs): - """ - Marks users as strangers. Updates their profile status to STA_STRANGER. 
- """ - self.mark_user_status(request, qs, bio.models.STA_STRANGER) - mark_stranger.short_description = "Mark selected users as strangers" - - -class UserProfileFlagAdmin(admin.ModelAdmin): - list_display = ['__unicode__', 'flag_date', 'get_profile_url'] - actions = ['accept_flags'] - raw_id_fields = ['user', 'profile'] - - def accept_flags(self, request, qs): - """ - This action awards a security pin to the user that reported the - profile, deletes the flags, then deactivates the spammers. - """ - count = qs.count() - for flag in qs: - deactivate_spammer(flag.profile.user) - bio.badges.award_badge(bio.badges.SECURITY_PIN, flag.user) - flag.delete() - - self.message_user(request, - "%s profile(s) successfully marked as spammers." % count) - - accept_flags.short_description = "Mark selected profiles as spammers" - - -class BadgeAdmin(admin.ModelAdmin): - list_display = ('name', 'html', 'order', 'numeric_id', 'description') - list_editable = ('order', 'numeric_id') - - -# We like the User admin but would like a date hierarcy on date_joined. -class UserAdmin(django.contrib.auth.admin.UserAdmin): - date_hierarchy = 'date_joined' - - -admin.site.register(bio.models.UserProfile, UserProfileAdmin) -admin.site.register(bio.models.UserProfileFlag, UserProfileFlagAdmin) -admin.site.register(bio.models.Badge, BadgeAdmin) - -# Unregister existing ModelAdmin for User, then register ours -admin.site.unregister(django.contrib.auth.models.User) -admin.site.register(django.contrib.auth.models.User, UserAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/badges.py --- a/gpp/bio/badges.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,37 +0,0 @@ -"""This module contains user profile badge-related functionality.""" -import logging - -from bio.models import Badge -from bio.models import BadgeOwnership - - -# Numeric ID's for badges that are awarded for user actions: -(CONTRIBUTOR_PIN, CALENDAR_PIN, NEWS_PIN, LINK_PIN, DOWNLOAD_PIN, - SECURITY_PIN, POTD_PIN) = range(7) - - -def award_badge(badge_id, user): - """This function awards the badge specified by badge_id - to the given user. If the user already has the badge, - the badge count is incremented by one. - """ - try: - badge = Badge.objects.get(numeric_id=badge_id) - except Badge.DoesNotExist: - logging.error("Can't award badge with numeric_id = %d", badge_id) - return - - profile = user.get_profile() - - # Does the user already have badges of this type? - try: - bo = BadgeOwnership.objects.get(profile=profile, badge=badge) - except BadgeOwnership.DoesNotExist: - # No badge of this type, yet - bo = BadgeOwnership(profile=profile, badge=badge, count=1) - else: - # Already have this badge - bo.count += 1 - bo.save() - - logging.info('Awarded %s with the badge: %s', user.username, badge.name) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/fixtures/badges.json --- a/gpp/bio/fixtures/badges.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,112 +0,0 @@ -[ - { - "pk": 7, - "model": "bio.badge", - "fields": { - "numeric_id": 2, - "image": "badges/newspaper.png", - "order": 0, - "name": "News Pin", - "description": "For submitting a news article to the site news." - } - }, - { - "pk": 4, - "model": "bio.badge", - "fields": { - "numeric_id": 1, - "image": "badges/date.png", - "order": 1, - "name": "Calendar Pin", - "description": "For adding an event to the site calendar." 
- } - }, - { - "pk": 9, - "model": "bio.badge", - "fields": { - "numeric_id": 3, - "image": "badges/world_link.png", - "order": 2, - "name": "Link Pin", - "description": "For submitting a link to the site web links database." - } - }, - { - "pk": 5, - "model": "bio.badge", - "fields": { - "numeric_id": 4, - "image": "badges/disk.png", - "order": 3, - "name": "Download Pin", - "description": "For uploading a file to the site downloads library." - } - }, - { - "pk": 6, - "model": "bio.badge", - "fields": { - "numeric_id": 0, - "image": "badges/money_dollar.png", - "order": 4, - "name": "Contributor Pin", - "description": "For making a donation to the site." - } - }, - { - "pk": 8, - "model": "bio.badge", - "fields": { - "numeric_id": 5, - "image": "badges/shield.png", - "order": 5, - "name": "Security Pin", - "description": "For reporting spam or abuse." - } - }, - { - "pk": 10, - "model": "bio.badge", - "fields": { - "numeric_id": 6, - "image": "badges/camera.png", - "order": 6, - "name": "POTD Pin", - "description": "For submitting a photo of the day." - } - }, - { - "pk": 1, - "model": "bio.badge", - "fields": { - "numeric_id": 100, - "image": "badges/award_star_bronze_1.png", - "order": 7, - "name": "Bronze Star", - "description": "For service to the site and community." - } - }, - { - "pk": 2, - "model": "bio.badge", - "fields": { - "numeric_id": 101, - "image": "badges/award_star_silver_2.png", - "order": 8, - "name": "Silver Star", - "description": "For distinguished and dedicated service to the site and community." - } - }, - { - "pk": 3, - "model": "bio.badge", - "fields": { - "numeric_id": 102, - "image": "badges/award_star_gold_3.png", - "order": 9, - "name": "Gold Star", - "description": "For exceptional and meritorious service to the site and community." - } - } -] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/forms.py --- a/gpp/bio/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,124 +0,0 @@ -""" -This file contains the forms used by the bio application. 
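award_badge() above either creates a BadgeOwnership row or bumps its count. A hedged sketch of calling it outside the admin; the helper name and username are illustrative, and the badge fixtures are assumed to be loaded:

from django.contrib.auth.models import User

import bio.badges

def reward_spam_reporter(username):
    try:
        user = User.objects.get(username=username)
    except User.DoesNotExist:
        return
    # First award creates the ownership row; repeat awards increment its count.
    bio.badges.award_badge(bio.badges.SECURITY_PIN, user)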
-""" -try: - from cStringIO import StringIO -except: - from StringIO import StringIO - -from django import forms -from django.conf import settings -from django.core.files.base import ContentFile -from django.contrib.auth.models import User - -from bio.models import UserProfile -from core.widgets import AutoCompleteUserInput -from core.image import parse_image, downscale_image_square - - -class EditUserForm(forms.ModelForm): - """Form for editing the fields of the User model.""" - email = forms.EmailField(label='Email', required=True) - class Meta: - model = User - fields = ('first_name', 'last_name', 'email') - - -class EditUserProfileForm(forms.ModelForm): - """Form for editing the fields of the UserProfile model.""" - location = forms.CharField(required=False, widget=forms.TextInput(attrs={'size' : 64 })) - occupation = forms.CharField(required=False, widget=forms.TextInput(attrs={'size' : 64 })) - interests = forms.CharField(required=False, widget=forms.TextInput(attrs={'size' : 64 })) - time_zone = forms.CharField(required=False, widget=forms.HiddenInput()) - use_24_time = forms.BooleanField(label='Show times in 24-hour mode', required=False) - profile_text = forms.CharField(required=False, - widget=forms.Textarea(attrs={'class': 'markItUp'})) - signature = forms.CharField(required=False, - widget=forms.Textarea(attrs={'class': 'markItUp'})) - auto_favorite = forms.BooleanField( - label='Automatically favorite every forum topic I create or reply to', required=False) - auto_subscribe = forms.BooleanField( - label='Automatically subscribe to every forum topic I create or reply to', required=False) - - class Meta: - model = UserProfile - fields = ('location', 'birthday', 'occupation', 'interests', - 'profile_text', 'hide_email', 'signature', 'time_zone', - 'use_24_time', 'auto_favorite', 'auto_subscribe') - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']) - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui'] + - ['js/bio.js', 'js/timezone.js']) - - -class UploadAvatarForm(forms.Form): - """Form used to change a user's avatar""" - avatar_file = forms.ImageField(required=False) - image = None - - def clean_avatar_file(self): - f = self.cleaned_data['avatar_file'] - if f is not None: - if f.size > settings.MAX_AVATAR_SIZE_BYTES: - raise forms.ValidationError("Please upload a file smaller than " - "%s bytes." % settings.MAX_AVATAR_SIZE_BYTES) - try: - self.image = parse_image(f) - except IOError: - raise forms.ValidationError("Please upload a valid image. " - "The file you uploaded was either not an image or a " - "corrupted image.") - self.file_type = self.image.format - return f - - def save(self): - """ - Perform any down-scaling needed on the new file, then return a tuple of - (filename, file object). Note that the file object returned may not - have a name; use the returned filename instead. - - """ - if not self.cleaned_data['avatar_file']: - return None, None - - name = self.cleaned_data['avatar_file'].name - dim = settings.MAX_AVATAR_SIZE_PIXELS - max_size = (dim, dim) - if self.image and self.image.size > max_size: - self.image = downscale_image_square(self.image, dim) - - # We need to return a Django File now. To get that from here, - # write the image data info a StringIO and then construct a - # Django ContentFile from that. The ContentFile has no name, - # that is why we return one ourselves explicitly. 
- s = StringIO() - self.image.save(s, self.file_type) - return name, ContentFile(s.getvalue()) - - return name, self.cleaned_data['avatar_file'] - - -class SearchUsersForm(forms.Form): - """ - A form to search for users. - """ - username = forms.CharField(max_length=30, widget=AutoCompleteUserInput()) - - class Media: - css = { - 'all': settings.GPP_THIRD_PARTY_CSS['jquery-ui'] - } - js = settings.GPP_THIRD_PARTY_JS['jquery-ui'] - - def clean_username(self): - username = self.cleaned_data['username'].strip() - try: - User.objects.get(username=username, is_active=True) - except User.DoesNotExist: - raise forms.ValidationError("That username does not exist.") - return username diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/models.py --- a/gpp/bio/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,216 +0,0 @@ -""" -Contains models for the bio application. -I would have picked profile for this application, but that is already taken, apparently. -""" -import datetime -import os.path - -from django.db import models -from django.contrib.auth.models import User -from django.conf import settings -from django.core.cache import cache -from django.template.loader import render_to_string - -from core.markup import SiteMarkup - - -# These are the secondary user status enumeration values. -(STA_ACTIVE, # User is a full member in good standing. - STA_RESIGNED, # User has voluntarily asked to be removed. - STA_REMOVED, # User was removed for bad behavior. - STA_SUSPENDED, # User is temporarily suspended; e.g. a stranger tripped - # the spam filter. - STA_SPAMMER, # User has been removed for spamming. - STA_STRANGER, # New member, isn't fully trusted yet. Their comments and - # forum posts are scanned for spam. They can have their - # accounts deactivated by moderators for spamming. 
- ) = range(6) - -USER_STATUS_CHOICES = ( - (STA_ACTIVE, "Active"), - (STA_RESIGNED, "Resigned"), - (STA_REMOVED, "Removed"), - (STA_SUSPENDED, "Suspended"), - (STA_SPAMMER, "Spammer"), - (STA_STRANGER, "Stranger") -) - - -class Badge(models.Model): - """This model represents badges that users can earn.""" - image = models.ImageField(upload_to='badges') - name = models.CharField(max_length=64) - description = models.TextField(blank=True) - order = models.IntegerField() - numeric_id = models.IntegerField(db_index=True) - - class Meta: - ordering = ('order', ) - - def __unicode__(self): - return self.name - - def get_absolute_url(self): - return self.image.url - - def html(self): - """Returns a HTML img tag representation of the badge.""" - if self.image: - return u'%s' % ( - self.get_absolute_url(), self.name, self.name) - return u'' - html.allow_tags = True - - -def avatar_file_path(instance, filename): - ext = os.path.splitext(filename)[1] - if not ext: - ext = '.jpg' - avatar_name = instance.user.username + ext - return os.path.join(settings.AVATAR_DIR, 'users', avatar_name) - - -class UserProfile(models.Model): - """model to represent additional information about users""" - - user = models.ForeignKey(User, unique=True) - location = models.CharField(max_length=128, blank=True) - birthday = models.DateField(blank=True, null=True, - help_text='Optional; the year is not shown to others') - occupation = models.CharField(max_length=128, blank=True) - interests = models.CharField(max_length=255, blank=True) - profile_text = models.TextField(blank=True) - profile_html = models.TextField(blank=True) - hide_email = models.BooleanField(default=True) - signature = models.TextField(blank=True) - signature_html = models.TextField(blank=True) - avatar = models.ImageField(upload_to=avatar_file_path, blank=True) - time_zone = models.CharField(max_length=64, blank=True, - default='US/Pacific') - use_24_time = models.BooleanField(default=False) - forum_post_count = models.IntegerField(default=0) - status = models.IntegerField(default=STA_STRANGER, - choices=USER_STATUS_CHOICES) - status_date = models.DateTimeField(auto_now_add=True) - badges = models.ManyToManyField(Badge, through="BadgeOwnership") - update_date = models.DateTimeField(db_index=True, blank=True) - auto_favorite = models.BooleanField(default=False) - auto_subscribe = models.BooleanField(default=False) - - def __unicode__(self): - return self.user.username - - class Meta: - ordering = ('user__username', ) - - def save(self, *args, **kwargs): - """ - Custom profile save() function. - If content_update is True (default), then it is assumed that major - fields are being updated and that the profile_content_update signal - should be signalled. When content_update is False, the update_date is - not updated, expensive markup conversions are not performed, and the - signal is not signalled. This is useful for updating the - forum_post_count, for example. 
- - """ - content_update = kwargs.pop('content_update', True) - - if content_update: - self.update_date = datetime.datetime.now() - sm = SiteMarkup() - self.profile_html = sm.convert(self.profile_text) - self.signature_html = sm.convert(self.signature) - cache.delete('avatar_' + self.user.username) - - super(UserProfile, self).save(*args, **kwargs) - - if content_update: - notify_profile_content_update(self) - - @models.permalink - def get_absolute_url(self): - return ('bio-view_profile', (), {'username': self.user.username}) - - def badge_ownership(self): - return BadgeOwnership.objects.filter(profile=self).select_related('badge') - - def is_stranger(self): - """Returns True if this user profile status is STA_STRANGER.""" - return self.status == STA_STRANGER - - def user_is_active(self): - """Returns the profile's user is_active status. This function exists - for the admin. - """ - return self.user.is_active - user_is_active.boolean = True - user_is_active.short_description = "Is Active" - - def reset_text_fields(self): - """ - Reset profile text fields to empty defaults. - This function is useful when a spammer is identified. - - """ - self.location = '' - self.occupation = '' - self.interests = '' - self.profile_text = '' - self.signature = '' - - def search_title(self): - full_name = self.user.get_full_name() - if full_name: - return u"%s (%s)" % (self.user.username, full_name) - return self.user.username - - def search_summary(self): - text = render_to_string('search/indexes/bio/userprofile_text.txt', - {'object': self}); - return text - - -class UserProfileFlag(models.Model): - """This model represents a user flagging a profile as inappropriate.""" - user = models.ForeignKey(User) - profile = models.ForeignKey(UserProfile) - flag_date = models.DateTimeField(auto_now_add=True) - - def __unicode__(self): - return u"%s's profile flagged by %s" % (self.profile.user.username, - self.user.username) - - class Meta: - ordering = ('flag_date', ) - - def get_profile_url(self): - return 'Profile' % self.profile.get_absolute_url() - get_profile_url.allow_tags = True - - -class BadgeOwnership(models.Model): - """This model represents the ownership of badges by users.""" - profile = models.ForeignKey(UserProfile) - badge = models.ForeignKey(Badge) - count = models.IntegerField(default=1) - - class Meta: - verbose_name_plural = "badge ownership" - ordering = ('badge__order', ) - - def __unicode__(self): - if self.count == 1: - return u"%s owns 1 %s" % (self.profile.user.username, - self.badge.name) - else: - return u"%s owns %d %s badges" % (self.profile.user.username, - self.count, self.badge.name) - - def badge_count_str(self): - if self.count == 1: - return u"1 %s" % self.badge.name - return u"%d %ss" % (self.count, self.badge.name) - -# Put down here to avoid a circular import -from bio.signals import notify_profile_content_update diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/search_indexes.py --- a/gpp/bio/search_indexes.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,30 +0,0 @@ -"""Haystack search index for the bio application.""" -from haystack.indexes import * -from haystack import site -from custom_search.indexes import CondQueuedSearchIndex - -from bio.models import UserProfile -from bio.signals import profile_content_update - - -class UserProfileIndex(CondQueuedSearchIndex): - text = CharField(document=True, use_template=True) - author = CharField(model_attr='user') - - def index_queryset(self): - return UserProfile.objects.filter(user__is_active=True) 
- - def get_updated_field(self): - return 'update_date' - - def _setup_save(self, model): - profile_content_update.connect(self.enqueue_save) - - def _teardown_save(self, model): - profile_content_update.disconnect(self.enqueue_save) - - def enqueue_save(self, sender, **kwargs): - return self.enqueue('update', sender) - - -site.register(UserProfile, UserProfileIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/signals.py --- a/gpp/bio/signals.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,114 +0,0 @@ -""" -Signal handlers & signals for the bio application. - -""" -from django.db.models.signals import post_save -from django.contrib.auth.models import User -import django.dispatch - -from donations.models import Donation -from weblinks.models import Link -from downloads.models import Download -from news.models import Story -from potd.models import Photo - - -def on_user_save(sender, **kwargs): - """ - This signal handler ensures that every User has a corresonding - UserProfile. It is called after User instance is saved. It creates - a UserProfile for the User if the created argument is True. - - """ - created = kwargs['created'] - if created: - user = kwargs['instance'] - profile = UserProfile() - profile.user = user - profile.save() - - -def on_donation_save(sender, **kwargs): - """ - This function is called after a Donation is saved. - If the Donation was newly created and not anonymous, - award the user a contributor pin. - - """ - if kwargs['created']: - donation = kwargs['instance'] - if not donation.is_anonymous and donation.user: - bio.badges.award_badge(bio.badges.CONTRIBUTOR_PIN, donation.user) - - -def on_link_save(sender, **kwargs): - """ - This function is called after a Link is saved. If the Link was newly - created, award the user a link pin. - - """ - if kwargs['created']: - link = kwargs['instance'] - bio.badges.award_badge(bio.badges.LINK_PIN, link.user) - - -def on_download_save(sender, **kwargs): - """ - This function is called after a Download is saved. If the Download was - newly created, award the user a download pin. - - """ - if kwargs['created']: - download = kwargs['instance'] - bio.badges.award_badge(bio.badges.DOWNLOAD_PIN, download.user) - - -def on_story_save(sender, **kwargs): - """ - This function is called after a Story is saved. If the Story was - newly created, award the user a news pin. - - """ - if kwargs['created']: - story = kwargs['instance'] - bio.badges.award_badge(bio.badges.NEWS_PIN, story.submitter) - - -def on_photo_save(sender, **kwargs): - """ - This function is called after a Photo is saved. If the Photo was - newly created, award the user a POTD pin. - - """ - if kwargs['created']: - photo = kwargs['instance'] - bio.badges.award_badge(bio.badges.POTD_PIN, photo.user) - - -post_save.connect(on_user_save, sender=User, dispatch_uid='bio.signals') -post_save.connect(on_donation_save, sender=Donation, dispatch_uid='bio.signals') -post_save.connect(on_link_save, sender=Link, dispatch_uid='bio.signals') -post_save.connect(on_download_save, sender=Download, dispatch_uid='bio.signals') -post_save.connect(on_story_save, sender=Story, dispatch_uid='bio.signals') -post_save.connect(on_photo_save, sender=Photo, dispatch_uid='bio.signals') - -# Signals for the bio application -# -# This signal is sent whenever a profile has had its textual content updated. 
-# The provided arguments to the receiver function are: -# - sender - the profile model instance - -profile_content_update = django.dispatch.Signal(providing_args=[]) - - -def notify_profile_content_update(profile): - """ - Convenience function to send the profile content update signal. - - """ - profile_content_update.send_robust(profile) - - -# To avoid circular imports -import bio.badges -from bio.models import UserProfile diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/static/css/bio.css --- a/gpp/bio/static/css/bio.css Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -div.user_profile th { - font-weight: bold; - text-align: left; - padding: 5px 5px; -} -div.user_profile td { - font-weight: normal; - text-align: left; - padding: 5px 5px; -} - -div.members-list table { - border-collapse: collapse; - width: 95%; - border: 1px solid black; - margin: 1em auto 1em auto; -} - -div.members-list th { - font-weight: bold; - text-align: center; - padding: 5px 5px; -} - -div.members-list tr { - border-top: 1px solid black; - border-bottom: 1px solid black; - text-align: center; -} - -div.members-list td { - padding: 5px 5px; - text-align: center; -} - -div.members-list tr.odd { - background-color: #ddd; -} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/static/js/bio.js --- a/gpp/bio/static/js/bio.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,36 +0,0 @@ -$(document).ready(function() { - var bday = $('#id_birthday'); - // jquery ui may not always be loaded - if (bday.length) { - bday.datepicker({changeMonth: true, - changeYear: true, - dateFormat: 'yy-mm-dd', - defaultDate: '-30y', - minDate: new Date(1909, 0, 1), - maxDate: new Date(), - yearRange: '-100:+0'}); - } - $('a.profile-flag').click(function() { - var id = this.id; - if (id.match(/fp-(\d+)/)) { - id = RegExp.$1; - if (confirm('Only report a profile if you feel it is spam, abuse, ' + - 'violates site rules, or is not appropriate. ' + - 'A moderator will be notified and will review the profile. ' + - 'Are you sure you want to report this profile?')) { - $.ajax({ - url: '/profile/flag/' + id + '/', - type: 'POST', - dataType: 'text', - success: function (response, textStatus) { - alert(response); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); - } - }); - } - } - return false; - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/templatetags/bio_tags.py --- a/gpp/bio/templatetags/bio_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,82 +0,0 @@ -""" -Template tags for the bio application. -""" -from django import template -from django.conf import settings -from django.core.cache import cache - -from bio.models import UserProfile - - -register = template.Library() - - -def get_img_url(profile=None): - """ - This function returns a URL for a user profile avatar. - If the profile is None or the profile doesn't contain a valid - avatar, the URL for the default avatar is returned. - - """ - if profile is None or profile.avatar.name == '': - return settings.AVATAR_DEFAULT_URL - else: - return profile.avatar.url - - -@register.inclusion_tag('bio/avatar_tag.html') -def avatar(user, profile_link=True, align='bottom'): - """ - Returns the HTML for a user's avatar image. - - If the user object has an attribute 'user_profile', this is assumed to be - the user's profile that has been pre-fetched. 
Otherwise, the cache is - consulted to retrieve the avatar info for the user. If there is a cache - miss, only then will a get_profile() call be made. - """ - # img_info is a tuple that contains info about the avatar: - # (url, width, height) - - if hasattr(user, 'user_profile'): - img_url = get_img_url(user.user_profile) - else: - # try the cache - cache_key = 'avatar_' + user.username - img_url = cache.get(cache_key) - if img_url is None: - try: - profile = user.get_profile() - except UserProfile.DoesNotExist: - profile = None - - img_url = get_img_url(profile) - cache.set(cache_key, img_url) - - title = user.username - style = '' - if align == 'left': - style = 'style="float:left;margin-right:3px;"' - # other styles not supported - - return { - 'url': img_url, - 'title': title, - 'style': style, - 'username': user.username, - 'profile_link': profile_link, - } - - -@register.inclusion_tag('bio/profile_link_tag.html') -def profile_link(username, trailing_text=''): - """ - Renders a link to a given user's profile page. - Trailing text is any text that you want displayed after the final tag. - Because of the way the Django template system works, a newline will - automatically be inserted after this tag is expanded. If you want a period - to follow immediately after the link, then set trailing_text to '.'. - Otherwise a space will appear between the linked text and any text that - follows the tag. - - """ - return {'username': username, 'trailing_text': trailing_text } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/templatetags/elsewhere_tags.py --- a/gpp/bio/templatetags/elsewhere_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,17 +0,0 @@ -""" -Template tags for the elsewhere application. -""" -from django import template -from django.conf import settings - -register = template.Library() - - -@register.inclusion_tag('bio/elsewhere_links.html') -def elsewhere_links(user): - return { - 'social_nets': user.social_network_profiles.all(), - 'ims': user.instant_messenger_profiles.all(), - 'websites': user.website_profiles.all(), - 'STATIC_URL': settings.STATIC_URL, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/tests/__init__.py --- a/gpp/bio/tests/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/tests/view_tests.py --- a/gpp/bio/tests/view_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,78 +0,0 @@ -""" -View tests for the bio application. - -""" -import datetime - -from django.contrib.auth.models import User -from django.test import TestCase -from django.core.urlresolvers import reverse, NoReverseMatch - - -class MemberSearchTest(TestCase): - - USERNAME = u'John' - - def setUp(self): - user = User.objects.create_user(self.USERNAME, '', 'password') - user.save() - - self.username = 'test_user' - self.pw = 'password' - self.user = User.objects.create_user(self.username, '', self.pw) - self.user.save() - self.assertTrue(self.client.login(username=self.username, - password=self.pw)) - - def tearDown(self): - self.client.logout() - - def testValidName(self): - """ - Test a valid username. 
- - """ - - response = self.client.post(reverse('bio-member_search'), - {'username': self.USERNAME}, - follow=True) - - self.assertEqual(len(response.redirect_chain), 1) - if response.redirect_chain: - self.assertEqual(response.redirect_chain[0][0], - 'http://testserver' + reverse('bio-view_profile', - kwargs={'username': self.USERNAME})) - self.assertEqual(response.redirect_chain[0][1], 302) - - self.assertEqual(response.status_code, 200) - - def testInvalidName(self): - """ - Test an invalid username. - """ - - response = self.client.post(reverse('bio-member_search'), - {'username': self.USERNAME + '!'}) - - self.assertEqual(response.status_code, 200) - self.assertContains(response, "That username does not exist.") - - def testTrailingSpace(self): - """ - Test a username with a trailing space. - """ - - try: - response = self.client.post(reverse('bio-member_search'), - {'username': self.USERNAME + ' '}, - follow=True) - except NoReverseMatch: - self.fail('bit by a MySQL bug?') - - self.assertEqual(len(response.redirect_chain), 1) - if response.redirect_chain: - self.assertEqual(response.redirect_chain[0][0], - 'http://testserver' + reverse('bio-view_profile', - kwargs={'username': self.USERNAME})) - self.assertEqual(response.redirect_chain[0][1], 302) - - self.assertEqual(response.status_code, 200) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/urls.py --- a/gpp/bio/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,15 +0,0 @@ -"""urls for the bio application""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('bio.views', - url(r'^members/(?P<type>user|date)/$', - 'member_list', - name='bio-member_list'), - url(r'^members/search/$', 'member_search', name='bio-member_search'), - url(r'^me/$', 'my_profile', name='bio-me'), - url(r'^view/(?P<username>[\w.@+-]{1,30})/$', 'view_profile', name='bio-view_profile'), - url(r'^edit/$', 'edit_profile', name='bio-edit_profile'), - url(r'^edit/elsewhere/$', 'edit_elsewhere', name='bio-edit_elsewhere'), - url(r'^avatar/$', 'change_avatar', name='bio-change_avatar'), - url(r'^flag/(\d+)/$', 'flag_profile', name='bio-flag_profile'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bio/views.py --- a/gpp/bio/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,288 +0,0 @@ -""" -Views for the bio application.
- -""" -from django.shortcuts import render_to_response -from django.shortcuts import get_object_or_404 -from django.template import RequestContext -from django.contrib import messages -from django.contrib.auth.models import User -from django.http import HttpResponse -from django.http import HttpResponseBadRequest -from django.http import HttpResponseRedirect -from django.http import HttpResponseServerError -from django.http import Http404 -from django.core.paginator import InvalidPage -from django.core.urlresolvers import reverse -from django.contrib.auth.decorators import login_required -from django.views.decorators.http import require_POST - -from elsewhere.models import SocialNetworkForm -from elsewhere.models import InstantMessengerForm -from elsewhere.models import WebsiteForm - -from bio.models import UserProfile -from bio.models import UserProfileFlag -from bio.models import BadgeOwnership -from bio.forms import UploadAvatarForm -from bio.forms import EditUserForm -from bio.forms import EditUserProfileForm -from bio.forms import SearchUsersForm -from bio.signals import notify_profile_content_update -from core.paginator import DiggPaginator -from core.functions import email_admins -from core.functions import get_page - -####################################################################### - -@login_required -def member_list(request, type='user'): - """ - This view displays the member list. Only active members are displayed. - """ - qs = User.objects.filter(is_active=True) - if type == 'user': - qs = qs.order_by('username') - else: - qs = qs.order_by('date_joined') - num_members = qs.count() - - paginator = DiggPaginator(qs, 20, body=5, tail=3, margin=3, padding=2) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - # Attach user profiles to each user to avoid using get_user_profile() in - # the template. 
- users = set(user.id for user in the_page.object_list) - - profiles = UserProfile.objects.filter(user__id__in=users).select_related() - user_profiles = dict((profile.user.id, profile) for profile in profiles) - - for user in the_page.object_list: - user.user_profile = user_profiles[user.id] - - return render_to_response('bio/members.html', { - 'page': the_page, - 'type': type, - 'num_members': num_members, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def my_profile(request): - profile = request.user.get_profile() - badge_collection = BadgeOwnership.objects.filter( - profile=profile).select_related("badge") - - return render_to_response('bio/view_profile.html', { - 'subject': request.user, - 'profile': profile, - 'hide_email': False, - 'this_is_me': True, - 'badge_collection': badge_collection, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def view_profile(request, username): - - user = get_object_or_404(User, username=username) - if user == request.user: - return HttpResponseRedirect(reverse('bio.views.my_profile')) - - profile = user.get_profile() - hide_email = profile.hide_email - - badge_collection = BadgeOwnership.objects.filter( - profile=profile).select_related("badge") - - return render_to_response('bio/view_profile.html', { - 'subject': user, - 'profile': profile, - 'hide_email': hide_email, - 'this_is_me': False, - 'badge_collection': badge_collection, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def edit_profile(request): - if request.method == 'POST': - if request.POST.get('submit_button', 'Cancel') == 'Cancel': - return HttpResponseRedirect(reverse('bio.views.my_profile')) - profile = request.user.get_profile() - user_form = EditUserForm(request.POST, instance=request.user) - profile_form = EditUserProfileForm(request.POST, instance=profile) - if user_form.is_valid() and profile_form.is_valid(): - user_form.save() - profile = profile_form.save(commit=False) - profile.user = request.user - profile.save() - return HttpResponseRedirect(reverse('bio.views.my_profile')) - else: - profile = request.user.get_profile() - user_form = EditUserForm(instance=request.user) - profile_form = EditUserProfileForm(instance=profile) - - return render_to_response('bio/edit_profile.html', { - 'user_form': user_form, - 'profile_form': profile_form, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def change_avatar(request): - if request.method == 'POST': - form = UploadAvatarForm(request.POST, request.FILES) - if form.is_valid(): - # Update the profile with the new avatar - profile = request.user.get_profile() - - # First delete any old avatar file - if profile.avatar.name != '': - profile.avatar.delete(save=False) - - try: - name, avatar = form.save() - except IOError: - messages.error(request, 'A file error occurred.') - return HttpResponseRedirect(reverse('bio-me')) - - if avatar is not None: - profile.avatar.save(name, avatar, save=False) - profile.save() - - messages.success(request, 'Avatar updated') - return HttpResponseRedirect(reverse('bio-me')) - else: - form = UploadAvatarForm() - - return render_to_response('bio/avatar.html', { - 'form': form, - }, - context_instance = RequestContext(request)) 
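# A minimal sketch of the avatar-cache round trip used above, not part of the
# original files: UserProfile.save() (earlier in this changeset) deletes the
# 'avatar_' + username cache key, and the avatar template tag lazily refills it,
# so an avatar uploaded through change_avatar shows up on the next render.
# Assumes Django's cache framework is configured; the helper name
# cached_avatar_url is illustrative only.
from django.conf import settings
from django.core.cache import cache

from bio.models import UserProfile


def cached_avatar_url(user):
    """Return the avatar URL for user, cached under 'avatar_' + username."""
    key = 'avatar_' + user.username
    url = cache.get(key)
    if url is None:
        try:
            profile = user.get_profile()
        except UserProfile.DoesNotExist:
            profile = None
        # Fall back to the site default when no avatar has been uploaded,
        # mirroring get_img_url() in bio/templatetags/bio_tags.py above.
        if profile is None or profile.avatar.name == '':
            url = settings.AVATAR_DEFAULT_URL
        else:
            url = profile.avatar.url
        cache.set(key, url)
    return url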
- -####################################################################### - -@require_POST -def flag_profile(request, profile_id): - """ - This function handles the flagging of profiles by users. This function should - be the target of an AJAX post. - """ - if not request.user.is_authenticated(): - return HttpResponse('Please login or register to flag a profile.') - - try: - profile = UserProfile.objects.get(pk=profile_id) - except UserProfile.DoesNotExist: - return HttpResponseBadRequest("That profile doesn't exist.") - - flag = UserProfileFlag(user=request.user, profile=profile) - flag.save() - email_admins('A Profile Has Been Flagged', """Hello, - -A user has flagged a profile for review. -""") - return HttpResponse('The profile was flagged. A moderator will review the' \ - ' profile shortly. Thanks for helping to improve the content on this ' \ - 'site.') - -####################################################################### - -@login_required -def edit_elsewhere(request): - im_id = 'id_im_%s' # to prevent duplicate ID in HTML output - if request.method == 'POST': - new_data = request.POST.copy() - - # Add forms - if new_data.get('sn-form') or new_data.get('im-form') or new_data.get('w-form'): - - if new_data.get('sn-form'): - sn_form = SocialNetworkForm(new_data) - im_form = InstantMessengerForm(auto_id=im_id) - w_form = WebsiteForm() - form = sn_form - elif new_data.get('im-form'): - sn_form = SocialNetworkForm() - im_form = InstantMessengerForm(new_data, auto_id=im_id) - w_form = WebsiteForm() - form = im_form - elif new_data.get('w-form'): - sn_form = SocialNetworkForm() - im_form = InstantMessengerForm(auto_id=im_id) - w_form = WebsiteForm(new_data) - form = w_form - - if form.is_valid(): - profile = form.save(commit=False) - profile.user = request.user - profile.save() - return HttpResponseRedirect(request.path) - - # Delete forms - elif new_data.get('delete-sn-form') or new_data.get('delete-im-form') or new_data.get('delete-w-form'): - delete_id = request.POST['delete_id'] - - update_occurred = True - if new_data.get('delete-sn-form'): - request.user.social_network_profiles.get(id=delete_id).delete() - elif new_data.get('delete-im-form'): - request.user.instant_messenger_profiles.get(id=delete_id).delete() - elif new_data.get('delete-w-form'): - request.user.website_profiles.get(id=delete_id).delete() - else: - update_occurred = False - - if update_occurred: - notify_profile_content_update(request.user.get_profile()) - - return HttpResponseRedirect(request.path) - - # WTF? 
- else: - return HttpResponseServerError - - else: - # Create blank forms - sn_form = SocialNetworkForm() - im_form = InstantMessengerForm(auto_id=im_id) - w_form = WebsiteForm() - - return render_to_response('bio/edit_elsewhere.html', { - 'sn_form': sn_form, - 'im_form': im_form, - 'w_form': w_form, - }, - context_instance=RequestContext(request)) - -####################################################################### - -@login_required -def member_search(request): - if request.method == "POST": - form = SearchUsersForm(request.POST) - if form.is_valid(): - username = form.cleaned_data['username'] - return HttpResponseRedirect(reverse("bio-view_profile", - kwargs={'username': username})) - else: - form = SearchUsersForm() - - return render_to_response('bio/member_search.html', { - 'form': form, - }, - context_instance=RequestContext(request)) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bulletins/admin.py --- a/gpp/bulletins/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,20 +0,0 @@ -''' -This file contains the automatic admin site definitions for the Bulletins models. -''' - -from django.contrib import admin -from django.conf import settings - -from bulletins.models import Bulletin - -class BulletinAdmin(admin.ModelAdmin): - list_display = ('title', 'start_date', 'end_date', 'is_enabled') - list_filter = ('start_date', 'end_date', 'is_enabled') - search_fields = ('title', 'text') - date_hierarchy = 'start_date' - - class Media: - js = settings.GPP_THIRD_PARTY_JS['tiny_mce'] - - -admin.site.register(Bulletin, BulletinAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bulletins/models.py --- a/gpp/bulletins/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -"""Models for the bulletins app. -Bulletins allow the sited admins to display and manage important notices for the website. -""" - -import datetime -from django.db import models -from django.db.models import Q - - -class BulletinManager(models.Manager): - """Manager for the Bulletin model.""" - - def get_current(self): - now = datetime.datetime.now() - return self.filter( - Q(is_enabled=True), - Q(start_date__lte=now), - Q(end_date__isnull=True) | Q(end_date__gte=now)) - - -class Bulletin(models.Model): - """Model to represent site bulletins.""" - title = models.CharField(max_length=200) - text = models.TextField() - start_date = models.DateTimeField(db_index=True, - help_text='Start date for when the bulletin will be active.',) - end_date = models.DateTimeField(blank=True, null=True, db_index=True, - help_text='End date for the bulletin. Leave blank to keep it open-ended.') - is_enabled = models.BooleanField(default=True, db_index=True, - help_text='Check to allow the bulletin to be viewed on the site.') - - objects = BulletinManager() - - class Meta: - ordering = ('-start_date', ) - - def __unicode__(self): - return self.title diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/bulletins/templatetags/bulletin_tags.py --- a/gpp/bulletins/templatetags/bulletin_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,17 +0,0 @@ -""" -Template tags for the bulletins application. 
-""" -from django import template - -from bulletins.models import Bulletin - - -register = template.Library() - - -@register.inclusion_tag('bulletins/bulletins.html') -def current_bulletins(): - bulletins = Bulletin.objects.get_current() - return { - 'bulletins': bulletins, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/admin.py --- a/gpp/comments/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,48 +0,0 @@ -""" -This file contains the automatic admin site definitions for the comment models. -""" -from django.contrib import admin -from comments.models import Comment -from comments.models import CommentFlag -import bio.badges - - -class CommentAdmin(admin.ModelAdmin): - fieldsets = ( - (None, - {'fields': ('content_type', 'object_id', )} - ), - ('Content', - {'fields': ('user', 'comment')} - ), - ('Metadata', - {'fields': ('ip_address', 'is_public', 'is_removed')} - ), - ) - list_display = ('__unicode__', 'content_type', 'object_id', 'ip_address', - 'creation_date', 'is_public', 'not_removed') - list_filter = ('creation_date', 'is_public', 'is_removed') - date_hierarchy = 'creation_date' - ordering = ('-creation_date', ) - search_fields = ('comment', 'user__username', 'ip_address') - raw_id_fields = ('user', 'content_type') - - -class CommentFlagAdmin(admin.ModelAdmin): - list_display = ('__unicode__', 'flag_date', 'get_comment_url') - actions = ('accept_flags', ) - raw_id_fields = ('user', 'comment') - - def accept_flags(self, request, qs): - """This admin action awards a security pin to the user who reported - the comment and then deletes the flagged comment object. - """ - for flag in qs: - bio.badges.award_badge(bio.badges.SECURITY_PIN, flag.user) - flag.delete() - - accept_flags.short_description = "Accept selected comment flags" - - -admin.site.register(Comment, CommentAdmin) -admin.site.register(CommentFlag, CommentFlagAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/forms.py --- a/gpp/comments/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,74 +0,0 @@ -""" -Forms for the comments application. -""" -import datetime -from django import forms -from django.conf import settings -from django.contrib.contenttypes.models import ContentType - -from comments.models import Comment - -COMMENT_MAX_LENGTH = getattr(settings, 'COMMENT_MAX_LENGTH', 3000) - -class CommentForm(forms.Form): - comment = forms.CharField(label='', - min_length=1, - max_length=COMMENT_MAX_LENGTH, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - content_type = forms.CharField(widget=forms.HiddenInput) - object_pk = forms.CharField(widget=forms.HiddenInput) - - def __init__(self, target_object, data=None, initial=None): - self.target_object = target_object - if initial is None: - initial = {} - initial.update({ - 'content_type': str(self.target_object._meta), - 'object_pk': str(self.target_object.pk), - }) - super(CommentForm, self).__init__(data=data, initial=initial) - - def get_comment_object(self, user, ip_address): - """ - Return a new (unsaved) comment object based on the information in this - form. Assumes that the form is already validated and will throw a - ValueError if not. 
- """ - if not self.is_valid(): - raise ValueError("get_comment_object may only be called on valid forms") - - new = Comment( - content_type = ContentType.objects.get_for_model(self.target_object), - object_id = self.target_object.pk, - user = user, - comment = self.cleaned_data["comment"], - ip_address = ip_address, - is_public = True, - is_removed = False, - ) - - # Check that this comment isn't duplicate. (Sometimes people post comments - # twice by mistake.) If it is, fail silently by returning the old comment. - today = datetime.date.today() - possible_duplicates = Comment.objects.filter( - content_type = new.content_type, - object_id = new.object_id, - user = new.user, - creation_date__year = today.year, - creation_date__month = today.month, - creation_date__day = today.day, - ) - for old in possible_duplicates: - if old.comment == new.comment: - return old - - return new - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']), - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui'] + - ['js/comments.js']) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/models.py --- a/gpp/comments/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,99 +0,0 @@ -""" -Models for the comments application. -""" -import datetime - -from django.db import models -from django.conf import settings -from django.contrib.contenttypes.models import ContentType -from django.contrib.contenttypes import generic -from django.contrib.auth.models import User -from django.core import urlresolvers - -from core.markup import site_markup - - -COMMENT_MAX_LENGTH = getattr(settings, 'COMMENT_MAX_LENGTH', 3000) - -class CommentManager(models.Manager): - """Manager for the Comment model class.""" - - def for_object(self, obj, filter_public=True): - """QuerySet for all comments for a particular model instance.""" - ct = ContentType.objects.get_for_model(obj) - qs = self.get_query_set().filter(content_type__pk=ct.id, - object_id=obj.id) - if filter_public: - qs = qs.filter(is_public=True) - return qs - - -class Comment(models.Model): - """My own version of a Comment class that can attach comments to any other model.""" - content_type = models.ForeignKey(ContentType) - object_id = models.PositiveIntegerField(db_index=True) - content_object = generic.GenericForeignKey('content_type', 'object_id') - user = models.ForeignKey(User) - comment = models.TextField(max_length=COMMENT_MAX_LENGTH) - html = models.TextField(blank=True) - creation_date = models.DateTimeField() - ip_address = models.IPAddressField('IP Address') - is_public = models.BooleanField(default=True, - help_text='Uncheck this field to make the comment invisible.') - is_removed = models.BooleanField(default=False, - help_text='Check this field to replace the comment with a ' \ - '"This comment has been removed" message') - - # Attach manager - objects = CommentManager() - - class Meta: - ordering = ('creation_date', ) - - def __unicode__(self): - return u'%s: %s...' % (self.user.username, self.comment[:50]) - - def save(self, *args, **kwargs): - if not self.id: - self.creation_date = datetime.datetime.now() - - self.html = site_markup(self.comment) - super(Comment, self).save(*args, **kwargs) - - def get_absolute_url(self): - return self.get_content_object_url() + ('#c%s' % self.id) - - def get_content_object_url(self): - """ - Get a URL suitable for redirecting to the content object. 
- """ - return urlresolvers.reverse( - "comments-url-redirect", - args=(self.content_type_id, self.object_id) - ) - - def not_removed(self): - """ - Returns not self.is_removed. Used on the admin display for - "green board" display purposes. - """ - return not self.is_removed - not_removed.boolean = True - - -class CommentFlag(models.Model): - """This model represents a user flagging a comment as inappropriate.""" - user = models.ForeignKey(User) - comment = models.ForeignKey(Comment) - flag_date = models.DateTimeField(auto_now_add=True) - - def __unicode__(self): - return u'Comment ID %s flagged by %s' % (self.comment.id, self.user.username) - - class Meta: - ordering = ('flag_date', ) - - def get_comment_url(self): - return 'Comment' % self.comment.id - get_comment_url.allow_tags = True - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/static/css/comments.css --- a/gpp/comments/static/css/comments.css Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,28 +0,0 @@ -div.comment-list { - float: left; - font-size: 18px; - font-weight: bold; - color: #999; - padding-right: .5em; -} -div.comment { - padding: 0.5em; - border-bottom: 1px dashed black; - font: 12px/18px "Lucida Grande", Verdana, sans-serif; - color: #333; -} -div.comment-avatar { - float: left; - padding-right: 1.5em; -} -div.comment-text { -} -div.comment-text-removed { - font-style: italic; -} -div.comment-details { - clear: both; - font-size: smaller; - font-style: italic; - padding-top: 0.5em; -} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/static/js/comments.js --- a/gpp/comments/static/js/comments.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,73 +0,0 @@ -$(document).ready(function() { - var postText = $('#id_comment'); - var postButton = $('#comment-form-post'); - postButton.click(function () { - var text = $.trim(postText.val()); - if (text.length == 0) { - alert('Please enter some text.'); - return false; - } - postButton.attr('disabled', 'disabled').val('Posting Comment...'); - $.ajax({ - url: '/comments/post/', - type: 'POST', - data: { - comment : text, - content_type : $('#id_content_type').val(), - object_pk : $('#id_object_pk').val() - }, - dataType: 'html', - success: function (data, textStatus) { - postText.val(''); - $('#comment-container').append(data); - var newDiv = $('#comment-container > div:last'); - newDiv.hide(); - var num = $('.comment-list', newDiv); - num.html($('#comment-container > div').size() + "."); - newDiv.fadeIn(3000); - postButton.removeAttr('disabled').val('Post Comment'); - var count = $('#comment-count'); - if (count.length) { - count.html(parseInt(count.html()) + 1); - } - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - postButton.removeAttr('disabled').val('Post Comment'); - } - }); - return false; - }); - $('a.comment-flag').click(function () { - var id = this.id; - if (id.match(/fc-(\d+)/)) { - id = RegExp.$1; - if (confirm('Only flag a comment if you feel it is spam, abuse, violates site rules, ' + - 'or is not appropriate. ' + - 'A moderator will be notified and will review the comment. 
' + - 'Are you sure you want to flag this comment?')) { - $.ajax({ - url: '/comments/flag/', - type: 'POST', - data: {id: id}, - dataType: 'text', - success: function (response, textStatus) { - alert(response); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); - } - }); - } - } - return false; - }); - - $('.comment-text img').fadeIn('fast', function() { - var pic = $(this); - if (pic.width() > 720) { - pic.css('width', '720px'); - } - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/templatetags/comment_tags.py --- a/gpp/comments/templatetags/comment_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,169 +0,0 @@ -""" -Template tags for our Comments application. -We support the following template tags: - {% get_comment_count for [object] as [var] %} - {% get_comment_list for [object] as [var] %}` - {% get_comment_form for [object] as [var] %}` - {% render_comment_form for [object] %} - {% render_comment_list [object] %} -""" -from django import template -from django.conf import settings -from django.template.loader import render_to_string -from django.contrib.contenttypes.models import ContentType - -from comments.models import Comment -from comments.forms import CommentForm - - -register = template.Library() - - -class GetCommentCountNode(template.Node): - def __init__(self, obj, var): - self.object = template.Variable(obj) - self.as_var = var - - def render(self, context): - object = self.object.resolve(context) - qs = Comment.objects.for_object(object) - context[self.as_var] = qs.count() - return '' - -@register.tag -def get_comment_count(parser, token): - """ - Gets the comment count for the specified object and makes it available in the - template context under the variable name specified. - Syntax: - {% get_comment_count for [object] as [varname] %} - """ - try: - (tag, for_word, obj, as_word, var) = token.split_contents() - except ValueError: - raise template.TemplateSyntaxError, "%r tag requires exactly 4 arguments" % token.contents.split()[0] - - if for_word != 'for': - raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) - - if as_word != 'as': - raise template.TemplateSyntaxError("Third argument in %r tag must be 'as'" % tag) - - return GetCommentCountNode(obj, var) - - -class GetCommentListNode(template.Node): - def __init__(self, obj, var): - self.object = template.Variable(obj) - self.as_var = var - - def render(self, context): - object = self.object.resolve(context) - qs = Comment.objects.for_object(object) - context[self.as_var] = list(qs) - return '' - - -@register.tag -def get_comment_list(parser, token): - """ - Gets a list of comments for the specified object and makes it available in the - template context under the variable name specified. 
- Syntax: - {% get_comment_list for [object] as [varname] %} - """ - try: - (tag, for_word, obj, as_word, var) = token.split_contents() - except ValueError: - raise template.TemplateSyntaxError, "%r tag requires exactly 4 arguments" % token.contents.split()[0] - - if for_word != 'for': - raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) - - if as_word != 'as': - raise template.TemplateSyntaxError("Third argument in %r tag must be 'as'" % tag) - - return GetCommentListNode(obj, var) - - -class GetCommentFormNode(template.Node): - def __init__(self, obj, var): - self.object = template.Variable(obj) - self.as_var = var - - def render(self, context): - object = self.object.resolve(context) - context[self.as_var] = CommentForm(object) - return '' - - -@register.tag -def get_comment_form(parser, token): - """ - Gets the comment form for an object and makes it available in the - template context under the variable name specified. - Syntax: - {% get_comment_form for [object] as [varname] %} - """ - try: - (tag, for_word, obj, as_word, var) = token.split_contents() - except ValueError: - raise template.TemplateSyntaxError, "%r tag requires exactly 4 arguments" % token.contents.split()[0] - - if for_word != 'for': - raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) - - if as_word != 'as': - raise template.TemplateSyntaxError("Third argument in %r tag must be 'as'" % tag) - - return GetCommentFormNode(obj, var) - - -class RenderCommentFormNode(template.Node): - def __init__(self, obj): - self.object = template.Variable(obj) - - def render(self, context): - object = self.object.resolve(context) - context.push() - form_str = render_to_string('comments/comment_form.html', { - 'form': CommentForm(object), - }, - context) - context.pop() - return form_str - - -@register.tag -def render_comment_form(parser, token): - """ - Renders a comment form for the specified object using the template - comments/comment_form.html. - Syntax: - {% render_comment_form for [object] %} - """ - try: - (tag, for_word, obj) = token.split_contents() - except ValueError: - raise template.TemplateSyntaxError, "%r tag requires exactly 2 arguments" % token.contents.split()[0] - - if for_word != 'for': - raise template.TemplateSyntaxError("First argument in %r tag must be 'for'" % tag) - - return RenderCommentFormNode(obj) - - -@register.inclusion_tag('comments/comment_list.html') -def render_comment_list(object): - """ - Renders the comments for the specified object using the template - comments/comment_list.html. - Syntax: - {% render_comment_list [object] %} - """ - qs = Comment.objects.for_object(object).select_related('user') - return { - 'comments': qs, - 'STATIC_URL': settings.STATIC_URL, - } - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/urls.py --- a/gpp/comments/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,16 +0,0 @@ -""" -URLs for the comments application. 
-""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('comments.views', - url(r'^flag/$', 'flag_comment', name='comments-flag'), - url(r'^markdown/$', 'markdown_preview', name='comments-markdown_preview'), - url(r'^post/$', 'post_comment', name='comments-post'), -) - -urlpatterns += patterns('', - url(r'^cr/(\d+)/(\d+)/$', - 'django.contrib.contenttypes.views.shortcut', - name='comments-url-redirect'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/comments/views.py --- a/gpp/comments/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,134 +0,0 @@ -""" -Views for the comments application. -""" -from django.contrib.auth.decorators import login_required -from django.core.exceptions import ObjectDoesNotExist -from django.http import HttpResponse -from django.http import HttpResponseRedirect -from django.http import HttpResponseBadRequest -from django.http import HttpResponseForbidden -from django.db.models import get_model -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.utils.html import escape -from django.views.decorators.http import require_POST - -from core.functions import email_admins -from core.markup import site_markup -from comments.forms import CommentForm -from comments.models import Comment -from comments.models import CommentFlag -import antispam -import antispam.utils - - -@login_required -@require_POST -def post_comment(request): - """ - This function handles the posting of comments. If successful, returns - the comment text as the response. This function is meant to be the target - of an AJAX post. - """ - # Look up the object we're trying to comment about - ctype = request.POST.get('content_type', None) - object_pk = request.POST.get('object_pk', None) - if ctype is None or object_pk is None: - return HttpResponseBadRequest('Missing content_type or object_pk field.') - - try: - model = get_model(*ctype.split('.', 1)) - target = model.objects.get(pk=object_pk) - except TypeError: - return HttpResponseBadRequest( - "Invalid content_type value: %r" % escape(ctype)) - except AttributeError: - return HttpResponseBadRequest( - "The given content-type %r does not resolve to a valid model." % \ - escape(ctype)) - except ObjectDoesNotExist: - return HttpResponseBadRequest( - "No object matching content-type %r and object PK %r exists." % \ - (escape(ctype), escape(object_pk))) - - # Can we comment on the target object? - if hasattr(target, 'can_comment_on'): - if callable(target.can_comment_on): - can_comment_on = target.can_comment_on() - else: - can_comment_on = target.can_comment_on - else: - can_comment_on = True - - if not can_comment_on: - return HttpResponseForbidden('Cannot comment on this item.') - - # Check form validity - - form = CommentForm(target, request.POST) - if not form.is_valid(): - return HttpResponseBadRequest('Invalid comment; missing parameters?') - - comment = form.get_comment_object(request.user, request.META.get("REMOTE_ADDR", None)) - - # Check for spam - - if antispam.utils.spam_check(request, comment.comment): - return HttpResponseForbidden(antispam.BUSTED_MESSAGE) - - comment.save() - - # return the rendered comment - return render_to_response('comments/comment.html', { - 'comment': comment, - }, - context_instance = RequestContext(request)) - - -@require_POST -def flag_comment(request): - """ - This function handles the flagging of comments by users. This function should - be the target of an AJAX post. 
- """ - if not request.user.is_authenticated(): - return HttpResponse('Please login or register to flag a comment.') - - id = request.POST.get('id', None) - if id is None: - return HttpResponseBadRequest('No id') - - try: - comment = Comment.objects.get(pk=id) - except Comment.DoesNotExist: - return HttpResponseBadRequest('No comment with id %s' % id) - - flag = CommentFlag(user=request.user, comment=comment) - flag.save() - email_admins('A Comment Has Been Flagged', """Hello, - -A user has flagged a comment for review. -""") - return HttpResponse('The comment was flagged. A moderator will review the comment shortly. ' \ - 'Thanks for helping to improve the discussions on this site.') - - -@require_POST -def markdown_preview(request): - """ - This function should be the target of an AJAX POST. It takes the 'data' parameter - from the POST parameters and returns a rendered HTML page from the data, which - is assumed to be in markdown format. The HTML page is suitable for the preview - function for a javascript editor such as markItUp. - """ - if not request.user.is_authenticated(): - return HttpResponseForbidden('This service is only available to logged in users.') - - data = request.POST.get('data', None) - if data is None: - return HttpResponseBadRequest('No data') - - return render_to_response('comments/markdown_preview.html', { - 'data': site_markup(data), - }, - context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contact/forms.py --- a/gpp/contact/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,47 +0,0 @@ -"""forms for the contact application""" - -from django import forms -from django.conf import settings -from django.template.loader import render_to_string -from django.contrib.sites.models import Site -from core.functions import send_mail - - -class ContactForm(forms.Form): - """Form used to contact the website admins""" - name = forms.CharField(label = "Your Name", max_length = 61, - widget = forms.TextInput(attrs = {'size' : 50 })) - email = forms.EmailField(label = "Your Email", - widget = forms.TextInput(attrs = {'size' : 50 })) - subject = forms.CharField(max_length = 64, - widget = forms.TextInput(attrs = {'size' : 50 })) - honeypot = forms.CharField(max_length = 64, required = False, - label = 'If you enter anything in this field your message will be treated as spam') - message = forms.CharField(label = "Your Message", - widget = forms.Textarea(attrs = {'rows' : 16, 'cols' : 50}), - max_length = 3000) - - recipient_list = [mail_tuple[1] for mail_tuple in settings.MANAGERS] - - def clean_honeypot(self): - value = self.cleaned_data['honeypot'] - if value: - raise forms.ValidationError(self.fields['honeypot'].label) - return value - - def save(self): - # Send the feedback message email - - site = Site.objects.get_current() - - msg = render_to_string('contact/contact_email.txt', - { - 'site_name' : site.name, - 'user_name' : self.cleaned_data['name'], - 'user_email' : self.cleaned_data['email'], - 'message' : self.cleaned_data['message'], - }) - - subject = site.name + ' Feedback: ' + self.cleaned_data['subject'] - send_mail(subject, msg, self.cleaned_data['email'], self.recipient_list) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contact/urls.py --- a/gpp/contact/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,7 +0,0 @@ -"""urls for the contact application""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('contact.views', - url(r'^$', 
'contact_form', name='contact-form'), - (r'^thanks/$', 'contact_thanks'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contact/views.py --- a/gpp/contact/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,33 +0,0 @@ -# Create your views here. - -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.http import HttpResponseRedirect -from django.core.urlresolvers import reverse - -from contact.forms import ContactForm -from core.functions import get_full_name - - -def contact_form(request): - if request.method == 'POST': - form = ContactForm(request.POST) - if form.is_valid(): - form.save() - return HttpResponseRedirect(reverse('contact.views.contact_thanks')) - else: - initial_data = {} - if request.user.is_authenticated(): - name = get_full_name(request.user) - initial_data = {'name' : name, 'email' : request.user.email} - - form = ContactForm(initial = initial_data) - - return render_to_response('contact/contact_form.html', - {'form' : form}, - context_instance = RequestContext(request)) - - -def contact_thanks(request): - return render_to_response('contact/contact_thanks.html', - context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/admin.py --- a/gpp/contests/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,50 +0,0 @@ -""" -Admin definitions for the contest application. - -""" -from django.contrib import admin -from django.conf import settings - -from contests.models import Contest - - -class ContestAdmin(admin.ModelAdmin): - list_display = ['title', 'is_public', 'creation_date', 'end_date', - 'contestant_count', 'winner'] - list_editable = ['is_public'] - date_hierarchy = 'creation_date' - search_fields = ['title', 'description'] - prepopulated_fields = {'slug': ['title']} - raw_id_fields = ['winner', 'contestants'] - actions = ['pick_winner'] - - class Media: - js = (['js/contests/contests_admin.js'] + - settings.GPP_THIRD_PARTY_JS['tiny_mce']) - - def contestant_count(self, obj): - return obj.contestants.count() - contestant_count.short_description = '# Entries' - - def pick_winner(self, request, qs): - """ - Picks a winner on the contests selected by the admin. Note that for - safety reasons, we only update those contests that don't have winners - already. - - """ - count = 0 - for contest in qs: - if not contest.winner: - contest.pick_winner() - contest.save() - count += 1 - - self.message_user(request, "%d of %d winners picked" % (count, - qs.count())) - - pick_winner.short_description = "Pick winners for selected contests" - - - -admin.site.register(Contest, ContestAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/models.py --- a/gpp/contests/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,93 +0,0 @@ -""" -Models for the contest application. - -""" -import random -import datetime - -from django.db import models -from django.contrib.auth.models import User - - -class PublicContestManager(models.Manager): - """ - The manager for all public contests. - - """ - def get_query_set(self): - return super(PublicContestManager, self).get_query_set().filter(is_public=True) - - -class Contest(models.Model): - """ - A model to represent contests where users sign up to win something. 
- - """ - title = models.CharField(max_length=64) - slug = models.SlugField(max_length=64) - description = models.TextField() - is_public = models.BooleanField(db_index=True) - creation_date = models.DateTimeField(blank=True) - end_date = models.DateTimeField() - contestants = models.ManyToManyField(User, related_name='contests', - null=True, blank=True) - winner = models.ForeignKey(User, null=True, blank=True, - related_name='winning_contests') - win_date = models.DateTimeField(null=True, blank=True) - meta_description = models.TextField() - - objects = models.Manager() - public_objects = PublicContestManager() - - class Meta: - ordering = ['-creation_date'] - - def __unicode__(self): - return self.title - - @models.permalink - def get_absolute_url(self): - return ('contests-contest', [], {'slug': self.slug}) - - def save(self, *args, **kwargs): - if not self.pk and not self.creation_date: - self.creation_date = datetime.datetime.now() - - super(Contest, self).save(*args, **kwargs) - - def is_active(self): - """ - Returns True if the contest is still active. - - """ - now = datetime.datetime.now() - return self.creation_date <= now < self.end_date - - def can_enter(self): - """ - Returns True if the contest is still active and does not have a winner. - - """ - return not self.winner and self.is_active() - - def pick_winner(self): - """ - This function randomly picks a winner from all the contestants. - - """ - user_ids = self.contestants.values_list('id', flat=True) - winner_id = random.choice(user_ids) - self.winner = User.objects.get(id=winner_id) - self.win_date = datetime.datetime.now() - - def ogp_tags(self): - """ - Returns a dict of Open Graph Protocol meta tags. - - """ - return { - 'og:title': self.title, - 'og:type': 'article', - 'og:url': self.get_absolute_url(), - 'og:description': self.meta_description, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/static/js/contests/contests.js --- a/gpp/contests/static/js/contests/contests.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,30 +0,0 @@ -$(function() { - var $button = $('#contest-button'); - $button.click(function() { - var buttonLabel = $button.text(); - $button.attr('disabled', 'disabled').val('Please wait...'); - - $.ajax({ - url: '/contests/enter/', - type: 'POST', - data: { - contest_id : contest_id - }, - dataType: 'json', - success: function (data, textStatus) { - var classname = data.entered ? 'success' : 'info'; - var $p = $('#contest-entry'); - $p.hide(); - $p.addClass(classname); - $p.html(data.msg); - $p.fadeIn(3000); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. 
' + xhr.statusText + ' - ' + - xhr.responseText); - $button.removeAttr('disabled').text(buttonLabel); - } - }); - return false; - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/static/js/contests/contests_admin.js --- a/gpp/contests/static/js/contests/contests_admin.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,3 +0,0 @@ -django.jQuery(document).ready(function() { - django.jQuery('#id_meta_description').addClass('mceNoEditor'); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/tests/__init__.py --- a/gpp/contests/tests/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,2 +0,0 @@ -from model_tests import * -from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/tests/model_tests.py --- a/gpp/contests/tests/model_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,166 +0,0 @@ -""" -Model tests for the contests application. - -""" -import datetime - -from django.test import TestCase -from django.contrib.auth.models import User - -from contests.models import Contest - - -class ContestTestCase(TestCase): - - def test_creation_date(self): - - c = Contest(title='test', - slug='test', - description='test', - is_public=False, - end_date=datetime.datetime.now() + datetime.timedelta(days=30)) - - c.save() - - self.assertTrue(c.creation_date) - self.assertTrue(datetime.datetime.now() - c.creation_date < - datetime.timedelta(seconds=1)) - - def test_is_active(self): - - now = datetime.datetime.now() - start = now + datetime.timedelta(days=7) - end = start + datetime.timedelta(days=30) - - c = Contest(title='test', - slug='test', - description='test', - is_public=False, - creation_date=start, - end_date=end) - - self.failIf(c.is_active()) - - start = now - datetime.timedelta(days=7) - end = start + datetime.timedelta(days=30) - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end) - - self.assertTrue(c.is_active()) - - start = now - datetime.timedelta(days=7) - end = start - datetime.timedelta(days=3) - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end) - - self.failIf(c.is_active()) - - def test_can_enter(self): - - now = datetime.datetime.now() - start = now + datetime.timedelta(days=7) - end = start + datetime.timedelta(days=30) - - c = Contest(title='test', - slug='test', - description='test', - is_public=False, - creation_date=start, - end_date=end) - - self.failIf(c.can_enter()) - - start = now - datetime.timedelta(days=7) - end = start + datetime.timedelta(days=30) - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end) - - self.assertTrue(c.can_enter()) - - start = now - datetime.timedelta(days=7) - end = start - datetime.timedelta(days=3) - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end) - - self.failIf(c.can_enter()) - - start = now - datetime.timedelta(days=7) - end = start + datetime.timedelta(days=30) - - user = User.objects.create_user('test_user', '', 'password') - user.save() - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end, - winner=user, - win_date=now) - - self.failIf(c.can_enter()) - - start = now - datetime.timedelta(days=7) - end = start - datetime.timedelta(days=3) - - c = 
Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end, - winner=user, - win_date=end + datetime.timedelta(days=1)) - - self.failIf(c.can_enter()) - - def test_pick_winner(self): - - now = datetime.datetime.now() - start = now - datetime.timedelta(days=7) - end = start - datetime.timedelta(days=3) - - c = Contest(title='test', - slug='test', - description='test', - is_public=False, - creation_date=start, - end_date=end) - c.save() - - user1 = User.objects.create_user('test_user1', '', 'password') - user1.save() - user2 = User.objects.create_user('test_user2', '', 'password') - user2.save() - user3 = User.objects.create_user('test_user3', '', 'password') - user3.save() - - c.contestants.add(user1, user2, user3) - - c.pick_winner() - - self.assertTrue(datetime.datetime.now() - c.win_date < - datetime.timedelta(seconds=1)) - self.assertTrue(c.winner.id in [user1.id, user2.id, user3.id]) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/tests/view_tests.py --- a/gpp/contests/tests/view_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,123 +0,0 @@ -""" -View tests for the contests application. - -""" -import datetime -from django.test import TestCase -from django.contrib.auth.models import User -from django.core.urlresolvers import reverse -from django.utils import simplejson - -from contests.models import Contest - - -class NoConstestsTestCase(TestCase): - - def test_no_contests(self): - response = self.client.get(reverse('contests-index')) - self.assertEqual(response.status_code, 200) - - url = reverse('contests-contest', kwargs={'slug': 'test'}) - response = self.client.get(url) - self.assertEqual(response.status_code, 404) - - -class ConstestsTestCase(TestCase): - - def setUp(self): - now = datetime.datetime.now() - start = now - datetime.timedelta(days=7) - end = start - datetime.timedelta(days=3) - - user = User.objects.create_user('test_user', '', 'password') - user.save() - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end, - winner=user, - win_date=end + datetime.timedelta(days=1)) - c.save() - self.contest_id = c.id - - def test_contests(self): - response = self.client.get(reverse('contests-index')) - self.assertEqual(response.status_code, 200) - - url = reverse('contests-contest', kwargs={'slug': 'test'}) - response = self.client.get(url) - self.assertEqual(response.status_code, 200) - - -class ContestEntryTestCase(TestCase): - - def setUp(self): - self.username = 'test_user' - self.pw = 'password' - self.user = User.objects.create_user(self.username, '', self.pw) - self.user.save() - self.assertTrue(self.client.login(username=self.username, - password=self.pw)) - - now = datetime.datetime.now() - start = now - datetime.timedelta(days=7) - end = now + datetime.timedelta(days=3) - - c = Contest(title='test', - slug='test', - description='test', - is_public=True, - creation_date=start, - end_date=end) - c.save() - self.contest_id = c.id - - def test_entry_toggle(self): - response = self.client.post(reverse('contests-enter'), - {'contest_id': self.contest_id}, - HTTP_X_REQUESTED_WITH='XMLHttpRequest') - self.assertEqual(response.status_code, 200) - - json = simplejson.loads(response.content) - self.assertTrue(json['entered']) - - contest = Contest.objects.get(pk=self.contest_id) - self.assertTrue(self.user in contest.contestants.all()) - - response = self.client.post(reverse('contests-enter'), - {'contest_id': 
self.contest_id}, - HTTP_X_REQUESTED_WITH='XMLHttpRequest') - self.assertEqual(response.status_code, 200) - - json = simplejson.loads(response.content) - self.failIf(json['entered']) - - contest = Contest.objects.get(pk=self.contest_id) - self.failIf(self.user in contest.contestants.all()) - - -class NoPublicConstestsTestCase(TestCase): - - def setUp(self): - now = datetime.datetime.now() - start = now - datetime.timedelta(days=7) - end = start - datetime.timedelta(days=3) - - c = Contest(title='test', - slug='test', - description='test', - is_public=False, - creation_date=start, - end_date=end) - c.save() - - def test_contests(self): - response = self.client.get(reverse('contests-index')) - self.assertEqual(response.status_code, 200) - - url = reverse('contests-contest', kwargs={'slug': 'test'}) - response = self.client.get(url) - self.assertEqual(response.status_code, 404) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/urls.py --- a/gpp/contests/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,27 +0,0 @@ -""" -Url patterns for the contests application. - -""" -from django.conf.urls import patterns, url -from django.views.generic import DetailView, ListView - -from contests.models import Contest - - -urlpatterns = patterns('', - url(r'^$', - ListView.as_view( - context_object_name='contests', - queryset=Contest.public_objects.select_related('winner')), - name='contests-index'), - - url(r'^enter/$', - 'contests.views.enter', - name='contests-enter'), - - url(r'^c/(?P[\w-]+)/$', - DetailView.as_view( - context_object_name='contest', - queryset=Contest.public_objects.all().select_related('winner')), - name='contests-contest'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/contests/views.py --- a/gpp/contests/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,46 +0,0 @@ -""" -Views for the contests application. - -""" -from django.http import (HttpResponse, HttpResponseForbidden, - HttpResponseBadRequest) -from django.shortcuts import get_object_or_404 -from django.utils import simplejson -from django.views.decorators.http import require_POST - -from contests.models import Contest - - -@require_POST -def enter(request): - """ - This view is an AJAX view that is used to enter or withdraw a user from a - given contest. This function toggles the user's entered state in the - contest. - - """ - if not request.user.is_authenticated(): - return HttpResponseForbidden("Please login first") - - contest_id = request.POST.get('contest_id') - if not contest_id: - return HttpResponseBadRequest("Missing contest_id") - - contest = get_object_or_404(Contest, pk=contest_id) - if not contest.can_enter(): - return HttpResponseForbidden("Contest is over") - - # Toggle the user's state in the contest - - result = {} - if request.user in contest.contestants.all(): - contest.contestants.remove(request.user) - result['entered'] = False - result['msg'] = 'You have been withdrawn from this contest.' - else: - contest.contestants.add(request.user) - result['entered'] = True - result['msg'] = 'You have been entered into this contest!' 
- - json = simplejson.dumps(result) - return HttpResponse(json, content_type='application/json') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/admin.py --- a/gpp/core/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,12 +0,0 @@ -from django.contrib import admin -from django.contrib.flatpages.models import FlatPage -from django.contrib.flatpages.admin import FlatPageAdmin as FlatPageAdminOld -from django.conf import settings - -class FlatPageAdmin(FlatPageAdminOld): - class Media: - js = settings.GPP_THIRD_PARTY_JS['tiny_mce'] - -# We have to unregister it, and then reregister -admin.site.unregister(FlatPage) -admin.site.register(FlatPage, FlatPageAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/fixtures/flatpages.json --- a/gpp/core/fixtures/flatpages.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,62 +0,0 @@ -[ - { - "pk": 1, - "model": "flatpages.flatpage", - "fields": { - "registration_required": false, - "title": "About SurfGuitar101.com", - "url": "/about/", - "template_name": "", - "sites": [ - 1 - ], - "content": "

SurfGuitar101.com is the premier place on the web for friends and fans of the world-wide phenomenon known as surf music. Surf music was created in the early 1960's in Southern California by such bands as The Belairs, Dick Dale & His Deltones, and The Chantays, and popularized further by bands like The Ventures, The Astronauts, The Pyramids, & The Lively Ones. Surf music was all but forgotten when The Beatles and the British Invasion landed in America in the mid to late 1960's. In the late 70's and early 1980's a revival began when bands like Jon & The Nightriders, The Surf Raiders, and The Halibuts heard the call of the surf and reintroduced it to hungry audiences. This revival continues today and has spread world-wide. Today you can find surf bands not only in California, but all across America, Europe, Australia, Central and South America, and Japan.

Join us in our forums to discuss this great form of popular music. Discover great bands old and new. Check out our podcasts as we highlight the classic surf bands and the bands of today. Meet new friends and learn about the next surf show in your town. Exchange tips on playing and performing surf music and even starting your own band!

Thanks for being part of the greatest online community dedicated to surf music!

A Short History of SurfGuitar101.com

This site started as a Yahoo Group in late October, 2001. There were several other surf music Yahoo groups at the time, so we started our focus on the musician aspect of playing surf music (hence the "guitar 101"). After a short time we dropped that angle and fully embraced all aspects of surf music.

After seeing The Surf Coasters (Japan) on their first US tour in the summer of 2004, we needed a place to host our many photos and videos. The domain name surfguitar101.com was registered, and a simple static website was created to host media files as a supplement to the Yahoo group.

Cramped by the confines of the Yahoo Group, in February of 2006 we launched an interactive version of the website, complete with our now famous forums. This format was kept until February, 2011 when the website software was rewritten and a new look was designed.

The SG101 community held its first annual convention weekend in 2008 in Southern California, a tradition that continues today. Every year our members get together for a surf music packed weekend, and each year has been bigger and better than the last. In 2010, Germany's The Space Rangers and Italy's (via Antigua) Wadadli Riders were the first non-US bands to play at the convention. Fans of surf music get to see, hear, and mingle with musicians from the original 60's bands as well as the up and coming bands of today.

Surf's Up!

", - "enable_comments": false - } - }, - { - "pk": 4, - "model": "flatpages.flatpage", - "fields": { - "registration_required": false, - "title": "Colophon", - "url": "/colophon/", - "template_name": "", - "sites": [ - 1 - ], - "content": "

SurfGuitar101.com was created by Brian Neal. The server-side code is written in the Python programming language using the awesome Django Web framework. Client-side coding was done in Javascript, making heavy use of the jQuery and jQuery UI libraries.

The site design was created by Ken Dushane of the band The Crashmatics. Various icons and graphics were contributed by Ariel (DreadInBabylon), Ferenc Dobronyi, and Joseph Koch. Additional icons courtesy of FamFamFam.

The following 3rd party libraries were leveraged in the construction of this site: MySQLdb, python-markdown, PIL, pytz, django-tagging, django-elsewhere, gdata-python-client, python-memcached, html5lib, tinymce, markItUp!, Haystack, xapian-haystack, Blueprint, jQuery Cycle, JEditable, & repoze.timeago.

The site runs on an infrastructure powered by many open-source tools: the Apache server with mod_wsgi, a MySQL database, the Xapian search engine library, and memcached. The server is running Ubuntu, an operating system based upon the Debian GNU / Linux distribution.

Special thanks to Abraham Aguilar and Brian Fady for providing useful feedback and testing.

", - "enable_comments": false - } - }, - { - "pk": 3, - "model": "flatpages.flatpage", - "fields": { - "registration_required": false, - "title": "SurfGuitar101.com Privacy Policy", - "url": "/policy/privacy/", - "template_name": "", - "sites": [ - 1 - ], - "content": "

SurfGuitar101.com is committed to ensuring the privacy of its readers and registered members and wants you to fully understand our terms and conditions. This privacy statement describes how any personal, and anonymous, information is collected and managed and how you can request changes to any sharing of this information that may occur.

Statistical Reports

SurfGuitar101.com's servers automatically recognize a visitor's IP address and domain name. These items do not reveal any personal information about the visitor. The information is used solely to compile statistics that enable us to examine page impression levels and numbers of unique users visiting our Web sites. This information helps us to understand the areas of our sites that people visit in order to deliver more effective content.

Cookies

Like most other Web sites, SurfGuitar101.com uses cookies. Cookies are small data files that some Web sites write to your hard drive when you visit them. A cookie file can contain information such as a user ID that the site uses to track the pages you've visited. Cookies do not tell us who you are unless you've specifically given us personally identifiable information. A cookie can't read data off your hard drive or read cookie files created by other sites.

SurfGuitar101.com uses cookies to allow automatic logins to improve your experience with our sites. For example, we may use a cookie to identify our site members so they don't have to re-enter a user id and password when they sign-in. Cookies can also be used to help us to better understand how visitors interact with our sites leading to the delivery of more relevant content. Cookies may be created directly by our sites for these purposes, or by third-party companies operating on our behalf. If you choose to become a member of SurfGuitar101.com, you must have cookies enabled to access the member related pages (i.e. Discussion Boards and Member Profile pages).

Most web browsers automatically accept cookies but allow you to modify security settings so you can approve or reject cookies on a case-by-case basis.

Pixel Tags

SurfGuitar101.com does not currently use pixel tags, also known as beacons, web bugs or clear gifs.

Online Ad Serving

SurfGuitar101.com does not currently use third-party advertising companies to serve advertisements.

Newsletters / Mailing Lists

Through the registration process for SurfGuitar101.com, we request some personal information such as your e-mail address, company information, your name, job title, etc. We will never give your personal information to any third party vendor without your prior consent. We currently do not make our email and postal lists available to any third-party.

SurfGuitar101.com Email Announcements

At this time we do not send mass e-mails to make site-wide announcements.

Necessary Disclosure

The necessary disclosure of any of the above information to third parties will be governed by the following principles:

  1. Where SurfGuitar101.com is required to do so by law and any order of the court.
  2. Where it is necessary to identify anyone who may be violating the rights of others or the law in general.
  3. Where SurfGuitar101.com intends to co-operate with the investigation of any alleged unlawful activities without being required to by virtue of any court order or other legal requirement.
  4. Where it is necessary to protect the rights of SurfGuitar101.com.

Security

We use all reasonable precautions to securely maintain all information given to us by our registered members and we are not responsible for any breach of the reasonable security measures installed to protect the said information. We are not responsible for the privacy policies of any site linked to, or from, SurfGuitar101.com.

Opt Out Policy

SurfGuitar101.com gives users options whenever necessary, and practical. Such choices include: Opting not to receive our electronic messages, opting not to provide certain optional personal information when registering for an account.

Transfer of Information

SurfGuitar101.com reserves the right to transfer any information accumulated as described above in the event of the sale of part or all of SurfGuitar101.com assets and/or stock. By visiting our Web sites and by registering you consent to the collection and use of information in the manner herein described.

Privacy Policy Changes

This Privacy Policy may be modified from time to time. Any modifications to our Privacy Policy will be reflected on this page. If there is a significant change, we will indicate it on our sites and provide a link to the new policy.

", - "enable_comments": false - } - }, - { - "pk": 2, - "model": "flatpages.flatpage", - "fields": { - "registration_required": false, - "title": "SurfGuitar101.com Terms of Service", - "url": "/policy/tos/", - "template_name": "", - "sites": [ - 1 - ], - "content": "
Your use of our Internet sites is subject to these Terms of Service ("Terms"). We may modify these Terms at any time without notice to you by posting revised Terms on our sites. Your use of our sites constitutes your binding acceptance of these Terms, including any modifications that we make.

Content on Our Sites

Our sites include a combination of content that we create and that our users create. You are solely responsible for all materials, whether publicly posted or privately transmitted, that you upload, post, email, transmit or otherwise make available on our sites ("Your Content"). You certify that you own all intellectual property rights in Your Content. You hereby grant us, our affiliates and our partners a worldwide, irrevocable, royalty-free, nonexclusive, sublicensable license to use, reproduce, create derivative works of, distribute, publicly perform, publicly display, transfer, transmit, distribute and publish Your Content and subsequent versions of Your Content for the purposes of (i) displaying Your Content on our sites, (ii) distributing Your Content, either electronically or via other media, to users seeking to download or otherwise acquire it, and/or (iii) storing Your Content in a remote database accessible by end users. This license shall apply to the distribution and the storage of Your Content in any form, medium, or technology now known or later developed.

Your Conduct on Our Sites

You agree not to post or transmit material that is knowingly false and/or defamatory, misleading, inaccurate, abusive, vulgar, hateful, harassing, obscene, profane, sexually oriented, threatening or invasive of a person's privacy; that otherwise violates any law; or that encourages conduct constituting a criminal offense.

User Agreement for SurfGuitar101.com Forums

This message forum, and other user contributed/comment areas ("Forums") are provided as a service to members of our community. By using or participating on the Forums, you agree to this User Agreement including but not limited to the Rules of Conduct and the Terms of Service stated below. For purposes of this agreement, "User" refers to any individual posting on or otherwise using the Forums and SG101 refers to the owners and staff of SurfGuitar101.com and their authorized representatives.

SG101 reserves the right to change the Rules of Conduct, Terms of Service and all other parts of this User Agreement at its sole discretion and without notice.

As a standard operating procedure, SG101 does not enter into correspondence, discussions or other communication, either public or private, about SG101 policies, individual moderators, enforcement or application of the User Agreement, bans or other sanctions, etc.

RULES OF CONDUCT

User agrees not to post material that is knowingly false and/or defamatory, misleading, inaccurate, abusive, vulgar, hateful, harassing, obscene, profane, sexually oriented, threatening, invasive of a person's privacy, that otherwise violates any law, or that encourages conduct constituting a criminal offense.

User agrees not to post any material that is protected by copyright, trademark or other proprietary right without the express permission of the owner(s) of said copyright, trademark or other proprietary right.

User agrees not to use nicknames that might be deemed abusive, vulgar, hateful, harassing, obscene, profane, sexually oriented, threatening, invasive of a person's privacy, or otherwise inappropriate. User agrees not to use nicknames that might mislead other Users. This includes but is not limited to using nicknames that impersonate developers, staff, or other Users, or other individuals outside of SG101.

TERMS OF SERVICE

User acknowledges and agrees that use of the SG101 is a privilege, not a right, and that SG101 has the right, at its sole discretion, to revoke this privilege at any time without notice or reason. User agrees that this Agreement in its entirety applies to both public and private messages.

The goal of the Forums is to foster communication and the interchange of ideas within the User community. User agrees and acknowledges that any posts, nicknames or other material deemed offensive, harassing, baiting or otherwise inappropriate may be removed at the sole discretion of SG101.

User authorizes SG101 to make use of any original stories, concepts, ideas, drawings, photographs, opinions and other creative materials posted on the Forums without compensation or other recourse. User also agrees to indemnify and hold harmless SG101 and our agents with respect to any claims based upon or arising from the transmission and/or content of your message(s).

SG101 has the right but not the obligation to monitor and/or moderate the Forums, and offers no assurances in this regard.

SG101 is not responsible for messages posted on the Forums or the content therein. We do not vouch for or warrant the accuracy, completeness or usefulness of any message. Each message expresses the views of its originating User, not necessarily those of SG101. Unless expressly stated otherwise by a senior SG101 representative, this includes messages posted by SG101 personnel, agents, delegates, representatives et al.

Any User who feels that a posted message is objectionable is encouraged to contact us. We have the ability to remove messages and we will make every effort to do so within a reasonable time if we determine that removal is necessary. This is a manual process, however, so please realize that we may not be able to act immediately. Removal of messages is at the sole discretion of SG101.

The appropriate individual to contact is usually the editor of the site associated with the board where the message in question is to be found. As a standard operating procedure, SG101 does not enter into discussions, either public or private, about Forum policies, individual moderators, bans or other sanctions, etc.

SG101 reserves the right to reveal the identity of and/or whatever information we know about any User in the event of a complaint or legal action arising from any message posted by said User.

Advertisements, chain letters, pyramid schemes and other commercial solicitations are inappropriate on the Forums.

SG101 does not permit children under the age of 13 to become members, post home pages or web sites on our service.

SG101 is not responsible for the content posted by SG101 members or visitors on any area of our site including without limitation. The opinions and views expressed by SG101's members or visitors do not necessarily represent those of SG101 and SG101 does not verify, endorse, or vouch for the content of such opinions or views. Further, SG101 is not responsible for the delivery or quality of any goods or services sold or advertised through or on SG101 members' page(s). If you believe that any of the content posted by our members or visitors violates your proprietary rights, including copyrights, please contact us.

You are solely and fully responsible for any content that you post on any area of our site. We do not regularly review the contents of materials posted by our members or other visitors to our site. We strictly prohibit the posting of the following types of content on all areas of our sites:

  • nudity, pornography, and sexual material of a lewd, lecherous or obscene nature and intent or that violates local, state and national laws.
  • any material that violates or infringes in any way upon the proprietary rights of others, including, without limitation, copyright or trademark rights; this includes "WAREZ" (copyrighted software that is distributed illegally), "mp3" files of copyrighted music, copyrighted photographs, text, video or artwork. If you don't own the copyright or have express authorization and documented permission to use it, don't put it on SG101 (if you do have express permission you must say so clearly). SG101 will terminate the memberships of, and remove the pages of, repeat infringers.
  • any material that is threatening, abusive, harassing, defamatory, invasive of privacy or publicity rights, vulgar, obscene, profane, indecent, or otherwise objectionable; including posting other peoples' private information.
  • content that promotes, encourages, or provides instructional information about illegal activities - specifically hacking, cracking, or phreaking.
  • any software, information, or other material that contains a virus, "Trojan Horse", "worm", corrupted data, or any other harmful or damaging component;
  • hate propaganda or hate mongering, swearing, or fraudulent material or activity;

By submitting your data to SG101, you represent that the data complies with SG101's Terms of Service. If any third party brings a claim, lawsuit or other proceeding against SG101 based on your conduct or use of SG101 services, you agree to compensate SG101 (including its officers, directors, employees and agents) for any and all losses, liabilities, damages or expenses, including attorney's fees, incurred by SG101 in connection with any such claim, lawsuit or proceeding.

SG101 is the final arbiter of what IS and IS NOT allowed on our site. Further, SG101 reserves the right to modify or remove anything submitted to SG101, and to cancel any membership, at any time for any reason without prior notice. SG101 is not obliged to maintain backup copies of any material submitted or posted on our site. Actions or activities that may cause termination of your membership and/or removal of your page(s) include, but are not limited to:

  • posting or providing links to any content which violates our Terms of Service;
  • conducting or providing links to any raffle, contest, or game which violates any local, state or national laws;
  • using in the registration of your SG101 membership an email account that is not your own or that is or becomes inactive.
  • violating the SG101 Terms of Service. Please read and familiarize yourself with the SG101 Terms of Service.
  • sending unsolicited email using a SG101 address
  • reproducing, distributing, republishing or retransmitting material posted by other SG101 members without the prior permission of such members.

We reserve the right to monitor, and to investigate any complaints regarding any content of SG101 members' pages, message-board postings, and to take appropriate action if SG101 finds violations of these Terms of Service. In the case of any such complaint, SG101 reserves the right to remove the content complained of while the SG101 member and the complaining party attempt to resolve their dispute. This could result in your post(s) being removed from SG101 for as long as it takes to resolve the dispute.

You grant to SG101 and its affiliates a royalty-free, perpetual, irrevocable, nonexclusive, worldwide, unrestricted license to use, copy, modify, transmit, distribute, and publicly perform or display the submitted pages or other content for the purposes of displaying such information on SG101's sites and for the promotion and marketing of SG101's services.

MISC.

SG101 makes no guarantee of availability of service and reserves the right to change, withdraw, suspend, or discontinue any functionality or feature of the SG101 service. IN NO EVENT WILL SG101 BE LIABLE FOR ANY DAMAGES, INCLUDING, WITHOUT LIMITATION, DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL, OR PUNITIVE DAMAGES ARISING OUT OF THE USE OF OR INABILITY TO USE SG101'S SERVICES OR ANY CONTENT THEREON FOR ANY REASON INCLUDING, WITHOUT LIMITATION, SG101'S REMOVAL OR DELETION OF ANY MATERIALS OR RECORDS SUBMITTED OR POSTED ON SG101'S SITE FOR ANY REASON. THIS DISCLAIMER APPLIES, WITHOUT LIMITATION, TO ANY DAMAGES OR INJURY, WHETHER FOR BREACH OF CONTRACT, TORT, OR OTHERWISE, CAUSED; ANY FAILURE OF PERFORMANCE; ERROR; OMISSION; INTERRUPTION; DELETION; DEFECT; DELAY IN OPERATION OR TRANSMISSION; COMPUTER VIRUS; FILE CORRUPTION; COMMUNICATION-LINE FAILURE; NETWORK OR SYSTEM OUTAGE; OR THEFT, DESTRUCTION, UNAUTHORIZED ACCESS TO, ALTERATION OF, OR USE OF ANY RECORD.

SG101 reserves the right to change or amend these Terms of Service at any time without prior notice. By registering and/or submitting any content, including without limitation, message-board postings, you signify your agreement to these Terms of Service.

", - "enable_comments": false - } - } -] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/functions.py --- a/gpp/core/functions.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,117 +0,0 @@ -"""This file houses various core utility functions for GPP""" -import datetime -import re -import logging - -from django.contrib.sites.models import Site -from django.conf import settings -import django.core.mail - -import core.tasks - - -def send_mail(subject, message, from_email, recipient_list, defer=True, **kwargs): - """ - The main send email function. Use this function to send email from the - site. All applications should use this function instead of calling - Django's directly. - If defer is True, the email will be sent to a Celery task to actually send - the email. Otherwise it is sent on the caller's thread. In any event, the - email will be logged at the DEBUG level. - - """ - # Guard against empty email addresses - recipient_list = [dest for dest in recipient_list if dest] - if not recipient_list: - logging.warning("Empty recipient_list in send_mail") - return - - logging.debug('EMAIL:\nFrom: %s\nTo: %s\nSubject: %s\nMessage:\n%s', - from_email, str(recipient_list), subject, message) - - if defer: - core.tasks.send_mail.delay(subject, message, from_email, recipient_list, - **kwargs) - else: - django.core.mail.send_mail(subject, message, from_email, recipient_list, - **kwargs) - - -def email_admins(subject, message): - """Emails the site admins. Goes through the site send_mail function.""" - site = Site.objects.get_current() - subject = '[%s] %s' % (site.name, subject) - send_mail(subject, - message, - '%s@%s' % (settings.GPP_NO_REPLY_EMAIL, site.domain), - [mail_tuple[1] for mail_tuple in settings.ADMINS]) - - -def email_managers(subject, message): - """Emails the site managers. Goes through the site send_mail function.""" - site = Site.objects.get_current() - subject = '[%s] %s' % (site.name, subject) - send_mail(subject, - msg, - '%s@%s' % (settings.GPP_NO_REPLY_EMAIL, site.domain), - [mail_tuple[1] for mail_tuple in settings.MANAGERS]) - - -def get_full_name(user): - """Returns the user's full name if available, otherwise falls back - to the username.""" - full_name = user.get_full_name() - if full_name: - return full_name - return user.username - - -BASE_YEAR = 2010 - -def copyright_str(): - curr_year = datetime.datetime.now().year - if curr_year == BASE_YEAR: - year_range = str(BASE_YEAR) - else: - year_range = "%d - %d" % (BASE_YEAR, curr_year) - - return 'Copyright (C) %s, SurfGuitar101.com' % year_range - - -IP_PAT = re.compile('(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})') - -def get_ip(request): - """Returns the IP from the request or None if it cannot be retrieved.""" - ip = request.META.get('HTTP_X_FORWARDED_FOR', - request.META.get('REMOTE_ADDR')) - - if ip: - match = IP_PAT.match(ip) - ip = match.group(1) if match else None - - return ip - - -def get_page(qdict): - """Attempts to retrieve the value for "page" from the given query dict and - return it as an integer. If the key cannot be found or converted to an - integer, 1 is returned. - """ - n = qdict.get('page', 1) - try: - n = int(n) - except ValueError: - n = 1 - return n - - -def quote_message(who, message): - """ - Builds a message reply by quoting the existing message in a - typical email-like fashion. The quoting is compatible with Markdown. 
- """ - msg = "> %s" % message.replace('\n', '\n> ') - if msg.endswith('\n> '): - msg = msg[:-2] - - return "*%s wrote:*\n\n%s\n\n" % (who, msg) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/html.py --- a/gpp/core/html.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,28 +0,0 @@ -import html5lib -from html5lib import sanitizer, treebuilders, treewalkers, serializer - -def sanitizer_factory(*args, **kwargs): - san = sanitizer.HTMLSanitizer(*args, **kwargs) - # This isn't available yet - # san.strip_tokens = True - return san - -def clean_html(buf): - """Cleans HTML of dangerous tags and content.""" - buf = buf.strip() - if not buf: - return buf - - p = html5lib.HTMLParser(tree=treebuilders.getTreeBuilder("dom"), - tokenizer=sanitizer_factory) - dom_tree = p.parseFragment(buf) - - walker = treewalkers.getTreeWalker("dom") - stream = walker(dom_tree) - - s = serializer.htmlserializer.HTMLSerializer( - omit_optional_tags=False, - quote_attr_values=True) - return s.render(stream) - -# vim: ts=4 sw=4 diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/image.py --- a/gpp/core/image.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,43 +0,0 @@ -""" -This file contains common utility functions for manipulating images for -the rest of the applications in the project. -""" -from PIL import ImageFile -from PIL import Image - - -def parse_image(file): - """ - Returns a PIL Image from the supplied Django file object. - Throws IOError if the file does not parse as an image file or some other - I/O error occurred. - - """ - parser = ImageFile.Parser() - for chunk in file.chunks(): - parser.feed(chunk) - image = parser.close() - return image - - -def downscale_image_square(image, size): - """ - Scale an image to the square dimensions given by size (in pixels). - The new image is returned. - If the image is already smaller than (size, size) then no scaling - is performed and the image is returned unchanged. - - """ - # don't upscale - if (size, size) >= image.size: - return image - - (w, h) = image.size - if w > h: - diff = (w - h) / 2 - image = image.crop((diff, 0, w - diff, h)) - elif h > w: - diff = (h - w) / 2 - image = image.crop((0, diff, w, h - diff)) - image = image.resize((size, size), Image.ANTIALIAS) - return image diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/management/commands/max_users.py --- a/gpp/core/management/commands/max_users.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,17 +0,0 @@ -""" -max_users is a custom manage.py command. -It is intended to be called from a cron job to calculate the maximum -number of users online statistic. -""" -import datetime - -from django.core.management.base import NoArgsCommand - -from core.whos_online import max_users - - -class Command(NoArgsCommand): - help = "Run periodically to compute the max users online statistic." - - def handle_noargs(self, **options): - max_users() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/markup.py --- a/gpp/core/markup.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,57 +0,0 @@ -""" -Markup related utitlities useful for the entire project. -""" -import markdown as _markdown -from django.utils.encoding import force_unicode - -from smiley import SmilifyMarkdown - -class Markdown(object): - """ - This is a thin wrapper around the Markdown class which deals with the - differences in Markdown versions on the production and development server. 
- This code was inspired by the code in - django/contrib/markup/templatetags/markup.py. - Currently, we only have to worry about Markdown 1.6b and 2.0. - """ - def __init__(self, safe_mode='escape'): - # Unicode support only in markdown v1.7 or above. Version_info - # exists only in markdown v1.6.2rc-2 or above. - self.unicode_support = getattr(_markdown, "version_info", None) >= (1, 7) - self.md = _markdown.Markdown(safe_mode=safe_mode, - extensions=['urlize', 'nl2br', 'del']) - - def convert(self, s): - if self.unicode_support: - return self.md.convert(force_unicode(s)) - else: - return force_unicode(self.md.convert(s)) - - -def markdown(s): - """ - A convenience function for one-off markdown jobs. - """ - md = Markdown() - return md.convert(s) - - -class SiteMarkup(object): - """ - This class provides site markup by combining markdown and - our own smiley markup. - """ - def __init__(self): - self.md = Markdown() - self.smiley = SmilifyMarkdown() - - def convert(self, s): - return self.md.convert(self.smiley.convert(s)) - - -def site_markup(s): - """ - Convenience function for one-off site markup jobs. - """ - sm = SiteMarkup() - return sm.convert(s) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/middleware.py --- a/gpp/core/middleware.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,72 +0,0 @@ -"""Common middleware for the entire project.""" -import datetime -import logging - -from django.db import IntegrityError -from django.contrib.auth import logout -from django.conf import settings - -from core.functions import get_ip -from core.whos_online import report_user, report_visitor - - -class InactiveUserMiddleware(object): - """ - This middleware ensures users with is_active set to False get their - session destroyed and are treated as logged out. - This middleware should come after the 'django.contrib.auth.middleware. - AuthenticationMiddleware' in settings.py. - Idea taken from: http://djangosnippets.org/snippets/1105/ - """ - - def process_view(self, request, view_func, view_args, view_kwargs): - if request.user.is_authenticated() and not request.user.is_active: - logout(request) - - -ONLINE_COOKIE = 'sg101_online' # online cookie name -ONLINE_TIMEOUT = 5 * 60 # online cookie lifetime in seconds - - -class WhosOnline(object): - """ - This middleware class keeps track of which registered users have - been seen recently, and the number of unique unregistered users. - This middleware should come after the authentication middleware, - as we count on the user attribute being attached to the request. - """ - - def process_response(self, request, response): - """ - Keep track of who is online. - """ - # Note that some requests may not have a user attribute - # as these may have been redirected in the middleware chain before - # the auth middleware got a chance to run. If this is the case, just - # bail out. We also ignore AJAX requests. - - if not hasattr(request, 'user') or request.is_ajax(): - return response - - if request.user.is_authenticated(): - if request.COOKIES.get(ONLINE_COOKIE) is None: - # report that we've seen the user - report_user(request.user.username) - - # set a cookie to expire - response.set_cookie(ONLINE_COOKIE, '1', max_age=ONLINE_TIMEOUT) - else: - if request.COOKIES.get(settings.CSRF_COOKIE_NAME) is not None: - # We have a non-authenticated user that has cookies enabled. This - # means we can track them. 
- if request.COOKIES.get(ONLINE_COOKIE) is None: - # see if we can get the IP address - ip = get_ip(request) - if ip: - # report that we've seen this visitor - report_visitor(ip) - - # set a cookie to expire - response.set_cookie(ONLINE_COOKIE, '1', max_age=ONLINE_TIMEOUT) - - return response diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/models.py --- a/gpp/core/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,24 +0,0 @@ -""" -This file contains the core Models used in gpp -""" -import datetime - -from django.db import models -from django.contrib.auth.models import User - - -class Statistic(models.Model): - """ - This model keeps track of site statistics. Currently, the only statistic - is the maximum number of users online. This stat is computed by a mgmt. - command that is run on a cron job to peek at the previous two models. - """ - max_users = models.IntegerField() - max_users_date = models.DateTimeField() - max_anon_users = models.IntegerField() - max_anon_users_date = models.DateTimeField() - - def __unicode__(self): - return u'%d users on %s' % (self.max_users, - self.max_users_date.strftime('%Y-%m-%d %H:%M:%S')) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/paginator.py --- a/gpp/core/paginator.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,286 +0,0 @@ -""" -Digg.com style paginator. -References: -http://www.djangosnippets.org/snippets/773/ -http://blog.elsdoerfer.name/2008/05/26/diggpaginator-update/ -http://blog.elsdoerfer.name/2008/03/06/yet-another-paginator-digg-style/ -""" -import math -from django.core.paginator import \ - Paginator, QuerySetPaginator, Page, InvalidPage - -__all__ = ( - 'InvalidPage', - 'ExPaginator', - 'DiggPaginator', - 'QuerySetDiggPaginator', -) - -class ExPaginator(Paginator): - """Adds a ``softlimit`` option to ``page()``. If True, querying a - page number larger than max. will not fail, but instead return the - last available page. - - This is useful when the data source can not provide an exact count - at all times (like some search engines), meaning the user could - possibly see links to invalid pages at some point which we wouldn't - want to fail as 404s. - - >>> items = range(1, 1000) - >>> paginator = ExPaginator(items, 10) - >>> paginator.page(1000) - Traceback (most recent call last): - InvalidPage: That page contains no results - >>> paginator.page(1000, softlimit=True) - - - # [bug] graceful handling of non-int args - >>> paginator.page("str") - Traceback (most recent call last): - InvalidPage: That page number is not an integer - """ - def _ensure_int(self, num, e): - # see Django #7307 - try: - return int(num) - except ValueError: - raise e - - def page(self, number, softlimit=False): - try: - return super(ExPaginator, self).page(number) - except InvalidPage, e: - number = self._ensure_int(number, e) - if number > self.num_pages and softlimit: - return self.page(self.num_pages, softlimit=False) - else: - raise e - -class DiggPaginator(ExPaginator): - """ - Based on Django's default paginator, it adds "Digg-style" page ranges - with a leading block of pages, an optional middle block, and another - block at the end of the page range. They are available as attributes - on the page: - - {# with: page = digg_paginator.page(1) #} - {% for num in page.leading_range %} ... - {% for num in page.main_range %} ... - {% for num in page.trailing_range %} ... 
- - Additionally, ``page_range`` contains a nun-numeric ``False`` element - for every transition between two ranges. - - {% for num in page.page_range %} - {% if not num %} ... {# literally output dots #} - {% else %}{{ num }} - {% endif %} - {% endfor %} - - Additional arguments passed to the constructor allow customization of - how those bocks are constructed: - - body=5, tail=2 - - [1] 2 3 4 5 ... 91 92 - |_________| |___| - body tail - |_____| - margin - - body=5, tail=2, padding=2 - - 1 2 ... 6 7 [8] 9 10 ... 91 92 - |_| |__| - ^padding^ - |_| |__________| |___| - tail body tail - - ``margin`` is the minimum number of pages required between two ranges; if - there are less, they are combined into one. - - When ``align_left`` is set to ``True``, the paginator operates in a - special mode that always skips the right tail, e.g. does not display the - end block unless necessary. This is useful for situations in which the - exact number of items/pages is not actually known. - - # odd body length - >>> print DiggPaginator(range(1,1000), 10, body=5).page(1) - 1 2 3 4 5 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5).page(100) - 1 2 ... 96 97 98 99 100 - - # even body length - >>> print DiggPaginator(range(1,1000), 10, body=6).page(1) - 1 2 3 4 5 6 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=6).page(100) - 1 2 ... 95 96 97 98 99 100 - - # leading range and main range are combined when close; note how - # we have varying body and padding values, and their effect. - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2).page(3) - 1 2 3 4 5 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=6, padding=2, margin=2).page(4) - 1 2 3 4 5 6 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2).page(6) - 1 2 3 4 5 6 7 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2).page(7) - 1 2 ... 5 6 7 8 9 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2).page(7) - 1 2 ... 5 6 7 8 9 ... 99 100 - - # the trailing range works the same - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2, ).page(98) - 1 2 ... 96 97 98 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=6, padding=2, margin=2, ).page(97) - 1 2 ... 95 96 97 98 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2, ).page(95) - 1 2 ... 94 95 96 97 98 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=2, margin=2, ).page(94) - 1 2 ... 92 93 94 95 96 ... 99 100 - >>> print DiggPaginator(range(1,1000), 10, body=5, padding=1, margin=2, ).page(94) - 1 2 ... 92 93 94 95 96 ... 99 100 - - # all three ranges may be combined as well - >>> print DiggPaginator(range(1,151), 10, body=6, padding=2).page(7) - 1 2 3 4 5 6 7 8 9 ... 14 15 - >>> print DiggPaginator(range(1,151), 10, body=6, padding=2).page(8) - 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 - >>> print DiggPaginator(range(1,151), 10, body=6, padding=1).page(8) - 1 2 3 4 5 6 7 8 9 ... 
14 15 - - # no leading or trailing ranges might be required if there are only - # a very small number of pages - >>> print DiggPaginator(range(1,80), 10, body=10).page(1) - 1 2 3 4 5 6 7 8 - >>> print DiggPaginator(range(1,80), 10, body=10).page(8) - 1 2 3 4 5 6 7 8 - >>> print DiggPaginator(range(1,12), 10, body=5).page(1) - 1 2 - - # test left align mode - >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(1) - 1 2 3 4 5 - >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(50) - 1 2 ... 48 49 50 51 52 - >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(97) - 1 2 ... 95 96 97 98 99 - >>> print DiggPaginator(range(1,1000), 10, body=5, align_left=True).page(100) - 1 2 ... 96 97 98 99 100 - - # padding: default value - >>> DiggPaginator(range(1,1000), 10, body=10).padding - 4 - - # padding: automatic reduction - >>> DiggPaginator(range(1,1000), 10, body=5).padding - 2 - >>> DiggPaginator(range(1,1000), 10, body=6).padding - 2 - - # padding: sanity check - >>> DiggPaginator(range(1,1000), 10, body=5, padding=3) - Traceback (most recent call last): - ValueError: padding too large for body (max 2) - """ - def __init__(self, *args, **kwargs): - self.body = kwargs.pop('body', 10) - self.tail = kwargs.pop('tail', 2) - self.align_left = kwargs.pop('align_left', False) - self.margin = kwargs.pop('margin', 4) # TODO: make the default relative to body? - # validate padding value - max_padding = int(math.ceil(self.body/2.0)-1) - self.padding = kwargs.pop('padding', min(4, max_padding)) - if self.padding > max_padding: - raise ValueError('padding too large for body (max %d)'%max_padding) - super(DiggPaginator, self).__init__(*args, **kwargs) - - def page(self, number, *args, **kwargs): - """Return a standard ``Page`` instance with custom, digg-specific - page ranges attached. - """ - - page = super(DiggPaginator, self).page(number, *args, **kwargs) - number = int(number) # we know this will work - - # easier access - num_pages, body, tail, padding, margin = \ - self.num_pages, self.body, self.tail, self.padding, self.margin - - # put active page in middle of main range - main_range = map(int, [ - math.floor(number-body/2.0)+1, # +1 = shift odd body to right - math.floor(number+body/2.0)]) - # adjust bounds - if main_range[0] < 1: - main_range = map(abs(main_range[0]-1).__add__, main_range) - if main_range[1] > num_pages: - main_range = map((num_pages-main_range[1]).__add__, main_range) - - # Determine leading and trailing ranges; if possible and appropriate, - # combine them with the main range, in which case the resulting main - # block might end up considerable larger than requested. While we - # can't guarantee the exact size in those cases, we can at least try - # to come as close as possible: we can reduce the other boundary to - # max padding, instead of using half the body size, which would - # otherwise be the case. If the padding is large enough, this will - # of course have no effect. - # Example: - # total pages=100, page=4, body=5, (default padding=2) - # 1 2 3 [4] 5 6 ... 99 100 - # total pages=100, page=4, body=5, padding=1 - # 1 2 3 [4] 5 ... 99 100 - # If it were not for this adjustment, both cases would result in the - # first output, regardless of the padding value. 
- if main_range[0] <= tail+margin: - leading = [] - main_range = [1, max(body, min(number+padding, main_range[1]))] - main_range[0] = 1 - else: - leading = range(1, tail+1) - # basically same for trailing range, but not in ``left_align`` mode - if self.align_left: - trailing = [] - else: - if main_range[1] >= num_pages-(tail+margin)+1: - trailing = [] - if not leading: - # ... but handle the special case of neither leading nor - # trailing ranges; otherwise, we would now modify the - # main range low bound, which we just set in the previous - # section, again. - main_range = [1, num_pages] - else: - main_range = [min(num_pages-body+1, max(number-padding, main_range[0])), num_pages] - else: - trailing = range(num_pages-tail+1, num_pages+1) - - # finally, normalize values that are out of bound; this basically - # fixes all the things the above code screwed up in the simple case - # of few enough pages where one range would suffice. - main_range = [max(main_range[0], 1), min(main_range[1], num_pages)] - - # make the result of our calculations available as custom ranges - # on the ``Page`` instance. - page.main_range = range(main_range[0], main_range[1]+1) - page.leading_range = leading - page.trailing_range = trailing - page.page_range = reduce(lambda x, y: x+((x and y) and [False])+y, - [page.leading_range, page.main_range, page.trailing_range]) - - page.__class__ = DiggPage - return page - -class DiggPage(Page): - def __str__(self): - return " ... ".join(filter(None, [ - " ".join(map(str, self.leading_range)), - " ".join(map(str, self.main_range)), - " ".join(map(str, self.trailing_range))])) - -class QuerySetDiggPaginator(DiggPaginator, QuerySetPaginator): - pass - -if __name__ == "__main__": - import doctest - doctest.testmod() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/services.py --- a/gpp/core/services.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,21 +0,0 @@ -""" -This module provides a common way for the various apps to integrate with services -that are installed at this site. - -""" -from django.conf import settings -import redis - -# Redis connection and database settings - -REDIS_HOST = getattr(settings, 'REDIS_HOST', 'localhost') -REDIS_PORT = getattr(settings, 'REDIS_PORT', 6379) -REDIS_DB = getattr(settings, 'REDIS_DB', 0) - - -def get_redis_connection(host=REDIS_HOST, port=REDIS_PORT, db=REDIS_DB): - """ - Create and return a Redis connection using the supplied parameters. - - """ - return redis.StrictRedis(host=host, port=port, db=db) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/tasks.py --- a/gpp/core/tasks.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,61 +0,0 @@ -""" -Celery tasks for the core application. - -""" -from celery.task import task -import django.core.mail - -import core.whos_online - - -@task -def add(x, y): - """ - It is useful to have a test task laying around. This is it. - - """ - return x + y - - -@task -def send_mail(subject, message, from_email, recipient_list, **kwargs): - """ - A task to send mail via Django. - - """ - django.core.mail.send_mail(subject, message, from_email, recipient_list, - **kwargs) - - -@task -def cleanup(): - """ - A task to perform site-wide cleanup actions. - - """ - # These imports, when placed at the top of the module, caused all kinds of - # import problems when running on the production server (Python 2.5 and - # mod_wsgi). Moving them here worked around that problem. 
- - from django.core.management.commands.cleanup import Command as CleanupCommand - from forums.management.commands.forum_cleanup import Command as ForumCleanup - - # Execute Django's cleanup command (deletes old sessions). - - command = CleanupCommand() - command.execute() - - # Execute our forum cleanup command to delete old last visit records. - - command = ForumCleanup() - command.execute() - - -@task -def max_users(): - """ - Run the periodic task to calculate the who's online max users/visitors - statistics. - - """ - core.whos_online.max_users() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/templatetags/core_tags.py --- a/gpp/core/templatetags/core_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,223 +0,0 @@ -""" -Miscellaneous/utility template tags. - -""" -import collections -import datetime -import urllib - -from django import template -from django.conf import settings -from django.core.cache import cache -from django.contrib.sites.models import Site - -import repoze.timeago - -from core.whos_online import get_users_online, get_visitors_online, get_stats -from bio.models import UserProfile - - -register = template.Library() - -ICON_PARAMS = { - True: (settings.STATIC_URL + 'icons/accept.png', 'Yes'), - False: (settings.STATIC_URL + 'icons/delete.png', 'No'), -} - -@register.simple_tag -def bool_icon(flag): - params = ICON_PARAMS[bool(flag)] - return u"""%s""" % ( - params[0], params[1], params[1]) - - -@register.inclusion_tag('core/comment_dialogs.html') -def comment_dialogs(): - return {'STATIC_URL': settings.STATIC_URL} - - -@register.inclusion_tag('core/max_users_tag.html') -def max_users(): - """ - Displays max users online information. - - """ - return { - 'stats': get_stats(), - } - -@register.inclusion_tag('core/whos_online_tag.html') -def whos_online(): - """ - Displays a list of who is online. - - """ - users = get_users_online() - users.sort(key=str.lower) - - visitors = get_visitors_online() - - return { - 'num_users': len(users), - 'users': users, - 'num_guests': len(visitors), - 'total': len(users) + len(visitors), - } - - -@register.inclusion_tag('core/social_sharing_tag.html') -def social_sharing(title, url): - """ - Displays social media sharing buttons. - - """ - site = Site.objects.get_current() - url = _fully_qualify(url, site.domain) - - return { - 'title': title, - 'url': url, - } - - -def _fully_qualify(url, domain): - """ - Returns a "fully qualified" URL by checking the given url. - If the url starts with '/' then http://domain is pre-pended - onto it. Otherwise the original URL is returned. - - """ - if url.startswith('/'): - url = "http://%s%s" % (domain, url) - return url - - -@register.inclusion_tag('core/open_graph_meta_tag.html') -def open_graph_meta_tags(item=None): - """ - Generates Open Graph meta tags by interrogating the given item. - To generate tags for the home page, set item to None. 
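# Editor's sketch (not part of this changeset): an item passed to the
# open_graph_meta_tags tag is expected to provide an ogp_tags() method that
# returns a dict of Open Graph properties. The model and field names below
# are invented for illustration only; relative og:url / og:image values are
# made absolute by the tag before rendering.
from django.db import models

class Story(models.Model):
    title = models.CharField(max_length=255)
    summary = models.TextField()

    def get_absolute_url(self):
        return '/news/story/%d/' % self.pk

    def ogp_tags(self):
        return {
            'og:title': self.title,
            'og:type': 'article',
            'og:url': self.get_absolute_url(),
            'og:description': self.summary,
        }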
- - """ - site = Site.objects.get_current() - - if item: - props = item.ogp_tags() - else: - props = { - 'og:title': site.name, - 'og:type': 'website', - 'og:url': 'http://%s' % site.domain, - 'og:description': settings.OGP_SITE_DESCRIPTION, - } - - props['og:site_name'] = site.name - props['fb:admins'] = settings.OGP_FB_ID - - if 'og:image' not in props: - props['og:image'] = settings.OGP_DEFAULT_IMAGE - - if 'og:url' in props: - props['og:url'] = _fully_qualify(props['og:url'], site.domain) - - if 'og:image' in props: - props['og:image'] = _fully_qualify(props['og:image'], site.domain) - - return {'props': props} - - -# A somewhat ugly hack until we decide if we should be using UTC time -# everywhere or not. -repoze.timeago._NOW = datetime.datetime.now - -@register.filter(name='elapsed') -def elapsed(timestamp): - """ - This filter accepts a datetime and computes an elapsed time from "now". - The elapsed time is displayed as a "humanized" string. - Examples: - 1 minute ago - 5 minutes ago - 1 hour ago - 10 hours ago - 1 day ago - 7 days ago - - """ - return repoze.timeago.get_elapsed(timestamp) -elapsed.is_safe = True - - -class Birthday(object): - """ - A simple named tuple-type class for birthdays. - This class was created to make things easier in the template. - - """ - day = None - profiles = [] - - def __init__(self, day, profiles=None): - self.day = day - self.profiles = profiles if profiles else [] - - -@register.inclusion_tag('core/birthday_block.html') -def birthday_block(): - """ - A template tag to display all the users who have birthdays this month. - """ - today = datetime.date.today() - profiles = list(UserProfile.objects.filter(birthday__month=today.month).select_related( - 'user')) - - days = collections.defaultdict(list) - for profile in profiles: - days[profile.birthday.day].append(profile) - - birthdays = [Birthday(day, profiles) for day, profiles in days.iteritems()] - birthdays.sort(key=lambda b: b.day) - - return { - 'STATIC_URL': settings.STATIC_URL, - 'birthdays': birthdays, - 'today': today, - } - - -class EncodeParamsNode(template.Node): - """ - This is the Node class for the encode_params template tag. - This template tag retrieves the named parameters from the supplied - querydict and returns them as a urlencoded string. - - """ - def __init__(self, querydict, args): - self.querydict = template.Variable(querydict) - self.args = args - - def render(self, context): - querydict = self.querydict.resolve(context) - params = [] - for arg in self.args: - params.extend([(arg, value) for value in querydict.getlist(arg)]) - - return urllib.urlencode(params) - - -@register.tag -def encode_params(parser, token): - """ - This is the compilation function for the encode_params template tag. - This template tag retrieves the named parameters from the supplied - querydict and returns them as a urlencoded string. - - """ - bits = token.split_contents() - if len(bits) < 3: - raise template.TemplateSyntaxError("%s takes at least 2 arguments: " - "querydict arg1 [arg2 arg3 ... argN]" % bits[0]) - - querydict = bits[1] - args = [arg[1:-1] for arg in bits[2:]] - return EncodeParamsNode(querydict, args) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/templatetags/custom_admin_tags.py --- a/gpp/core/templatetags/custom_admin_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,50 +0,0 @@ -""" -Custom template tags for the admin. 
-""" -from django import template -from django.db.models import Q - -from bio.models import UserProfileFlag -from comments.models import CommentFlag -from downloads.models import PendingDownload -from forums.models import FlaggedPost -from gcalendar.models import Event -from news.models import PendingStory -from weblinks.models import PendingLink, FlaggedLink -from shoutbox.models import ShoutFlag - - -register = template.Library() - - -@register.inclusion_tag('core/admin_dashboard.html') -def admin_dashboard(user): - """ - This tag is used in the admin to create a dashboard - of pending content that an admin must approve. - """ - flagged_profiles = UserProfileFlag.objects.count() - flagged_comments = CommentFlag.objects.count() - new_downloads = PendingDownload.objects.count() - flagged_posts = FlaggedPost.objects.count() - event_requests = Event.objects.filter( - Q(status=Event.NEW) | - Q(status=Event.EDIT_REQ) | - Q(status=Event.DEL_REQ)).count() - new_stories = PendingStory.objects.count() - new_links = PendingLink.objects.count() - broken_links = FlaggedLink.objects.count() - flagged_shouts = ShoutFlag.objects.count() - - return { - 'user': user, - 'flagged_profiles': flagged_profiles, - 'flagged_comments': flagged_comments, - 'new_downloads': new_downloads, - 'flagged_posts': flagged_posts, - 'event_requests': event_requests, - 'new_stories': new_stories, - 'new_links': new_links, - 'broken_links': broken_links, - 'flagged_shouts': flagged_shouts, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/templatetags/script_tags.py --- a/gpp/core/templatetags/script_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,27 +0,0 @@ -""" -Template tags to generate and ' % (prefix, path) - - return s diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/urls.py --- a/gpp/core/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,9 +0,0 @@ -""" -Urls for the core application. -""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('core.views', - url(r'^markdown_help/$', 'markdown_help', name='core-markdown_help'), - url(r'^ajax/users/$', 'ajax_users', name='core-ajax_users'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/views.py --- a/gpp/core/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -""" -Views for the core application. These are mainly shared, common views -used by multiple applications. -""" -from django.contrib.auth.models import User -from django.http import HttpResponse -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.contrib.auth.decorators import login_required -from django.views.decorators.http import require_GET -import django.utils.simplejson as json - - -@login_required -@require_GET -def markdown_help(request): - """ - This view provides the Markdown help cheat sheet. It is expected - to be called via AJAX. - """ - return render_to_response('core/markdown_help.html') - - -def ajax_users(request): - """ - If the user is authenticated, return a JSON array of strings of usernames - whose names start with the 'q' GET parameter, limited by the 'limit' GET - parameter. Only active usernames are returned. - If the user is not authenticated, return an empty array. 
- """ - q = request.GET.get('q', None) - if q is None or not request.user.is_authenticated(): - return HttpResponse(json.dumps([]), content_type='application/json') - - limit = int(request.GET.get('limit', 10)) - users = User.objects.filter(is_active=True, - username__istartswith=q).values_list('username', flat=True)[:limit] - return HttpResponse(json.dumps(list(users)), content_type='application/json') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/whos_online.py --- a/gpp/core/whos_online.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,239 +0,0 @@ -""" -This module keeps track of who is online. We maintain records for both -authenticated users ("users") and non-authenticated visitors ("visitors"). -""" -import datetime -import logging -import time - -import redis - -from core.services import get_redis_connection -from core.models import Statistic - - -# Users and visitors each have a sorted set in a Redis database. When a user or -# visitor is seen, the respective set is updated with the score of the current -# time. Periodically we remove elements by score (time) to stale out members. - -# Redis key names: -USER_SET_KEY = "whos_online:users" -VISITOR_SET_KEY = "whos_online:visitors" - -CORE_STATS_KEY = "core:stats" - -# the period over which we collect who's online stats: -MAX_AGE = datetime.timedelta(minutes=15) - - -# Logging: we don't want a Redis malfunction to bring down the site. So we -# catch all Redis exceptions, log them, and press on. -logger = logging.getLogger(__name__) - - -def _get_connection(): - """ - Create and return a Redis connection. Returns None on failure. - """ - try: - conn = get_redis_connection() - return conn - except redis.RedisError, e: - logger.error(e) - - return None - - -def to_timestamp(dt): - """ - Turn the supplied datetime object into a UNIX timestamp integer. - - """ - return int(time.mktime(dt.timetuple())) - - -def _zadd(key, member): - """ - Adds the member to the given set key, using the current time as the score. - - """ - conn = _get_connection() - if conn: - ts = to_timestamp(datetime.datetime.now()) - try: - conn.zadd(key, ts, member) - except redis.RedisError, e: - logger.error(e) - - -def _zrangebyscore(key): - """ - Performs a zrangebyscore operation on the set given by key. - The minimum score will be a timestap equal to the current time - minus MAX_AGE. The maximum score will be a timestap equal to the - current time. - - """ - conn = _get_connection() - if conn: - now = datetime.datetime.now() - min = to_timestamp(now - MAX_AGE) - max = to_timestamp(now) - try: - return conn.zrangebyscore(key, min, max) - except redis.RedisError, e: - logger.error(e) - - return [] - - -def report_user(username): - """ - Call this function when a user has been seen. The username will be added to - the set of users online. - - """ - _zadd(USER_SET_KEY, username) - - -def report_visitor(ip): - """ - Call this function when a visitor has been seen. The IP address will be - added to the set of visitors online. - - """ - _zadd(VISITOR_SET_KEY, ip) - - -def get_users_online(): - """ - Returns a list of user names from the user set. - sets. - """ - return _zrangebyscore(USER_SET_KEY) - - -def get_visitors_online(): - """ - Returns a list of visitor IP addresses from the visitor set. - """ - return _zrangebyscore(VISITOR_SET_KEY) - - -def _tick(conn): - """ - Call this function to "age out" the sets by removing old users/visitors. 
- It then returns a tuple of the form: - (zcard users, zcard visitors) - - """ - cutoff = to_timestamp(datetime.datetime.now() - MAX_AGE) - - try: - pipeline = conn.pipeline(transaction=False) - pipeline.zremrangebyscore(USER_SET_KEY, 0, cutoff) - pipeline.zremrangebyscore(VISITOR_SET_KEY, 0, cutoff) - pipeline.zcard(USER_SET_KEY) - pipeline.zcard(VISITOR_SET_KEY) - result = pipeline.execute() - except redis.RedisError, e: - logger.error(e) - return 0, 0 - - return result[2], result[3] - - -def max_users(): - """ - Run this function periodically to clean out the sets and to compute our max - users and max visitors statistics. - - """ - conn = _get_connection() - if not conn: - return - - num_users, num_visitors = _tick(conn) - now = datetime.datetime.now() - - stats = get_stats(conn) - update = False - - if stats is None: - stats = Statistic(id=1, - max_users=num_users, - max_users_date=now, - max_anon_users=num_visitors, - max_anon_users_date=now) - update = True - else: - if num_users > stats.max_users: - stats.max_users = num_users - stats.max_users_date = now - update = True - - if num_visitors > stats.max_anon_users: - stats.max_anon_users = num_visitors - stats.max_anon_users_date = now - update = True - - if update: - _save_stats_to_redis(conn, stats) - stats.save() - - -def get_stats(conn=None): - """ - This function retrieves the who's online max user stats out of Redis. If - the keys do not exist in Redis, we fall back to the database. If the stats - are not available, None is returned. - Note that if we can find stats data, it will be returned as a Statistic - object. - - """ - if conn is None: - conn = _get_connection() - - stats = None - if conn: - try: - stats = conn.hgetall(CORE_STATS_KEY) - except redis.RedisError, e: - logger.error(e) - - if stats: - return Statistic( - id=1, - max_users=stats['max_users'], - max_users_date=datetime.datetime.fromtimestamp( - float(stats['max_users_date'])), - max_anon_users=stats['max_anon_users'], - max_anon_users_date=datetime.datetime.fromtimestamp( - float(stats['max_anon_users_date']))) - - try: - stats = Statistic.objects.get(pk=1) - except Statistic.DoesNotExist: - return None - else: - _save_stats_to_redis(conn, stats) - return stats - - -def _save_stats_to_redis(conn, stats): - """ - Saves the statistics to Redis. A TTL is put on the key to prevent Redis and - the database from becoming out of sync. - - """ - fields = dict( - max_users=stats.max_users, - max_users_date=to_timestamp(stats.max_users_date), - max_anon_users=stats.max_anon_users, - max_anon_users_date=to_timestamp(stats.max_anon_users_date)) - - try: - conn.hmset(CORE_STATS_KEY, fields) - conn.expire(CORE_STATS_KEY, 4 * 60 * 60) - except redis.RedisError, e: - logger.error(e) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/core/widgets.py --- a/gpp/core/widgets.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,55 +0,0 @@ -""" -Various useful widgets for the GPP application. 
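# Minimal illustration of the stats caching in _save_stats_to_redis() and
# get_stats() above: a Redis hash mirrors the Statistic row and carries a TTL
# so the cache cannot drift from the database for long. Values are made up;
# assumes a local Redis server.
import time
import redis

conn = redis.StrictRedis()
conn.hmset('core:stats', {
    'max_users': 42,
    'max_users_date': int(time.time()),
    'max_anon_users': 99,
    'max_anon_users_date': int(time.time()),
})
conn.expire('core:stats', 4 * 60 * 60)    # same 4 hour TTL as above
print conn.hgetall('core:stats')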
-""" - -from django import forms -from django.utils.safestring import mark_safe -from django.core.urlresolvers import reverse -from django.conf import settings - - -class AutoCompleteUserInput(forms.TextInput): - - def render(self, name, value, attrs=None): - url = reverse('core-ajax_users') - output = super(AutoCompleteUserInput, self).render(name, value, attrs) - return output + mark_safe(u"""\ -""" % (name, url)) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/custom_search/forms.py --- a/gpp/custom_search/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,34 +0,0 @@ -""" -This module contains custom forms to tailor the Haystack search application to -our needs. - -""" -from django import forms -from haystack.forms import ModelSearchForm - - -MODEL_CHOICES = ( - ('forums.topic', 'Forum Topics'), - ('forums.post', 'Forum Posts'), - ('news.story', 'News Stories'), - ('bio.userprofile', 'User Profiles'), - ('weblinks.link', 'Links'), - ('downloads.download', 'Downloads'), - ('podcast.item', 'Podcasts'), - ('ygroup.post', 'Yahoo Group Archives'), -) - - -class CustomModelSearchForm(ModelSearchForm): - """ - This customized ModelSearchForm allows us to explictly label and order - the model choices. - - """ - q = forms.CharField(required=False, label='', - widget=forms.TextInput(attrs={'class': 'text', 'size': 48})) - - def __init__(self, *args, **kwargs): - super(CustomModelSearchForm, self).__init__(*args, **kwargs) - self.fields['models'] = forms.MultipleChoiceField(choices=MODEL_CHOICES, - label='', widget=forms.CheckboxSelectMultiple) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/custom_search/indexes.py --- a/gpp/custom_search/indexes.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,31 +0,0 @@ -""" -This module contains custom search indexes to tailor the Haystack search -application to our needs. - -""" -from queued_search.indexes import QueuedSearchIndex - - -class CondQueuedSearchIndex(QueuedSearchIndex): - """ - This customized version of QueuedSearchIndex conditionally enqueues items - to be indexed by calling the can_index() method. - - """ - def can_index(self, instance): - """ - The default is to index all instances. Override this method to - customize the behavior. This will be called on all update operations. - - """ - return True - - def enqueue(self, action, instance): - """ - This method enqueues the instance only if the can_index() method - returns True. - - """ - if (action == 'update' and self.can_index(instance) or - action == 'delete'): - super(CondQueuedSearchIndex, self).enqueue(action, instance) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/custom_search/tasks.py --- a/gpp/custom_search/tasks.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,18 +0,0 @@ -""" -Tasks for our custom search application. - -""" -from celery.task import task - -from queued_search.management.commands.process_search_queue import Command - - -@task -def process_search_queue_task(): - """ - Celery task to run the queued_search application's process_search_queue - command. - - """ - command = Command() - command.execute() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/donations/admin.py --- a/gpp/donations/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,10 +0,0 @@ -""" -This file contains the admin definitions for the donations application. 
-""" -from django.contrib import admin -from donations.models import Donation - -class DonationAdmin(admin.ModelAdmin): - raw_id_fields = ('user', ) - -admin.site.register(Donation, DonationAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/donations/models.py --- a/gpp/donations/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,85 +0,0 @@ -""" -Models for the donations application. -""" -import datetime -import decimal - -from django.db import models -from django.contrib.auth.models import User -from django.conf import settings - - -class DonationManager(models.Manager): - def monthly_stats(self, year=None, month=None): - """ - Returns a tuple of items for the given month in the given - year. If year is None, the current year is used. If month is None, - the current month is used. - The returned tuple has the following items, in order: - (gross, net, donations) - where: - 'gross': total gross donations - 'net': total net donations - 'donations': list of donation objects - """ - today = datetime.date.today() - if year is None: - year = today.year - if month is None: - month = today.month - - qs = self.filter(payment_date__year=year, - payment_date__month=month, - test_ipn=settings.DONATIONS_DEBUG).order_by( - 'payment_date').select_related('user') - - gross = decimal.Decimal() - net = decimal.Decimal() - donations = [] - for donation in qs: - gross += donation.mc_gross - net += donation.mc_gross - donation.mc_fee - donations.append(donation) - - return gross, net, donations - - -class Donation(models.Model): - """Model to represent a donation to the website.""" - - user = models.ForeignKey(User, null=True, blank=True) - is_anonymous = models.BooleanField() - test_ipn = models.BooleanField(default=False, verbose_name="Test IPN") - txn_id = models.CharField(max_length=20, verbose_name="Txn ID") - txn_type = models.CharField(max_length=64) - first_name = models.CharField(max_length=64, blank=True) - last_name = models.CharField(max_length=64, blank=True) - payer_email = models.EmailField(max_length=127, blank=True) - payer_id = models.CharField(max_length=13, blank=True, verbose_name="Payer ID") - mc_fee = models.DecimalField(max_digits=8, decimal_places=2, verbose_name="Fee") - mc_gross = models.DecimalField(max_digits=8, decimal_places=2, verbose_name="Gross") - memo = models.TextField(blank=True) - payer_status = models.CharField(max_length=10, blank=True) - payment_date = models.DateTimeField() - - objects = DonationManager() - - class Meta: - ordering = ('-payment_date', ) - - def __unicode__(self): - if self.user: - return u'%s from %s' % (self.mc_gross, self.user.username) - return u'%s from %s %s' % (self.mc_gross, self.first_name, self.last_name) - - def donor(self): - """Returns the donor name for the donation.""" - if self.is_anonymous: - return settings.DONATIONS_ANON_NAME - if self.user is not None: - return self.user.username - if self.first_name or self.last_name: - name = u'%s %s' % (self.first_name, self.last_name) - return name.strip() - return settings.DONATIONS_ANON_NAME - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/donations/tests.py --- a/gpp/donations/tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,110 +0,0 @@ -""" -Tests for the donations application. 
-""" -import urlparse -from decimal import Decimal -import datetime - -from django.contrib.auth.models import User -from django.test import TestCase -from django.core.urlresolvers import reverse - -from donations.models import Donation -import bio.badges - - -# This data was copy/pasted from my actual Paypal IPN history. Some alterations -# were made since this file is getting committed to version control and I -# didn't want to store "real" data that could be used to trace a transaction or -# real payer. - -# This data is for a non-anonymous donation: -TEST_POST_DATA_1 = """\ -mc_gross=5.00&protection_eligibility=Ineligible&payer_id=FAKEPAYERID01&tax=0.00&payment_date=04:14:08 Jan 21, 2011 PST&payment_status=Completed&charset=windows-1252&first_name=John&option_selection1=No&mc_fee=0.50¬ify_version=3.0&custom=test_user&payer_status=verified&business=brian@surfguitar101.com&quantity=1&verify_sign=Ai1PaTHIS-IS-FAKE-DATA-jB264AOjpiTa4vcsPCEavq-83oyIclHKI&payer_email=test_user@example.com&option_name1=List your name?&txn_id=TESTTXNID5815921V&payment_type=instant&last_name=Doe&receiver_email=brian@surfguitar101.com&payment_fee=0.50&receiver_id=FAKERECEIVERU&txn_type=web_accept&item_name=Donation for www.surfguitar101.com&mc_currency=USD&item_number=500&residence_country=AU&handling_amount=0.00&transaction_subject=test_user&payment_gross=5.00&shipping=0.00""" - -# Data from a user that wanted to remain anonymous -TEST_POST_DATA_2 = """\ -mc_gross=100.00&protection_eligibility=Ineligible&payer_id=FAKEPAYERID02&tax=0.00&payment_date=05:40:33 Jan 16, 2011 PST&payment_status=Completed&charset=windows-1252&first_name=John&option_selection1=No&mc_fee=3.20¬ify_version=3.0&custom=test_user&payer_status=unverified&business=brian@surfguitar101.com&quantity=1&verify_sign=AIkKNFAKE-DATA-NOT-REALpqCSxA-E7Tm4rMGlUpNy6ym0.exBzfiyI&payer_email=test_user@example.com&option_name1=List your name?&txn_id=TESTTXNIDK548343A&payment_type=instant&last_name=Doe&receiver_email=brian@surfguitar101.com&payment_fee=3.20&receiver_id=FAKERECEIVERU&txn_type=web_accept&item_name=Donation for www.surfguitar101.com&mc_currency=USD&item_number=501&residence_country=US&handling_amount=0.00&transaction_subject=test_user&payment_gross=100.00&shipping=0.00""" - - -class DonationsTest(TestCase): - fixtures = ['badges'] - - def test_ipn_post_1(self): - """ - Test a simulated IPN post - """ - user = User.objects.create_user('test_user', 'test_user@example.com', - 'password') - user.save() - - args = urlparse.parse_qs(TEST_POST_DATA_1) - response = self.client.post(reverse('donations-ipn'), args) - - self.assertEqual(response.status_code, 200) - - try: - d = Donation.objects.get(pk=1) - except Donation.DoesNotExist: - self.fail("Donation object was not created") - else: - self.assertEqual(d.user, user) - self.assertFalse(d.is_anonymous) - self.assertFalse(d.test_ipn) - self.assertEqual(d.txn_id, 'TESTTXNID5815921V') - self.assertEqual(d.txn_type, 'web_accept') - self.assertEqual(d.first_name, 'John') - self.assertEqual(d.last_name, 'Doe') - self.assertEqual(d.payer_email, 'test_user@example.com') - self.assertEqual(d.payer_id, 'FAKEPAYERID01') - self.assertEqual(d.mc_fee, Decimal('0.50')) - self.assertEqual(d.mc_gross, Decimal('5.00')) - self.assertEqual(d.memo, '') - self.assertEqual(d.payer_status, 'verified') - self.assertEqual(d.payment_date, - datetime.datetime(2011, 1, 21, 4, 14, 8)) - - # user should have got a badge for donating - p = user.get_profile() - badges = list(p.badges.all()) - self.assertEqual(len(badges), 1) - if 
len(badges) == 1: - self.assertEqual(badges[0].numeric_id, bio.badges.CONTRIBUTOR_PIN) - - def test_ipn_post_2(self): - """ - Test a simulated IPN post - """ - user = User.objects.create_user('test_user', 'test_user@example.com', - 'password') - user.save() - - args = urlparse.parse_qs(TEST_POST_DATA_2) - response = self.client.post(reverse('donations-ipn'), args) - - self.assertEqual(response.status_code, 200) - - try: - d = Donation.objects.get(pk=1) - except Donation.DoesNotExist: - self.fail("Donation object was not created") - else: - self.assertEqual(d.user, user) - self.assertTrue(d.is_anonymous) - self.assertFalse(d.test_ipn) - self.assertEqual(d.txn_id, 'TESTTXNIDK548343A') - self.assertEqual(d.txn_type, 'web_accept') - self.assertEqual(d.first_name, 'John') - self.assertEqual(d.last_name, 'Doe') - self.assertEqual(d.payer_email, 'test_user@example.com') - self.assertEqual(d.payer_id, 'FAKEPAYERID02') - self.assertEqual(d.mc_fee, Decimal('3.20')) - self.assertEqual(d.mc_gross, Decimal('100.00')) - self.assertEqual(d.memo, '') - self.assertEqual(d.payer_status, 'unverified') - self.assertEqual(d.payment_date, - datetime.datetime(2011, 1, 16, 5, 40, 33)) - - # user should not have got a badge for donating - p = user.get_profile() - self.assertEqual(p.badges.count(), 0) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/donations/urls.py --- a/gpp/donations/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,14 +0,0 @@ -""" -URLs for the donations application. -""" -from django.conf.urls import patterns, url -from django.views.generic import TemplateView - -urlpatterns = patterns('donations.views', - url(r'^$', 'index', name='donations-index'), - url(r'^ipn/$', 'ipn', name='donations-ipn'), -) -urlpatterns += patterns('', - url(r'^thanks/$', TemplateView.as_view(template_name='donations/thanks.html'), - name='donations-thanks'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/donations/views.py --- a/gpp/donations/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,221 +0,0 @@ -""" -Views for the donations application. -""" -import urllib2 -import decimal -import datetime -import logging - -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.conf import settings -from django.contrib.sites.models import Site -from django.http import HttpResponse -from django.http import HttpResponseServerError -from django.contrib.auth.models import User -from django.views.decorators.csrf import csrf_exempt - - -from donations.models import Donation - -PP_DATE_FMT = '%H:%M:%S %b %d, %Y' - -def paypal_params(): - """ - This function returns a tuple where the 1st element is the Paypal - URL and the 2nd element is the Paypal business email. This information - depends on the setting DONATIONS_DEBUG. - """ - if settings.DONATIONS_DEBUG: - form_action = 'https://www.sandbox.paypal.com/cgi-bin/webscr' - business = settings.DONATIONS_BUSINESS_DEBUG - else: - form_action = 'https://www.paypal.com/cgi-bin/webscr' - business = settings.DONATIONS_BUSINESS - - return form_action, business - - -def verify_request(params): - """ - Send the parameters back to Paypal and return the response string. - """ - # If we are doing localhost-type unit tests, just return whatever - # the test wants us to... 
- if hasattr(settings, 'DONATIONS_DEBUG_VERIFY_RESPONSE'): - return settings.DONATIONS_DEBUG_VERIFY_RESPONSE - - req = urllib2.Request(paypal_params()[0], params) - req.add_header("Content-type", "application/x-www-form-urlencoded") - try: - response = urllib2.urlopen(req) - except URLError, e: - logging.exception('IPN: exception verifying IPN: %s', e) - return None - - return response.read() - - -def index(request): - gross, net, donations = Donation.objects.monthly_stats() - current_site = Site.objects.get_current() - form_action, business = paypal_params() - - return render_to_response('donations/index.html', { - 'goal': settings.DONATIONS_GOAL, - 'gross': gross, - 'net': net, - 'left': settings.DONATIONS_GOAL - net, - 'donations': donations, - 'form_action': form_action, - 'business': business, - 'anonymous': settings.DONATIONS_ANON_NAME, - 'item_name': settings.DONATIONS_ITEM_NAME, - 'item_number': settings.DONATIONS_ITEM_NUM, - 'item_anon_number': settings.DONATIONS_ITEM_ANON_NUM, - 'domain': current_site.domain, - }, - context_instance = RequestContext(request)) - - -@csrf_exempt -def ipn(request): - """ - This function is the IPN listener and handles the IPN POST from Paypal. - The algorithm here roughly follows the outline described in chapter 2 - of Paypal's IPNGuide.pdf "Implementing an IPN Listener". - - """ - # Log some info about this IPN event - ip = request.META.get('REMOTE_ADDR', '?') - parameters = request.POST.copy() - logging.info('IPN from %s; post data: %s', ip, parameters.urlencode()) - - # Now we follow the instructions in chapter 2 of the Paypal IPNGuide.pdf. - # Create a request that contains exactly the same IPN variables and values in - # the same order, preceded with cmd=_notify-validate - parameters['cmd']='_notify-validate' - - # Post the request back to Paypal (either to the sandbox or the real deal), - # and read the response: - status = verify_request(parameters.urlencode()) - if status != 'VERIFIED': - logging.warning('IPN: Payapl did not verify; status was %s', status) - return HttpResponse() - - # Response was VERIFIED; act on this if it is a Completed donation, - # otherwise don't handle it (we are just a donations application. Here - # is where we could be expanded to be a more general payment processor). - - payment_status = parameters.get('payment_status') - if payment_status != 'Completed': - logging.info('IPN: payment_status is %s; we are done.', payment_status) - return HttpResponse() - - # Is this a donation to the site? - item_number = parameters.get('item_number') - if (item_number == settings.DONATIONS_ITEM_NUM or - item_number == settings.DONATIONS_ITEM_ANON_NUM): - process_donation(item_number, parameters) - else: - logging.info('IPN: not a donation; done.') - - return HttpResponse() - - -def process_donation(item_number, params): - """ - A few validity and duplicate checks are made on the donation params. - If everything is ok, construct a donation object from the parameters and - store it in the database. - - """ - # Has this transaction been processed before? - txn_id = params.get('txn_id') - if txn_id is None: - logging.error('IPN: missing txn_id') - return - - try: - donation = Donation.objects.get(txn_id__exact=txn_id) - except Donation.DoesNotExist: - pass - else: - logging.warning('IPN: duplicate txn_id') - return # no exception, this is a duplicate - - # Is the email address ours? 
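# Sketch of the verification round-trip described above: the listener echoes
# the POSTed IPN variables back to Paypal with cmd=_notify-validate prepended
# and looks for the literal 'VERIFIED' response. Sandbox URL shown; the
# parameter values are fake, so a real request would not actually verify.
import urllib
import urllib2

params = urllib.urlencode([('cmd', '_notify-validate'),
                           ('txn_id', 'TESTTXNID5815921V'),
                           ('payment_status', 'Completed'),
                           ('mc_gross', '5.00')])
req = urllib2.Request('https://www.sandbox.paypal.com/cgi-bin/webscr', params)
req.add_header('Content-type', 'application/x-www-form-urlencoded')
print urllib2.urlopen(req).read() == 'VERIFIED'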
- business = params.get('business') - if business != paypal_params()[1]: - logging.warning('IPN: invalid business: %s', business) - return - - # is this a payment received? - txn_type = params.get('txn_type') - if txn_type != 'web_accept': - logging.warning('IPN: invalid txn_type: %s', txn_type) - return - - # Looks like a donation, save it to the database. - # Determine which user this came from, if any. - # The username is stored in the custom field if the user was logged in when - # the donation was made. - user = None - if 'custom' in params and params['custom']: - try: - user = User.objects.get(username__exact=params['custom']) - except User.DoesNotExist: - pass - - is_anonymous = item_number == settings.DONATIONS_ITEM_ANON_NUM - test_ipn = params.get('test_ipn') == '1' - - first_name = params.get('first_name', '') - last_name = params.get('last_name', '') - payer_email = params.get('payer_email', '') - payer_id = params.get('payer_id', '') - memo = params.get('memo', '') - payer_status = params.get('payer_status', '') - - try: - mc_gross = decimal.Decimal(params['mc_gross']) - mc_fee = decimal.Decimal(params['mc_fee']) - except KeyError, decimal.InvalidOperation: - logging.error('IPN: invalid/missing mc_gross or mc_fee') - return - - payment_date = params.get('payment_date') - if payment_date is None: - logging.error('IPN: missing payment_date') - return - - # strip off the timezone - payment_date = payment_date[:-4] - try: - payment_date = datetime.datetime.strptime(payment_date, PP_DATE_FMT) - except ValueError: - logging.error('IPN: invalid payment_date "%s"', params['payment_date']) - return - - try: - donation = Donation( - user=user, - is_anonymous=is_anonymous, - test_ipn=test_ipn, - txn_id=txn_id, - txn_type=txn_type, - first_name=first_name, - last_name=last_name, - payer_email=payer_email, - payer_id=payer_id, - memo=memo, - payer_status=payer_status, - mc_gross=mc_gross, - mc_fee=mc_fee, - payment_date=payment_date) - except: - logging.exception('IPN: exception during donation creation') - else: - donation.save() - logging.info('IPN: donation saved') - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/__init__.py --- a/gpp/downloads/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -import signals diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/admin.py --- a/gpp/downloads/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,81 +0,0 @@ -""" -This file contains the automatic admin site definitions for the downloads models. 
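# Quick check of the Paypal date handling in process_donation() above: the
# trailing timezone (" PST") is sliced off before parsing with PP_DATE_FMT.
# The sample value comes from the test data earlier in this patch.
import datetime

PP_DATE_FMT = '%H:%M:%S %b %d, %Y'
raw = '04:14:08 Jan 21, 2011 PST'
print datetime.datetime.strptime(raw[:-4], PP_DATE_FMT)
# -> 2011-01-21 04:14:08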
-""" -import datetime - -from django.contrib import admin -from django.conf import settings - -from downloads.models import PendingDownload -from downloads.models import Download -from downloads.models import Category -from downloads.models import AllowedExtension -from downloads.models import VoteRecord - - -class CategoryAdmin(admin.ModelAdmin): - list_display = ('title', 'slug', 'description', 'count') - prepopulated_fields = {'slug': ('title', )} - readonly_fields = ('count', ) - - -class PendingDownloadAdmin(admin.ModelAdmin): - exclude = ('html', ) - list_display = ('title', 'user', 'category', 'date_added', 'ip_address', 'size') - ordering = ('date_added', ) - raw_id_fields = ('user', ) - readonly_fields = ('update_date', ) - - actions = ('approve_downloads', ) - - def approve_downloads(self, request, qs): - for pending_dl in qs: - dl = Download( - title=pending_dl.title, - category=pending_dl.category, - description=pending_dl.description, - html=pending_dl.html, - file=pending_dl.file, - user=pending_dl.user, - date_added=datetime.datetime.now(), - ip_address=pending_dl.ip_address, - hits=0, - average_score=0.0, - total_votes=0, - is_public=True) - dl.save() - - # If we don't do this, the actual file will be deleted when - # the pending download is deleted. - pending_dl.file = None - pending_dl.delete() - - approve_downloads.short_description = "Approve selected downloads" - - -class DownloadAdmin(admin.ModelAdmin): - exclude = ('html', ) - list_display = ('title', 'user', 'category', 'date_added', 'ip_address', - 'hits', 'average_score', 'size', 'is_public') - list_filter = ('date_added', 'is_public', 'category') - list_editable = ('is_public', ) - date_hierarchy = 'date_added' - ordering = ('-date_added', ) - search_fields = ('title', 'description', 'user__username') - raw_id_fields = ('user', ) - readonly_fields = ('update_date', ) - save_on_top = True - - -class VoteRecordAdmin(admin.ModelAdmin): - list_display = ('user', 'download', 'vote_date') - list_filter = ('user', 'download') - date_hierarchy = 'vote_date' - - -admin.site.register(PendingDownload, PendingDownloadAdmin) -admin.site.register(Download, DownloadAdmin) -admin.site.register(Category, CategoryAdmin) -admin.site.register(AllowedExtension) -admin.site.register(VoteRecord, VoteRecordAdmin) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/fixtures/downloads_categories.json --- a/gpp/downloads/fixtures/downloads_categories.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,82 +0,0 @@ -[ - { - "pk": 1, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "Jam along to backing tracks made by your fellow SG101'ers!", - "slug": "backing-tracks", - "title": "Backing Tracks" - } - }, - { - "pk": 5, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "User demos.", - "slug": "demos", - "title": "Demos" - } - }, - { - "pk": 2, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "Recordings of user gear in action.", - "slug": "gear-samples", - "title": "Gear Samples" - } - }, - { - "pk": 6, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "Interviews with surf scenesters.", - "slug": "interviews", - "title": "Interviews" - } - }, - { - "pk": 3, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "Anything else.", - "slug": "misc", - "title": "Misc" - } - }, - { - "pk": 7, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "Legal 
music created by members.", - "slug": "music", - "title": "Music" - } - }, - { - "pk": 4, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "Please upload original surf music ringtones here.", - "slug": "ringtones", - "title": "Ringtones" - } - }, - { - "pk": 8, - "model": "downloads.category", - "fields": { - "count": 0, - "description": "User contributed tablature. Please upload in .pdf or .txt formats only.", - "slug": "tablature", - "title": "Tablature" - } - } -] diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/fixtures/downloads_extensions.json --- a/gpp/downloads/fixtures/downloads_extensions.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,86 +0,0 @@ -[ - { - "pk": 7, - "model": "downloads.allowedextension", - "fields": { - "extension": ".gif" - } - }, - { - "pk": 9, - "model": "downloads.allowedextension", - "fields": { - "extension": ".jpeg" - } - }, - { - "pk": 8, - "model": "downloads.allowedextension", - "fields": { - "extension": ".jpg" - } - }, - { - "pk": 6, - "model": "downloads.allowedextension", - "fields": { - "extension": ".m4a" - } - }, - { - "pk": 10, - "model": "downloads.allowedextension", - "fields": { - "extension": ".mov" - } - }, - { - "pk": 3, - "model": "downloads.allowedextension", - "fields": { - "extension": ".mp3" - } - }, - { - "pk": 5, - "model": "downloads.allowedextension", - "fields": { - "extension": ".mp4" - } - }, - { - "pk": 2, - "model": "downloads.allowedextension", - "fields": { - "extension": ".pdf" - } - }, - { - "pk": 13, - "model": "downloads.allowedextension", - "fields": { - "extension": ".png" - } - }, - { - "pk": 1, - "model": "downloads.allowedextension", - "fields": { - "extension": ".txt" - } - }, - { - "pk": 4, - "model": "downloads.allowedextension", - "fields": { - "extension": ".wma" - } - }, - { - "pk": 11, - "model": "downloads.allowedextension", - "fields": { - "extension": ".zip" - } - } -] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/forms.py --- a/gpp/downloads/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -""" -Forms for the downloads application. -""" -import os - -from django import forms -from django.conf import settings - -from downloads.models import PendingDownload -from downloads.models import AllowedExtension - - -class AddDownloadForm(forms.ModelForm): - """Form to allow adding downloads.""" - title = forms.CharField(required=True, - widget=forms.TextInput(attrs={'size': 64, 'maxlength': 64})) - description = forms.CharField(required=False, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - - def clean_file(self): - file = self.cleaned_data['file'] - ext = os.path.splitext(file.name)[1] - allowed_exts = AllowedExtension.objects.get_extension_list() - if ext in allowed_exts: - return file - raise forms.ValidationError('The file extension "%s" is not allowed.' 
% ext) - - class Meta: - model = PendingDownload - fields = ('title', 'category', 'description', 'file') - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']) - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui']) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/management/commands/dlcatreport.py --- a/gpp/downloads/management/commands/dlcatreport.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -""" -dlcatreport - a management command to produce a HTML report of all the downloads -in a given category. - -""" -from django.core.management.base import LabelCommand, CommandError -from django.template.loader import render_to_string - -from downloads.models import Category, Download - - -class Command(LabelCommand): - help = "Produce on standard output a report of all downloads in a category." - args = "category-slug" - - def handle_label(self, slug, **options): - """ - Render a template using the downloads in a given category and send it to - stdout. - - """ - try: - category = Category.objects.get(slug=slug) - except Category.DoesNotExist: - raise CommandError("category slug '%s' does not exist" % slug) - - downloads = Download.public_objects.filter(category=category).order_by( - 'title').select_related() - - report = render_to_string('downloads/commands/category_report.html', { - 'category': category, - 'downloads': downloads, - }) - - # encode it ourselves since it can fail if you try to redirect output to - # a file and any of the content is not ASCII... - print report.encode('utf-8') - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/management/commands/dlwgetcat.py --- a/gpp/downloads/management/commands/dlwgetcat.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,53 +0,0 @@ -""" -dlwgetcat - a management command to produce a bash script that wgets all the -files in a given category. - -""" -import os.path - -from django.core.management.base import LabelCommand, CommandError -from django.template.loader import render_to_string -from django.template.defaultfilters import slugify -from django.contrib.sites.models import Site -from django.conf import settings - -from downloads.models import Category, Download - - -class Command(LabelCommand): - help = ("Produce on standard output a bash script that wgets all the files" - " in a category. The files are downloaded with a slugified name.") - - args = "category-slug" - - def handle_label(self, slug, **options): - """ - Render a template using the downloads in a given category and send it to - stdout. - - """ - try: - category = Category.objects.get(slug=slug) - except Category.DoesNotExist: - raise CommandError("category slug '%s' does not exist" % slug) - - downloads = Download.public_objects.filter(category=category).order_by( - 'title').select_related() - - # Create new destination names for the files since the uploaders often - # give the files terrible names. The new names will be slugified - # versions of the titles, with the same extension. 
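# Illustration of the renaming rule described above in dlwgetcat: the download
# title is slugified and the original file extension is kept. The title and
# stored filename are made up; assumes a configured Django environment for
# slugify.
import os.path
from django.template.defaultfilters import slugify

title = 'Pipeline Backing Track #2'
original_name = 'downloads/2011/1/21/3e8/MySong01.MP3'
ext = os.path.splitext(original_name)[1]
print slugify(title) + ext      # -> pipeline-backing-track-2.MP3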
- - for dl in downloads: - ext = os.path.splitext(dl.file.name)[1] - dl.dest_filename = slugify(dl.title) + ext - - output = render_to_string('downloads/commands/wget_cat.html', { - 'downloads': downloads, - 'domain': Site.objects.get_current().domain, - 'MEDIA_URL': settings.MEDIA_URL, - }) - - # encode it ourselves since it can fail if you try to redirect output to - # a file and any of the content is not ASCII... - print output.encode('utf-8') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/models.py --- a/gpp/downloads/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,166 +0,0 @@ -""" -Models for the downloads application. -""" -import os - -import datetime -from django.db import models -from django.contrib.auth.models import User -from django.template.defaultfilters import filesizeformat - -from core.markup import site_markup - - -class Category(models.Model): - """Downloads belong to categories.""" - title = models.CharField(max_length=64) - slug = models.SlugField(max_length=64) - description = models.TextField(blank=True) - count = models.IntegerField(default=0, blank=True) - - class Meta: - verbose_name_plural = 'Categories' - ordering = ('title', ) - - def __unicode__(self): - return self.title - - -def download_path(instance, filename): - """ - Creates a path for a download. Uses the current date to avoid filename - clashes. Uses the current microsecond also to make the directory name - harder to guess. - """ - now = datetime.datetime.now() - parts = ['downloads'] - parts.extend([str(p) for p in (now.year, now.month, now.day)]) - parts.append(hex((now.hour * 3600 + now.minute * 60 + now.second) * 1000 + ( - now.microsecond / 1000))[2:]) - parts.append(filename) - return os.path.join(*parts) - - -class PublicDownloadManager(models.Manager): - """The manager for all public downloads.""" - def get_query_set(self): - return super(PublicDownloadManager, self).get_query_set().filter( - is_public=True).select_related() - - -class DownloadBase(models.Model): - """Abstract model to collect common download fields.""" - title = models.CharField(max_length=128) - category = models.ForeignKey(Category) - description = models.TextField() - html = models.TextField(blank=True) - file = models.FileField(upload_to=download_path) - user = models.ForeignKey(User) - date_added = models.DateTimeField(db_index=True) - ip_address = models.IPAddressField('IP Address') - update_date = models.DateTimeField(db_index=True, blank=True) - - class Meta: - abstract = True - - def size(self): - return filesizeformat(self.file.size) - - -class PendingDownload(DownloadBase): - """This model represents pending downloads created by users. These pending - downloads must be approved by an admin before they turn into "real" - Downloads and are visible on site. 
- """ - class Meta: - ordering = ('date_added', ) - - def __unicode__(self): - return self.title - - def save(self, *args, **kwargs): - if not self.pk: - self.date_added = datetime.datetime.now() - self.update_date = self.date_added - else: - self.update_date = datetime.datetime.now() - - self.html = site_markup(self.description) - super(PendingDownload, self).save(*args, **kwargs) - - -class Download(DownloadBase): - """Model to represent a download.""" - hits = models.IntegerField(default=0) - average_score = models.FloatField(default=0.0) - total_votes = models.IntegerField(default=0) - is_public = models.BooleanField(default=False, db_index=True) - - # Managers: - objects = models.Manager() - public_objects = PublicDownloadManager() - - def __unicode__(self): - return self.title - - @models.permalink - def get_absolute_url(self): - return ('downloads-details', [str(self.id)]) - - def save(self, *args, **kwargs): - if not self.pk: - self.date_added = datetime.datetime.now() - self.update_date = self.date_added - else: - self.update_date = datetime.datetime.now() - - self.html = site_markup(self.description) - super(Download, self).save(*args, **kwargs) - - def vote(self, vote_value): - """receives a vote_value and updates internal score accordingly""" - total_score = self.average_score * self.total_votes - total_score += vote_value - self.total_votes += 1 - self.average_score = total_score / self.total_votes - return self.average_score - - def search_title(self): - return self.title - - def search_summary(self): - return self.description - - -class AllowedExtensionManager(models.Manager): - def get_extension_list(self): - return self.values_list('extension', flat=True) - - -class AllowedExtension(models.Model): - """Model to represent the list of allowed file extensions.""" - extension = models.CharField(max_length=8, help_text="e.g. .txt") - - objects = AllowedExtensionManager() - - def __unicode__(self): - return self.extension - - class Meta: - ordering = ('extension', ) - - -class VoteRecord(models.Model): - """Model to record the date that a user voted on a download.""" - download = models.ForeignKey(Download) - user = models.ForeignKey(User) - vote_date = models.DateTimeField(auto_now_add=True) - - def __unicode__(self): - return u"%s voted on '%s' on %s" % ( - self.user.username, - self.download.title, - self.vote_date.strftime('%b %d, %Y %H:%M:%S')) - - class Meta: - ordering = ('-vote_date', ) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/search_indexes.py --- a/gpp/downloads/search_indexes.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,23 +0,0 @@ -"""Haystack search index for the downloads application.""" -from haystack.indexes import * -from haystack import site -from custom_search.indexes import CondQueuedSearchIndex - -from downloads.models import Download - - -class DownloadIndex(CondQueuedSearchIndex): - text = CharField(document=True, use_template=True) - author = CharField(model_attr='user') - pub_date = DateTimeField(model_attr='date_added') - - def index_queryset(self): - return Download.public_objects.all() - - def get_updated_field(self): - return 'update_date' - - def can_index(self, instance): - return instance.is_public - -site.register(Download, DownloadIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/signals.py --- a/gpp/downloads/signals.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,43 +0,0 @@ -"""Signals for the downloads application. 
-We use signals to compute the denormalized category counts whenever a download -is saved.""" -from django.db.models.signals import post_save -from django.db.models.signals import post_delete - -from downloads.models import Category, Download - - -def on_download_save(sender, **kwargs): - """This function updates the count field for all categories. - It is called whenever a download is saved via a signal. - """ - if kwargs['created']: - # we only have to update the parent category - download = kwargs['instance'] - cat = download.category - cat.count = Download.public_objects.filter(category=cat).count() - cat.save() - else: - # update all categories just to be safe (an existing download could - # have been moved from one category to another - cats = Category.objects.all() - for cat in cats: - cat.count = Download.public_objects.filter(category=cat).count() - cat.save() - - -def on_download_delete(sender, **kwargs): - """This function updates the count field for the download's parent - category. It is called when a download is deleted via a signal. - """ - # update the parent category - download = kwargs['instance'] - cat = download.category - cat.count = Download.public_objects.filter(category=cat).count() - cat.save() - - -post_save.connect(on_download_save, sender=Download, - dispatch_uid='downloads.signals') -post_delete.connect(on_download_delete, sender=Download, - dispatch_uid='downloads.signals') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/static/css/downloads.css --- a/gpp/downloads/static/css/downloads.css Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,8 +0,0 @@ -#downloads-add td { - padding-bottom: 5px; -} - -#downloads-add fieldset { - margin: 1em 0 1em; - padding: 0.5em; -} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/static/js/downloads-get.js --- a/gpp/downloads/static/js/downloads-get.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,33 +0,0 @@ -$(document).ready(function() { - $('.dl-button').each(function(n) { - var button = $(this); - var id = button.attr('id'); - var numeric_id = -1; - if (id.match(/dl-(\d+)/)) - { - numeric_id = RegExp.$1; - } - button.click(function() { - button.attr('disabled', 'disabled').val('Getting link, stand by...'); - $.ajax({ - url: '/downloads/request/', - type: 'POST', - data: { id: numeric_id }, - dataType: 'json', - success: function(result) { - var link_id = result.id; - var div = $('#link-' + link_id); - div.hide(); - div.html( - 'Thank you! Your download is now ready. Click here to download.'); - div.fadeIn(3000); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. 
' + xhr.statusText + ' - ' + - xhr.responseText); - } - }); - }); - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/static/js/rating.js --- a/gpp/downloads/static/js/rating.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,115 +0,0 @@ -function dlRatingOver(event) -{ - var div = $(this).parent('div'); - var stars = $('img', div); - for (var i = 0; i <= event.data; ++i) - { - var star = $(stars[i]); - star.attr('src', '/static/icons/stars/rating_over.gif'); - } -} - -function dlRatingOut(event) -{ - var div = $(this).parent('div'); - var stars = $('img', div); - for (var i = 0; i <= event.data; ++i) - { - var star = $(stars[i]); - star.attr('src', '/static/icons/stars/rating_' + star.attr('class') + '.gif'); - } -} - -function dlRatingClick(event) -{ - var star = $(this); - var id = star.attr('id'); - if (id.match(/star-(\d+)-(\d+)/)) - { - $.ajax({ - url: '/downloads/rate/', - type: 'POST', - data: { id: RegExp.$1, rating: parseInt(RegExp.$2) + 1}, - dataType: 'text', - success: function(rating) { - rating = parseFloat(rating); - if (rating < 0) - { - alert("You've already rated this download."); - return; - } - alert('Thanks for rating this download!'); - var div = star.parent('div'); - var stars = $('img', div); - rating = parseFloat(rating); - for (var i = 0; i < 5; ++i) - { - var s = $(stars[i]); - s.removeClass(s.attr('class')); - if (rating >= 1.0) - { - s.attr('src', '/static/icons/stars/rating_on.gif'); - s.addClass('on') - rating -= 1.0; - } - else if (rating >= 0.5) - { - s.attr('src', '/static/icons/stars/rating_half.gif'); - s.addClass('half') - rating = 0; - } - else - { - s.attr('src', '/static/icons/stars/rating_off.gif'); - s.addClass('off') - } - } - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - } - }); - } -} - -$(document).ready(function() { - $('.rating').each(function(n) { - var div = $(this); - var id = div.attr('id'); - var numeric_id = -1; - if (id.match(/rating-(\d+)/)) - { - numeric_id = RegExp.$1; - } - var rating = div.html(); - div.html(''); - for (var i = 0; i < 5; ++i) - { - var star = $(''); - if (rating >= 1) - { - star.attr('src', '/static/icons/stars/rating_on.gif'); - star.addClass('on') - --rating; - } - else if (rating >= 0.5) - { - star.attr('src', '/static/icons/stars/rating_half.gif'); - star.addClass('half') - rating = 0; - } - else - { - star.attr('src', '/static/icons/stars/rating_off.gif'); - star.addClass('off') - } - star.attr('alt', 'star'); - star.attr('id', 'star-' + numeric_id + '-' + i); - star.bind('mouseover', i, dlRatingOver); - star.bind('mouseout', i, dlRatingOut); - star.click(dlRatingClick); - div.append(star); - } - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/templatetags/downloads_tags.py --- a/gpp/downloads/templatetags/downloads_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,17 +0,0 @@ -""" -Template tags for the downloads application. 
-""" -from django import template - -from downloads.models import Download - - -register = template.Library() - - -@register.inclusion_tag('downloads/latest_tag.html') -def latest_downloads(): - downloads = Download.public_objects.order_by('-date_added')[:10] - return { - 'downloads': downloads, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/urls.py --- a/gpp/downloads/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,19 +0,0 @@ -""" -URLs for the downloads application. -""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('downloads.views', - url(r'^$', 'index', name='downloads-index'), - url(r'^add/$', 'add', name='downloads-add'), - url(r'^category/(?P[\w\d-]+)/(?Ptitle|date|rating|hits)/$', - 'category', - name='downloads-category'), - url(r'^details/(\d+)/$', 'details', name='downloads-details'), - url(r'^new/$', 'new', name='downloads-new'), - url(r'^popular/$', 'popular', name='downloads-popular'), - url(r'^request/$', 'request_download', name='downloads-request_download'), - url(r'^rate/$', 'rate_download', name='downloads-rate'), - url(r'^rating/$', 'rating', name='downloads-rating'), - url(r'^thanks/$', 'thanks', name='downloads-add_thanks'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/downloads/views.py --- a/gpp/downloads/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,244 +0,0 @@ -""" -Views for the downloads application. -""" -import random - -from django.shortcuts import render_to_response, get_object_or_404 -from django.template import RequestContext -from django.contrib.auth.decorators import login_required -from django.http import Http404 -from django.http import HttpResponse -from django.http import HttpResponseRedirect -from django.http import HttpResponseForbidden -from django.http import HttpResponseBadRequest -from django.http import HttpResponseNotFound -from django.core.paginator import InvalidPage -from django.core.urlresolvers import reverse -from django.db.models import Q -from django.views.decorators.http import require_POST -import django.utils.simplejson as json - -from core.paginator import DiggPaginator -from core.functions import email_admins -from core.functions import get_page -from downloads.models import Category -from downloads.models import Download -from downloads.models import VoteRecord -from downloads.forms import AddDownloadForm - -####################################################################### - -DLS_PER_PAGE = 10 - -def create_paginator(dls): - return DiggPaginator(dls, DLS_PER_PAGE, body=5, tail=3, margin=3, padding=2) - -####################################################################### - -@login_required -def index(request): - categories = Category.objects.all() - total_dls = Download.public_objects.all().count() - return render_to_response('downloads/index.html', { - 'categories': categories, - 'total_dls': total_dls, - }, - context_instance = RequestContext(request)) - -####################################################################### -# Maps URL component to database field name for the Download table: - -DOWNLOAD_FIELD_MAP = { - 'title': 'title', - 'date': '-date_added', - 'rating': '-average_score', - 'hits': '-hits' -} - -@login_required -def category(request, slug, sort='title'): - - cat = get_object_or_404(Category, slug=slug) - - if sort not in DOWNLOAD_FIELD_MAP: - sort = 'title' - order_by = DOWNLOAD_FIELD_MAP[sort] - - downloads = Download.public_objects.filter(category=cat.pk).order_by( - order_by) - 
paginator = create_paginator(downloads) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - return render_to_response('downloads/download_list.html', { - 's' : sort, - 'category' : cat, - 'page' : the_page, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def new(request): - """Display new downloads with pagination.""" - - downloads = Download.public_objects.order_by('-date_added') - - paginator = create_paginator(downloads) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - return render_to_response('downloads/download_summary.html', { - 'page': the_page, - 'title': 'Newest Downloads', - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def popular(request): - """Display popular downloads with pagination.""" - - downloads = Download.public_objects.order_by('-hits') - - paginator = create_paginator(downloads) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - return render_to_response('downloads/download_summary.html', { - 'page': the_page, - 'title': 'Popular Downloads', - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def rating(request): - """Display downloads by rating with pagination.""" - - downloads = Download.public_objects.order_by('-average_score') - paginator = create_paginator(downloads) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - return render_to_response('downloads/download_summary.html', { - 'page': the_page, - 'title': 'Highest Rated Downloads', - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def details(request, id): - download = get_object_or_404(Download.public_objects, pk=id) - return render_to_response('downloads/download_detail.html', { - 'download' : download, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def add(request): - if request.method == 'POST': - form = AddDownloadForm(request.POST, request.FILES) - if form.is_valid(): - dl = form.save(commit=False) - dl.user = request.user - dl.ip_address = request.META.get('REMOTE_ADDR', None) - dl.save() - email_admins('New download for approval', """Hello, - -A user has uploaded a new download for your approval. 
-""") - return HttpResponseRedirect(reverse('downloads-add_thanks')) - else: - form = AddDownloadForm() - - return render_to_response('downloads/add.html', { - 'add_form': form, - }, - context_instance=RequestContext(request)) - -####################################################################### - -@login_required -def thanks(request): - return render_to_response('downloads/thanks.html', { - }, - context_instance=RequestContext(request)) - -####################################################################### - -@require_POST -def rate_download(request): - """This function is called by AJAX to rate a download.""" - if request.user.is_authenticated(): - id = request.POST.get('id', None) - rating = request.POST.get('rating', None) - if id is None or rating is None: - return HttpResponseBadRequest('Missing id or rating.') - - try: - rating = int(rating) - except ValueError: - return HttpResponseBadRequest('Invalid rating.') - - # rating will be from 0-4 - rating = min(5, max(1, rating)) - - download = get_object_or_404(Download.public_objects, pk=id) - - # prevent multiple votes from the same user - vote_record, created = VoteRecord.objects.get_or_create( - download=download, user=request.user) - if created: - new_score = download.vote(rating) - download.save() - return HttpResponse(str(new_score)) - else: - return HttpResponse('-1') - - return HttpResponseForbidden('You must be logged in to rate a download.') - -####################################################################### - -@require_POST -def request_download(request): - """ - This function is called by AJAX to request a download. We update the hit - count and then return a JSON object of the form: - { id: download-id, 'url': link-to-download } - - """ - if request.user.is_authenticated(): - dl_id = request.POST.get('id') - if dl_id: - try: - dl = Download.public_objects.get(pk=dl_id) - except Download.DoesNotExist: - return HttpResponseNotFound("Download not found") - - dl.hits += 1 - dl.save() - - s = json.dumps({'id': dl_id, 'url': dl.file.url}) - return HttpResponse(s, content_type='application/json') - - return HttpResponseForbidden('An error occurred.') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/__init__.py --- a/gpp/forums/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,2 +0,0 @@ -import signals -import latest diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/admin.py --- a/gpp/forums/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,109 +0,0 @@ -""" -This file contains the admin definitions for the forums application. 
-""" -from django.contrib import admin - -from forums.models import Category -from forums.models import Forum -from forums.models import Topic -from forums.models import Post -from forums.models import FlaggedPost -from forums.models import ForumLastVisit -from forums.models import TopicLastVisit -from forums.signals import (notify_new_topic, notify_updated_topic, - notify_new_post, notify_updated_post) - -import bio.badges - - -class CategoryAdmin(admin.ModelAdmin): - list_display = ('name', 'position', ) - list_editable = ('position', ) - prepopulated_fields = { 'slug': ('name', ) } - save_on_top = True - - -class ForumAdmin(admin.ModelAdmin): - list_display = ('name', 'category', 'position', 'topic_count', 'post_count') - list_editable = ('position', ) - prepopulated_fields = { 'slug': ('name', ) } - raw_id_fields = ('last_post', ) - ordering = ('category', ) - save_on_top = True - - -class TopicAdmin(admin.ModelAdmin): - list_display = ('name', 'forum', 'creation_date', 'update_date', 'user', 'sticky', 'locked', - 'post_count') - raw_id_fields = ('user', 'last_post', 'subscribers', 'bookmarkers') - search_fields = ('name', ) - date_hierarchy = 'creation_date' - list_filter = ('creation_date', 'update_date', ) - save_on_top = True - - # override save_model() to update the search index - def save_model(self, request, obj, form, change): - obj.save() - - if change: - notify_updated_topic(obj) - else: - notify_new_topic(obj) - - -class PostAdmin(admin.ModelAdmin): - list_display = ('user', 'creation_date', 'update_date', 'user_ip', 'summary') - raw_id_fields = ('topic', 'user', ) - exclude = ('html', ) - search_fields = ('body', ) - date_hierarchy = 'creation_date' - list_filter = ('creation_date', 'update_date', ) - ordering = ('-creation_date', ) - save_on_top = True - - def queryset(self, request): - return Post.objects.select_related('user') - - # override save_model() to update the search index - def save_model(self, request, obj, form, change): - obj.save() - - if change: - notify_updated_post(obj) - else: - notify_new_post(obj) - - -class FlaggedPostAdmin(admin.ModelAdmin): - list_display = ['__unicode__', 'flag_date', 'get_post_url'] - actions = ['accept_flags'] - raw_id_fields = ['post', 'user', ] - - def accept_flags(self, request, qs): - """This admin action awards a security pin to the user who reported - the post and then deletes the flagged post object. - """ - for flag in qs: - bio.badges.award_badge(bio.badges.SECURITY_PIN, flag.user) - flag.delete() - - accept_flags.short_description = "Accept selected flagged posts" - - -class ForumLastVisitAdmin(admin.ModelAdmin): - raw_id_fields = ('user', 'forum') - list_display = ('user', 'forum', 'begin_date', 'end_date') - - -class TopicLastVisitAdmin(admin.ModelAdmin): - raw_id_fields = ('user', 'topic') - list_display = ('user', 'topic', 'last_visit') - - -admin.site.register(Category, CategoryAdmin) -admin.site.register(Forum, ForumAdmin) -admin.site.register(Topic, TopicAdmin) -admin.site.register(Post, PostAdmin) -admin.site.register(FlaggedPost, FlaggedPostAdmin) -admin.site.register(ForumLastVisit, ForumLastVisitAdmin) -admin.site.register(TopicLastVisit, TopicLastVisitAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/attachments.py --- a/gpp/forums/attachments.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,59 +0,0 @@ -""" -This module contains a class for handling attachments on forum posts. 
-""" -from oembed.models import Oembed -from forums.models import Attachment - - -class AttachmentProcessor(object): - """ - This class is aggregated by various form classes to handle - attachments on forum posts. New posts can receive attachments and edited - posts can have their attachments replaced, augmented, or deleted. - - """ - def __init__(self, ids): - """ - This class is constructed with a list of Oembed ids. We retrieve the - actual Oembed objects associated with these keys for use in subsequent - operations. - - """ - # ensure all ids are integers - self.pks = [] - for pk in ids: - try: - pk = int(pk) - except ValueError: - continue - self.pks.append(pk) - - self.embeds = [] - if self.pks: - self.embeds = Oembed.objects.in_bulk(self.pks) - - def save_attachments(self, post): - """ - Create and save attachments to the supplied post object. - Any existing attachments on the post are removed first. - - """ - post.attachments.clear() - - for n, pk in enumerate(self.pks): - attachment = Attachment(post=post, embed=self.embeds[pk], order=n) - attachment.save() - - def has_attachments(self): - """ - Return true if we have valid pending attachments. - - """ - return len(self.embeds) > 0 - - def get_ids(self): - """ - Return the list of Oembed ids. - - """ - return self.pks diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/feeds.py --- a/gpp/forums/feeds.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,78 +0,0 @@ -""" -This file contains the feed class for the forums application. - -""" -from django.contrib.syndication.views import Feed -from django.core.exceptions import ObjectDoesNotExist -from django.shortcuts import get_object_or_404 - -from forums.models import Forum, Topic, Post -from core.functions import copyright_str -from forums.latest import get_latest_posts - - -class ForumsFeed(Feed): - """The Feed class for a specific forum""" - - ttl = '60' - author_name = 'Brian Neal' - author_email = 'admin@surfguitar101.com' - - def get_object(self, request, forum_slug): - - if forum_slug: - forum = Forum.objects.get(slug=forum_slug) - # only return public forums - if forum.id not in Forum.objects.public_forum_ids(): - raise ObjectDoesNotExist - return forum - - else: - # return None to indicate we want a combined feed - return None - - def title(self, obj): - if obj is None: - forum_name = 'Combined' - else: - forum_name = obj.name - - return 'SurfGuitar101.com %s Forum Feed' % forum_name - - def link(self, obj): - if obj is None: - bits = '' - else: - bits = obj.slug + '/' - - return '/feeds/forums/' + bits - - def description(self, obj): - if obj is None: - return "User posts to SurfGuitar101.com forums." 
- return obj.description - - def feed_copyright(self): - return copyright_str() - - def items(self, obj): - forum_id = obj.id if obj else None - return get_latest_posts(forum_id=forum_id) - - def item_title(self, item): - return item['title'] - - def item_description(self, item): - return item['content'] - - def item_author_name(self, item): - return item['author'] - - def item_pubdate(self, item): - return item['pubdate'] - - def item_categories(self, item): - return [item['forum_name']] - - def item_link(self, item): - return item['url'] diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/fixtures/forums.json --- a/gpp/forums/fixtures/forums.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,288 +0,0 @@ -[ - { - "pk": 2, - "model": "auth.group", - "fields": { - "name": "Forum Moderators", - "permissions": [] - } - }, - { - "pk": 1, - "model": "forums.category", - "fields": { - "position": 0, - "name": "SurfGuitar101.com Site Specific", - "groups": [], - "slug": "surfguitar101com-site-specific" - } - }, - { - "pk": 2, - "model": "forums.category", - "fields": { - "position": 1, - "name": "Surf Music", - "groups": [], - "slug": "surf-music" - } - }, - { - "pk": 3, - "model": "forums.category", - "fields": { - "position": 2, - "name": "Classifieds", - "groups": [], - "slug": "classifieds" - } - }, - { - "pk": 4, - "model": "forums.category", - "fields": { - "position": 3, - "name": "Off-Topic", - "groups": [], - "slug": "off-topic" - } - }, - { - "pk": 14, - "model": "forums.forum", - "fields": { - "category": 1, - "description": "For general discussion about this site only, including news and rules. Start here. Anything relating to surf music should go to the Surf Music General Discussion forum, below.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 0, - "last_post": null, - "slug": "surfguitar101-website", - "name": "SurfGuitar101 Website" - } - }, - { - "pk": 2, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "Main surf music discussion forum. Insert glissando sound here.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 0, - "last_post": null, - "slug": "surf-music", - "name": "Surf Music General Discussion" - } - }, - { - "pk": 3, - "model": "forums.forum", - "fields": { - "category": 3, - "description": "For sale and trading of surf music related items only.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 0, - "last_post": null, - "slug": "for-sale-trade", - "name": "For Sale / Trade" - } - }, - { - "pk": 4, - "model": "forums.forum", - "fields": { - "category": 4, - "description": "General off-topic chit-chat. Grab a cool drink and hop in. New members please introduce yourselves here. This forum is dedicated to the memory of Rip Thrillby and Spanky Twangler.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 0, - "last_post": null, - "slug": "shallow-end", - "name": "The Shallow End" - } - }, - { - "pk": 6, - "model": "forums.forum", - "fields": { - "category": 3, - "description": "Need someone to play with? Starting a band? Need a gig? 
Post here.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 1, - "last_post": null, - "slug": "musicians-gigs-wanted", - "name": "Musicians & Gigs Wanted" - } - }, - { - "pk": 8, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "Please post show announcements here.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 1, - "last_post": null, - "slug": "gigs", - "name": "Show Announcements" - } - }, - { - "pk": 9, - "model": "forums.forum", - "fields": { - "category": 1, - "description": "Got an idea for the site? Something not working? Post here.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 1, - "last_post": null, - "slug": "suggestion-box", - "name": "Suggestion Box" - } - }, - { - "pk": 5, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "Playing, performing, and writing surf music. All instruments welcome.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 2, - "last_post": null, - "slug": "surf-musician", - "name": "Surf Musician" - } - }, - { - "pk": 10, - "model": "forums.forum", - "fields": { - "category": 1, - "description": "Feedback, suggestions, playlists, and discussions about the SurfGuitar101 podcast.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 2, - "last_post": null, - "slug": "sg101-podcast", - "name": "SG101 Podcast" - } - }, - { - "pk": 7, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "For questions and discussions about instruments, amplifiers, and yes, outboard reverb units!", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 3, - "last_post": null, - "slug": "gear", - "name": "Gear" - } - }, - { - "pk": 11, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "For discussion of recording techniques.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 4, - "last_post": null, - "slug": "recording-corner", - "name": "Recording Corner" - } - }, - { - "pk": 12, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "Got a link to a surf or surf-related video? Post it here.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 5, - "last_post": null, - "slug": "surf-videos", - "name": "Surf Videos" - } - }, - { - "pk": 13, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "Please post your reviews of surf music releases here.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 6, - "last_post": null, - "slug": "music-reviews", - "name": "Music Reviews" - } - }, - { - "pk": 16, - "model": "forums.forum", - "fields": { - "category": 2, - "description": "This forum contains some classic and important threads from our history, preserved here for historical reasons! These threads are still live, so please keep posting to them.", - "post_count": 0, - "topic_count": 0, - "moderators": [ - 2 - ], - "position": 7, - "last_post": null, - "slug": "best-sg101", - "name": "Best-Of SG101" - } - } -] diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/forms.py --- a/gpp/forums/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,248 +0,0 @@ -""" -Forms for the forums application. 
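ForumsFeed above accepts an optional forum_slug and falls back to a combined feed when the slug is empty, with link() building URLs under /feeds/forums/. A hedged sketch of how such a class-based feed is typically wired into a Django 1.4 URLconf; the patterns and the empty-slug default shown here are assumptions, not taken from this changeset.

from django.conf.urls import patterns, url

from forums.feeds import ForumsFeed

urlpatterns = patterns('',
    # Combined feed: forum_slug is empty, so get_object() returns None.
    url(r'^feeds/forums/$', ForumsFeed(), {'forum_slug': ''}),
    # Per-forum feed, e.g. /feeds/forums/surf-music/
    url(r'^feeds/forums/(?P<forum_slug>[\w-]+)/$', ForumsFeed()),
)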
- -""" -from django import forms -from django.conf import settings - -from forums.models import Forum -from forums.models import Topic -from forums.models import Post -from forums.attachments import AttachmentProcessor -import forums.permissions as perms -from forums.signals import notify_new_topic, notify_new_post - - -class NewPostForm(forms.Form): - """Form for creating a new post.""" - body = forms.CharField(label='', - required=False, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - topic_id = forms.IntegerField(widget=forms.HiddenInput) - topic = None - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']), - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui'] + - ['js/forums.js']) - - def __init__(self, *args, **kwargs): - super(NewPostForm, self).__init__(*args, **kwargs) - attachments = args[0].getlist('attachment') if len(args) else [] - self.attach_proc = AttachmentProcessor(attachments) - - def clean_body(self): - data = self.cleaned_data['body'] - if not data and not self.attach_proc.has_attachments(): - raise forms.ValidationError("This field is required.") - return data - - def clean_topic_id(self): - id = self.cleaned_data['topic_id'] - try: - self.topic = Topic.objects.select_related().get(pk=id) - except Topic.DoesNotExist: - raise forms.ValidationError('invalid topic') - return id - - def save(self, user, ip=None): - """ - Creates a new post from the form data and supplied arguments. - """ - post = Post(topic=self.topic, user=user, body=self.cleaned_data['body'], - user_ip=ip) - post.save() - self.attach_proc.save_attachments(post) - notify_new_post(post) - return post - - -class NewTopicForm(forms.Form): - """ - Form for creating a new topic and 1st post to that topic. - Superusers and moderators can also create the topic as a sticky or initially - locked. - """ - name = forms.CharField(label='Subject', max_length=255, - widget=forms.TextInput(attrs={'size': 64})) - body = forms.CharField(label='', required=False, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - user = None - forum = None - has_mod_fields = False - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']), - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui'] + - ['js/forums.js']) - - def __init__(self, user, forum, *args, **kwargs): - super(NewTopicForm, self).__init__(*args, **kwargs) - self.user = user - self.forum = forum - - if perms.can_moderate(forum, user): - self.fields['sticky'] = forms.BooleanField(required=False) - self.fields['locked'] = forms.BooleanField(required=False) - self.has_mod_fields = True - - attachments = args[0].getlist('attachment') if len(args) else [] - self.attach_proc = AttachmentProcessor(attachments) - - # If this form is being POSTed, and the user is trying to add - # attachments, create hidden fields to list the Oembed ids. In - # case the form isn't valid, the client-side javascript will know - # which Oembed media to ask for when the form is displayed with - # errors. 
- if self.attach_proc.has_attachments(): - pks = self.attach_proc.get_ids() - self.fields['attachment'] = forms.MultipleChoiceField(label='', - widget=forms.MultipleHiddenInput(), - choices=[(v, v) for v in pks]) - - def clean_body(self): - data = self.cleaned_data['body'] - if not data and not self.attach_proc.has_attachments(): - raise forms.ValidationError("This field is required.") - return data - - def save(self, ip=None): - """ - Creates the new Topic and first Post from the form data and supplied - arguments. - """ - topic = Topic(forum=self.forum, - name=self.cleaned_data['name'], - user=self.user, - sticky=self.has_mod_fields and self.cleaned_data['sticky'], - locked=self.has_mod_fields and self.cleaned_data['locked']) - topic.save() - - post = Post(topic=topic, - user=self.user, - body=self.cleaned_data['body'], - user_ip=ip) - post.save() - - self.attach_proc.save_attachments(post) - - notify_new_topic(topic) - notify_new_post(post) - - return topic - - -class PostForm(forms.ModelForm): - """ - Form for editing an existing post or a new, non-quick post. - """ - body = forms.CharField(label='', - required=False, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - - class Meta: - model = Post - fields = ('body', ) - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']), - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui'] + - ['js/forums.js']) - - def __init__(self, *args, **kwargs): - topic_name = kwargs.pop('topic_name', None) - super(PostForm, self).__init__(*args, **kwargs) - - if topic_name is not None: # this is a "first post" - self.fields.insert(0, 'name', forms.CharField(label='Subject', - max_length=255, - widget=forms.TextInput(attrs={'size': 64}))) - self.initial['name'] = topic_name - - attachments = args[0].getlist('attachment') if len(args) else [] - self.attach_proc = AttachmentProcessor(attachments) - - # If this form is being used to edit an existing post, and that post - # has attachments, create a hidden post_id field. The client-side - # AJAX will use this as a cue to retrieve the HTML for the embedded - # media. - if 'instance' in kwargs: - post = kwargs['instance'] - if post.attachments.count(): - self.fields['post_id'] = forms.CharField(label='', - widget=forms.HiddenInput(attrs={'value': post.id})) - - def clean_body(self): - data = self.cleaned_data['body'] - if not data and not self.attach_proc.has_attachments(): - raise forms.ValidationError('This field is required.') - return data - - def save(self, *args, **kwargs): - commit = kwargs.get('commit', False) - post = super(PostForm, self).save(*args, **kwargs) - - # Are we saving a "first post"? - if 'name' in self.cleaned_data: - post.topic.name = self.cleaned_data['name'] - if commit: - post.topic.save() - return post - - -class MoveTopicForm(forms.Form): - """ - Form for a moderator to move a topic to a forum. 
- """ - forums = forms.ModelChoiceField(label='Move to forum', - queryset=Forum.objects.none()) - - def __init__(self, user, *args, **kwargs): - hide_label = kwargs.pop('hide_label', False) - required = kwargs.pop('required', True) - super(MoveTopicForm, self).__init__(*args, **kwargs) - self.fields['forums'].queryset = \ - Forum.objects.forums_for_user(user).order_by('name') - if hide_label: - self.fields['forums'].label = '' - self.fields['forums'].required = required - - -class SplitTopicForm(forms.Form): - """ - Form for a moderator to split posts from a topic to a new topic. - """ - name = forms.CharField(label='New topic title', max_length=255, - widget=forms.TextInput(attrs={'size': 64})) - forums = forms.ModelChoiceField(label='Forum for new topic', - queryset=Forum.objects.none()) - post_ids = [] - split_at = False - - def __init__(self, user, *args, **kwargs): - super(SplitTopicForm, self).__init__(*args, **kwargs) - self.fields['forums'].queryset = \ - Forum.objects.forums_for_user(user).order_by('name') - - def clean(self): - self.post_ids = self.data.getlist('post_ids') - if len(self.post_ids) == 0: - raise forms.ValidationError('Please select some posts') - - self.split_at = 'split-at' in self.data - if self.split_at and len(self.post_ids) > 1: - raise forms.ValidationError('Please select only one post to split the topic at') - - return self.cleaned_data diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/latest.py --- a/gpp/forums/latest.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,342 +0,0 @@ -""" -This module maintains the latest posts datastore. The latest posts are often -needed by RSS feeds, "latest posts" template tags, etc. This module listens for -the post_content_update signal, then bundles the post up and stores it by forum -ID in Redis. We also maintain a combined forums list. This allows quick -retrieval of the latest posts and avoids some slow SQL queries. - -We also do things like send topic notification emails, auto-favorite, and -auto-subscribe functions here rather than bog the user down in the request / -response cycle. - -""" -import datetime -import logging -import time - -from django.dispatch import receiver -from django.utils import simplejson -import redis - -from forums.signals import post_content_update, topic_content_update -from forums.models import Forum, Topic, Post -from forums.views.subscriptions import notify_topic_subscribers -from forums.tools import auto_favorite, auto_subscribe -from core.services import get_redis_connection - -# This constant controls how many latest posts per forum we store -MAX_POSTS = 50 - -# This controls how many updated topics we track -MAX_UPDATED_TOPICS = 50 - -# Redis key names: -POST_COUNT_KEY = "forums:public_post_count" -TOPIC_COUNT_KEY = "forums:public_topic_count" -UPDATED_TOPICS_SET_KEY = "forums:updated_topics:set" -UPDATED_TOPIC_KEY = "forums:updated_topics:%s" - -logger = logging.getLogger(__name__) - - -@receiver(post_content_update, dispatch_uid='forums.latest_posts') -def on_post_update(sender, **kwargs): - """ - This function is our signal handler, called when a post has been updated. - We only care about newly created posts, and ignore updates. - - We kick off a Celery task to perform work outside of the request/response - cycle. 
- - """ - # ignore non-new posts - if not kwargs['created']: - return - - # Kick off a Celery task to process this new post - forums.tasks.new_post_task.delay(sender.id) - - -def process_new_post(post_id): - """ - This function is run on a Celery task. It performs all new-post processing. - - """ - try: - post = Post.objects.select_related().get(pk=post_id) - except Post.DoesNotExist: - logger.warning("process_new_post: post %d does not exist", post_id) - return - - # selectively process posts from non-public forums - public_forums = Forum.objects.public_forum_ids() - - if post.topic.forum.id in public_forums: - conn = get_redis_connection() - _update_post_feeds(conn, post) - _update_post_count(conn, public_forums) - _update_latest_topics(conn, post) - - # send out any email notifications - notify_topic_subscribers(post, defer=False) - - # perform any auto-favorite and auto-subscribe actions for the new post - auto_favorite(post) - auto_subscribe(post) - - -def _update_post_feeds(conn, post): - """ - Updates the forum feeds we keep in Redis so that our RSS feeds are quick. - - """ - # serialize post attributes - post_content = { - 'id': post.id, - 'title': post.topic.name, - 'content': post.html, - 'author': post.user.username, - 'pubdate': int(time.mktime(post.creation_date.timetuple())), - 'forum_name': post.topic.forum.name, - 'url': post.get_absolute_url() - } - - s = simplejson.dumps(post_content) - - # store in Redis - - pipeline = conn.pipeline() - - key = 'forums:latest:%d' % post.topic.forum.id - - pipeline.lpush(key, s) - pipeline.ltrim(key, 0, MAX_POSTS - 1) - - # store in the combined feed; yes this wastes some memory storing it twice, - # but it makes things much easier - - key = 'forums:latest:*' - - pipeline.lpush(key, s) - pipeline.ltrim(key, 0, MAX_POSTS - 1) - - pipeline.execute() - - -def _update_post_count(conn, public_forums): - """ - Updates the post count we cache in Redis. Doing a COUNT(*) on the post table - can be expensive in MySQL InnoDB. - - """ - result = conn.incr(POST_COUNT_KEY) - if result == 1: - # it is likely redis got trashed, so re-compute the correct value - - count = Post.objects.filter(topic__forum__in=public_forums).count() - conn.set(POST_COUNT_KEY, count) - - -def _update_latest_topics(conn, post): - """ - Updates the "latest topics with new posts" list we cache in Redis for speed. - There is a template tag and forum view that uses this information. 
- - """ - # serialize topic attributes - topic_id = post.topic.id - topic_score = int(time.mktime(post.creation_date.timetuple())) - - topic_content = { - 'title': post.topic.name, - 'author': post.user.username, - 'date': topic_score, - 'url': post.topic.get_latest_post_url() - } - json = simplejson.dumps(topic_content) - key = UPDATED_TOPIC_KEY % topic_id - - pipeline = conn.pipeline() - pipeline.set(key, json) - pipeline.zadd(UPDATED_TOPICS_SET_KEY, topic_score, topic_id) - pipeline.zcard(UPDATED_TOPICS_SET_KEY) - results = pipeline.execute() - - # delete topics beyond our maximum count - num_topics = results[-1] - num_to_del = num_topics - MAX_UPDATED_TOPICS - if num_to_del > 0: - # get the IDs of the topics we need to delete first - start = 0 - stop = num_to_del - 1 # Redis indices are inclusive - old_ids = conn.zrange(UPDATED_TOPICS_SET_KEY, start, stop) - - keys = [UPDATED_TOPIC_KEY % n for n in old_ids] - conn.delete(*keys) - - # now delete the oldest num_to_del topics - conn.zremrangebyrank(UPDATED_TOPICS_SET_KEY, start, stop) - - -def get_latest_posts(num_posts=MAX_POSTS, forum_id=None): - """ - This function retrieves num_posts latest posts for the forum with the given - forum_id. If forum_id is None, the posts are retrieved from the combined - forums datastore. A list of dictionaries is returned. Each dictionary - contains information about a post. - - """ - key = 'forums:latest:%d' % forum_id if forum_id else 'forums:latest:*' - - num_posts = max(0, min(MAX_POSTS, num_posts)) - - if num_posts == 0: - return [] - - conn = get_redis_connection() - raw_posts = conn.lrange(key, 0, num_posts - 1) - - posts = [] - for raw_post in raw_posts: - post = simplejson.loads(raw_post) - - # fix up the pubdate; turn it back into a datetime object - post['pubdate'] = datetime.datetime.fromtimestamp(post['pubdate']) - - posts.append(post) - - return posts - - -@receiver(topic_content_update, dispatch_uid='forums.latest_posts') -def on_topic_update(sender, **kwargs): - """ - This function is our signal handler, called when a topic has been updated. - We only care about newly created topics, and ignore updates. - - We kick off a Celery task to perform work outside of the request/response - cycle. - - """ - # ignore non-new topics - if not kwargs['created']: - return - - # Kick off a Celery task to process this new post - forums.tasks.new_topic_task.delay(sender.id) - - -def process_new_topic(topic_id): - """ - This function contains new topic processing. Currently we only update the - topic count statistic. - - """ - try: - topic = Topic.objects.select_related().get(pk=topic_id) - except Topic.DoesNotExist: - logger.warning("process_new_topic: topic %d does not exist", topic_id) - return - - # selectively process topics from non-public forums - public_forums = Forum.objects.public_forum_ids() - - if topic.forum.id not in public_forums: - return - - # update the topic count statistic - conn = get_redis_connection() - - result = conn.incr(TOPIC_COUNT_KEY) - if result == 1: - # it is likely redis got trashed, so re-compute the correct value - - count = Topic.objects.filter(forum__in=public_forums).count() - conn.set(TOPIC_COUNT_KEY, count) - - -def get_stats(): - """ - This function returns the topic and post count statistics as a tuple, in - that order. If a statistic is not available, its position in the tuple will - be None. 
- - """ - try: - conn = get_redis_connection() - result = conn.mget(TOPIC_COUNT_KEY, POST_COUNT_KEY) - except redis.RedisError, e: - logger.error(e) - return (None, None) - - topic_count = int(result[0]) if result[0] else None - post_count = int(result[1]) if result[1] else None - - return (topic_count, post_count) - - -def get_latest_topic_ids(num): - """ - Return a list of topic ids from the latest topics that have posts. The ids - will be sorted from newest to oldest. - - """ - try: - conn = get_redis_connection() - result = conn.zrevrange(UPDATED_TOPICS_SET_KEY, 0, num - 1) - except redis.RedisError, e: - logger.error(e) - return [] - - return [int(n) for n in result] - - -def get_latest_topics(num): - """ - Return a list of dictionaries with information about the latest topics that - have updated posts. The topics are sorted from newest to oldest. - - """ - try: - conn = get_redis_connection() - result = conn.zrevrange(UPDATED_TOPICS_SET_KEY, 0, num - 1) - - topic_keys = [UPDATED_TOPIC_KEY % n for n in result] - json_list = conn.mget(topic_keys) if topic_keys else [] - - except redis.RedisError, e: - logger.error(e) - return [] - - topics = [] - for s in json_list: - item = simplejson.loads(s) - item['date'] = datetime.datetime.fromtimestamp(item['date']) - topics.append(item) - - return topics - - -def notify_topic_delete(topic): - """ - This function should be called when a topic is deleted. It will remove the - topic from the updated topics set, if present, and delete any info we have - about the topic. - - Note we don't do anything like this for posts. Since they just populate RSS - feeds we'll let them 404. The updated topic list is seen in a prominent - template tag however, so it is a bit more important to get that cleaned up. - - """ - try: - conn = get_redis_connection() - pipeline = conn.pipeline() - pipeline.zrem(UPDATED_TOPICS_SET_KEY, topic.id) - pipeline.delete(UPDATED_TOPIC_KEY % topic.id) - pipeline.execute() - except redis.RedisError, e: - logger.error(e) - - -# Down here to avoid a circular import -import forums.tasks diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/management/commands/forum_cleanup.py --- a/gpp/forums/management/commands/forum_cleanup.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,26 +0,0 @@ -""" -forum_cleanup.py - A management command to cleanup forum model objects. Right -now this entails deleting old forum and topic last visit records. - -""" -import datetime - -from django.core.management.base import NoArgsCommand, CommandError - -from forums.models import ForumLastVisit, TopicLastVisit -import forums.unread - - -class Command(NoArgsCommand): - help = "This command deletes old forum and topic last visit records." - - def handle_noargs(self, **opts): - - now = datetime.datetime.now() - threshold = now - forums.unread.THRESHOLD * 2 - - # delete old topic last visit records - TopicLastVisit.objects.filter(last_visit__lt=threshold).delete() - - # delete old forum visit records - ForumLastVisit.objects.filter(end_date__lt=threshold).delete() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/management/commands/sync_forums.py --- a/gpp/forums/management/commands/sync_forums.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,42 +0,0 @@ -""" -sync_forums.py - A management command to synchronize the forums by recomputing -the de-normalized fields in the forum and topic objects. 
- -""" -import optparse - -from django.core.management.base import NoArgsCommand, CommandError - -from forums.models import Forum -from forums.models import Topic - - -class Command(NoArgsCommand): - help = """\ -This command synchronizes the forum application's forums and topic objects -by updating their de-normalized fields. -""" - option_list = NoArgsCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 50 topics to show progress"), - ) - - def handle_noargs(self, **opts): - - show_progress = opts.get('progress', False) or False - - n = 0 - for topic in Topic.objects.iterator(): - topic.post_count_update() - topic.save() - n += 1 - if n % 50 == 0: - self.stdout.write('.') - self.stdout.flush() - - for forum in Forum.objects.all(): - forum.sync() - forum.save() - - self.stdout.write('\n') - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/models.py --- a/gpp/forums/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,420 +0,0 @@ -""" -Models for the forums application. -""" -import datetime - -from django.db import models -from django.db.models import Q -from django.contrib.auth.models import User, Group -from django.core.cache import cache - -from core.markup import site_markup -from oembed.models import Oembed - - -class Category(models.Model): - """ - Forums belong to a category, whose access may be assigned to groups. - """ - name = models.CharField(max_length=80) - slug = models.SlugField(max_length=80) - position = models.IntegerField(blank=True, default=0) - groups = models.ManyToManyField(Group, blank=True, null=True, - help_text="If groups are assigned to this category, only members" \ - " of those groups can view this category.") - - class Meta: - ordering = ('position', ) - verbose_name_plural = 'Categories' - - def __unicode__(self): - return self.name - - -class ForumManager(models.Manager): - """ - The manager for the Forum model. Provides a centralized place to - put commonly used and useful queries. - """ - - def forums_for_user(self, user): - """ - Returns a queryset containing the forums that the given user can - "see" due to authenticated status, superuser status and group membership. - """ - qs = self._for_user(user) - return qs.select_related('category', 'last_post', 'last_post__user') - - def forum_ids_for_user(self, user): - """Returns a list of forum IDs that the given user can "see".""" - qs = self._for_user(user) - return qs.values_list('id', flat=True) - - def public_forums(self): - """Returns a queryset containing the public forums.""" - return self.filter(category__groups__isnull=True) - - def public_forum_ids(self): - """ - Returns a list of ids for the public forums; the list is cached for - performance. - """ - public_forums = cache.get('public_forum_ids') - if public_forums is None: - public_forums = list(self.filter( - category__groups__isnull=True).values_list('id', flat=True)) - cache.set('public_forum_ids', public_forums, 3600) - return public_forums - - def _for_user(self, user): - """Common code for the xxx_for_user() methods.""" - if user.is_superuser: - qs = self.all() - else: - user_groups = user.groups.all() if user.is_authenticated() else [] - qs = self.filter(Q(category__groups__isnull=True) | - Q(category__groups__in=user_groups)) - return qs - - -class Forum(models.Model): - """ - A forum is a collection of topics. 
- """ - category = models.ForeignKey(Category, related_name='forums') - name = models.CharField(max_length=80) - slug = models.SlugField(max_length=80) - description = models.TextField(blank=True, default='') - position = models.IntegerField(blank=True, default=0) - moderators = models.ManyToManyField(Group, blank=True, null=True) - - # denormalized fields to reduce database hits - topic_count = models.IntegerField(blank=True, default=0) - post_count = models.IntegerField(blank=True, default=0) - last_post = models.OneToOneField('Post', blank=True, null=True, - related_name='parent_forum') - - objects = ForumManager() - - class Meta: - ordering = ('position', ) - - def __unicode__(self): - return self.name - - @models.permalink - def get_absolute_url(self): - return ('forums-forum_index', [self.slug]) - - def topic_count_update(self): - """Call to notify the forum that its topic count has been updated.""" - self.topic_count = Topic.objects.filter(forum=self).count() - - def post_count_update(self): - """Call to notify the forum that its post count has been updated.""" - my_posts = Post.objects.filter(topic__forum=self) - self.post_count = my_posts.count() - if self.post_count > 0: - self.last_post = my_posts[self.post_count - 1] - else: - self.last_post = None - - def sync(self): - """ - Call to notify the forum that it needs to recompute its - denormalized fields. - """ - self.topic_count_update() - self.post_count_update() - - def last_post_pre_delete(self, deleting_topic=False): - """ - Call this function prior to deleting the last post in the forum. - A new last post will be found, if one exists. - This is to avoid the Django cascading delete issue. - If deleting_topic is True, then the whole topic the last post is - part of is being deleted, so we can't pick a new last post from that - topic. - """ - try: - qs = Post.objects.filter(topic__forum=self) - if deleting_topic: - qs = qs.exclude(topic=self.last_post.topic) - else: - qs = qs.exclude(pk=self.last_post.pk) - - self.last_post = qs.latest() - - except Post.DoesNotExist: - self.last_post = None - - def catchup(self, user, flv=None): - """ - Call to mark this forum all caught up for the given user (i.e. mark all topics - read for this user). - """ - TopicLastVisit.objects.filter(user=user, topic__forum=self).delete() - if flv is None: - try: - flv = ForumLastVisit.objects.get(user=user, forum=self) - except ForumLastVisit.DoesNotExist: - flv = ForumLastVisit(user=user, forum=self) - - now = datetime.datetime.now() - flv.begin_date = now - flv.end_date = now - flv.save() - - -class Topic(models.Model): - """ - A topic is a thread of discussion, consisting of a series of posts. 
- """ - forum = models.ForeignKey(Forum, related_name='topics') - name = models.CharField(max_length=255) - creation_date = models.DateTimeField(db_index=True) - user = models.ForeignKey(User) - view_count = models.IntegerField(blank=True, default=0) - sticky = models.BooleanField(blank=True, default=False) - locked = models.BooleanField(blank=True, default=False) - subscribers = models.ManyToManyField(User, related_name='subscriptions', - verbose_name='subscribers', blank=True) - bookmarkers = models.ManyToManyField(User, related_name='favorite_topics', - verbose_name='bookmarkers', blank=True) - - # denormalized fields to reduce database hits - post_count = models.IntegerField(blank=True, default=0) - update_date = models.DateTimeField(db_index=True) - last_post = models.OneToOneField('Post', blank=True, null=True, - related_name='parent_topic') - - class Meta: - ordering = ('-sticky', '-update_date', ) - - def __unicode__(self): - return self.name - - @models.permalink - def get_absolute_url(self): - return ('forums-topic_index', [self.pk]) - - @models.permalink - def get_latest_post_url(self): - return ('forums-topic_latest', [self.pk]) - - def post_count_update(self): - """ - Call this function to notify the topic instance that its post count - has changed. - """ - my_posts = Post.objects.filter(topic=self) - self.post_count = my_posts.count() - if self.post_count > 0: - self.last_post = my_posts[self.post_count - 1] - self.update_date = self.last_post.creation_date - else: - self.last_post = None - self.update_date = self.creation_date - - def reply_count(self): - """ - Returns the number of replies to a topic. The first post - doesn't count as a reply. - """ - if self.post_count > 1: - return self.post_count - 1 - return 0 - - def save(self, *args, **kwargs): - if not self.id: - now = datetime.datetime.now() - self.creation_date = now - self.update_date = now - - super(Topic, self).save(*args, **kwargs) - - def last_post_pre_delete(self): - """ - Call this function prior to deleting the last post in the topic. - A new last post will be found, if one exists. - This is to avoid the Django cascading delete issue. - """ - try: - self.last_post = \ - Post.objects.filter(topic=self).exclude(pk=self.last_post.pk).latest() - except Post.DoesNotExist: - self.last_post = None - - def search_title(self): - if self.post_count == 1: - post_text = "(1 post)" - else: - post_text = "(%d posts)" % self.post_count - - return u"%s by %s; %s" % (self.name, self.user.username, post_text) - - def search_summary(self): - return u'' - - def ogp_tags(self): - """ - Returns a dict of Open Graph Protocol meta tags. - - """ - desc = 'Forum topic created by %s on %s.' % ( - self.user.username, - self.creation_date.strftime('%B %d, %Y')) - - return { - 'og:title': self.name, - 'og:type': 'article', - 'og:url': self.get_absolute_url(), - 'og:description': desc, - } - - -class Post(models.Model): - """ - A post is an instance of a user's single contribution to a topic. 
- """ - topic = models.ForeignKey(Topic, related_name='posts') - user = models.ForeignKey(User, related_name='posts') - creation_date = models.DateTimeField(db_index=True) - update_date = models.DateTimeField(db_index=True) - body = models.TextField() - html = models.TextField() - user_ip = models.IPAddressField(blank=True, default='', null=True) - attachments = models.ManyToManyField(Oembed, through='Attachment') - - class Meta: - ordering = ('creation_date', ) - get_latest_by = 'creation_date' - verbose_name = 'forum post' - verbose_name_plural = 'forum posts' - - @models.permalink - def get_absolute_url(self): - return ('forums-goto_post', [self.pk]) - - def summary(self): - limit = 65 - if len(self.body) < limit: - return self.body - return self.body[:limit] + '...' - - def __unicode__(self): - return self.summary() - - def save(self, *args, **kwargs): - if not self.id: - self.creation_date = datetime.datetime.now() - self.update_date = self.creation_date - - self.html = site_markup(self.body) - super(Post, self).save(*args, **kwargs) - - def delete(self, *args, **kwargs): - first_post_id = self.topic.posts.all()[0].id - super(Post, self).delete(*args, **kwargs) - if self.id == first_post_id: - self.topic.delete() - - def has_been_edited(self): - return self.update_date > self.creation_date - - def touch(self): - """Call this function to indicate the post has been edited.""" - self.update_date = datetime.datetime.now() - - def search_title(self): - return u"%s by %s" % (self.topic.name, self.user.username) - - def search_summary(self): - return self.body - - -class FlaggedPost(models.Model): - """This model represents a user flagging a post as inappropriate.""" - user = models.ForeignKey(User) - post = models.ForeignKey(Post) - flag_date = models.DateTimeField(auto_now_add=True) - - def __unicode__(self): - return u'Post ID %s flagged by %s' % (self.post.id, self.user.username) - - class Meta: - ordering = ('flag_date', ) - - def get_post_url(self): - return 'Post' % self.post.get_absolute_url() - get_post_url.allow_tags = True - - -class ForumLastVisit(models.Model): - """ - This model records the last time a user visited a forum. - It is used to compute if a user has unread topics in a forum. - We keep track of a window of time, delimited by begin_date and end_date. - Topics updated within this window are tracked, and may have TopicLastVisit - objects. - Marking a forum as all read sets the begin_date equal to the end_date. - """ - user = models.ForeignKey(User) - forum = models.ForeignKey(Forum) - begin_date = models.DateTimeField() - end_date = models.DateTimeField() - - class Meta: - unique_together = ('user', 'forum') - ordering = ('-end_date', ) - - def __unicode__(self): - return u'Forum: %d User: %d Date: %s' % (self.forum.id, self.user.id, - self.end_date.strftime('%Y-%m-%d %H:%M:%S')) - - def is_caught_up(self): - return self.begin_date == self.end_date - - -class TopicLastVisit(models.Model): - """ - This model records the last time a user read a topic. - Objects of this class exist for the window specified in the - corresponding ForumLastVisit object. 
- """ - user = models.ForeignKey(User) - topic = models.ForeignKey(Topic) - last_visit = models.DateTimeField(db_index=True) - - class Meta: - unique_together = ('user', 'topic') - ordering = ('-last_visit', ) - - def __unicode__(self): - return u'Topic: %d User: %d Date: %s' % (self.topic.id, self.user.id, - self.last_visit.strftime('%Y-%m-%d %H:%M:%S')) - - def save(self, *args, **kwargs): - if self.last_visit is None: - self.touch() - super(TopicLastVisit, self).save(*args, **kwargs) - - def touch(self): - self.last_visit = datetime.datetime.now() - - -class Attachment(models.Model): - """ - This model is a "through" table for the M2M relationship between forum - posts and Oembed objects. - """ - post = models.ForeignKey(Post) - embed = models.ForeignKey(Oembed) - order = models.IntegerField() - - class Meta: - ordering = ('order', ) - - def __unicode__(self): - return u'Post %d, %s' % (self.post.pk, self.embed.title) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/permissions.py --- a/gpp/forums/permissions.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,114 +0,0 @@ -""" -This module does permissions checking for the forums application. - -""" -from django.core.cache import cache - -# How long (in secs) to cache group information for various entities: -CATEGORY_TIMEOUT = 4 * 60 * 60 -FORUM_TIMEOUT = 4 * 60 * 60 -USER_TIMEOUT = 15 * 60 - - -def can_access(category, user): - """ - This function returns True if the given user can access the forum category - and False otherwise. - - """ - if user.is_superuser: - return True - - # If this category has no groups assigned to it, return True. Else, return - # True if the user belongs to a group that has been assigned to this - # category, and False otherwise. - - # Get the groups assigned to this category. - cat_groups = get_category_groups(category) - - if len(cat_groups) == 0: - return True # No groups => public category - - user_groups = get_user_groups(user) - return bool(user_groups & cat_groups) - - -def can_moderate(forum, user): - """ - Returns True if the user can moderate the forum. - - """ - # Get the simple cases out of the way first: - if not user.is_authenticated(): - return False - elif user.is_superuser: - return True - - # If we get here, we have to see if there is an intersection between the - # user's groups and the forum's moderator groups. - - forum_groups = get_forum_groups(forum) - user_groups = get_user_groups(user) - - return bool(user_groups & forum_groups) - - -def can_post(topic, user): - """ - Returns True if the user can post in the topic and False otherwise. - - """ - if not user.is_authenticated(): - return False - if user.is_superuser or can_moderate(topic.forum, user): - return True - - return not topic.locked and can_access(topic.forum.category, user) - - -def get_user_groups(user): - """ - Returns a set of group ID's that the user belongs to. - - """ - user_groups_key = '%s_groups' % user.username - return _get_groups(user_groups_key, user.groups.all(), USER_TIMEOUT) - - -def get_forum_groups(forum): - """ - Returns a set of group ID's of the forum's moderator groups. - - """ - forum_groups_key = 'forum_%d_mods' % forum.id - return _get_groups(forum_groups_key, forum.moderators.all(), FORUM_TIMEOUT) - - -def get_category_groups(category): - """ - Returns a set of group ID's of the groups that can access this forum - category. 
- - """ - cat_groups_key = 'cat_%d_groups' % category.id - return _get_groups(cat_groups_key, category.groups.all(), CATEGORY_TIMEOUT) - - -def _get_groups(key, qs, timeout): - """ - This internal function contains the code common to the get_xxx_groups() - functions. Returns a set of group ID's from the cache. If the set is not - found in the cache, the set is generated from the queryset qs and cached - with the given timeout. - - key - the cache key for the set of group ID's - qs - the query set of groups to query if the set is not in the cache - timeout - the cache timeout to use - - """ - groups = cache.get(key) - if groups is None: - groups = set([g.id for g in qs]) - cache.set(key, groups, timeout) - - return groups diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/search_indexes.py --- a/gpp/forums/search_indexes.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,60 +0,0 @@ -"""Haystack search index for the weblinks application.""" -from haystack.indexes import * -from haystack import site -from custom_search.indexes import CondQueuedSearchIndex - -from forums.models import Forum, Topic, Post -from forums.signals import topic_content_update, post_content_update - - -class TopicIndex(CondQueuedSearchIndex): - text = CharField(document=True, use_template=True) - author = CharField(model_attr='user') - pub_date = DateTimeField(model_attr='creation_date') - - def index_queryset(self): - return Topic.objects.filter(forum__in=Forum.objects.public_forum_ids()) - - def get_updated_field(self): - return 'update_date' - - def _setup_save(self, model): - topic_content_update.connect(self.enqueue_save) - - def _teardown_save(self, model): - topic_content_update.disconnect(self.enqueue_save) - - def enqueue_save(self, sender, **kwargs): - return self.enqueue('update', sender) - - def can_index(self, instance): - return instance.forum.id in Forum.objects.public_forum_ids() - - -class PostIndex(CondQueuedSearchIndex): - text = CharField(document=True, use_template=True) - author = CharField(model_attr='user') - pub_date = DateTimeField(model_attr='creation_date') - - def index_queryset(self): - return Post.objects.filter( - topic__forum__in=Forum.objects.public_forum_ids()) - - def get_updated_field(self): - return 'update_date' - - def _setup_save(self, model): - post_content_update.connect(self.enqueue_save) - - def _teardown_save(self, model): - post_content_update.disconnect(self.enqueue_save) - - def enqueue_save(self, sender, **kwargs): - return self.enqueue('update', sender) - - def can_index(self, instance): - return instance.topic.forum.id in Forum.objects.public_forum_ids() - - -site.register(Topic, TopicIndex) -site.register(Post, PostIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/signals.py --- a/gpp/forums/signals.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,114 +0,0 @@ -""" -Signal handlers & signals for the forums application. 
- -""" -from django.db.models.signals import post_save -from django.db.models.signals import post_delete -import django.dispatch - -from forums.models import Forum, Topic, Post - - -def on_topic_save(sender, **kwargs): - if kwargs['created']: - topic = kwargs['instance'] - topic.forum.topic_count_update() - topic.forum.save() - - -def on_topic_delete(sender, **kwargs): - topic = kwargs['instance'] - topic.forum.topic_count_update() - topic.forum.save() - forums.latest.notify_topic_delete(topic) - - -def on_post_save(sender, **kwargs): - if kwargs['created']: - post = kwargs['instance'] - - # update the topic - post.topic.post_count_update() - post.topic.save() - - # update the forum - post.topic.forum.post_count_update() - post.topic.forum.save() - - -def on_post_delete(sender, **kwargs): - post = kwargs['instance'] - - # update the topic - try: - post.topic.post_count_update() - post.topic.save() - except Topic.DoesNotExist: - pass - else: - # update the forum - try: - post.topic.forum.post_count_update() - post.topic.forum.save() - except Forum.DoesNotExist: - pass - - -post_save.connect(on_topic_save, sender=Topic, dispatch_uid='forums.signals') -post_delete.connect(on_topic_delete, sender=Topic, dispatch_uid='forums.signals') - -post_save.connect(on_post_save, sender=Post, dispatch_uid='forums.signals') -post_delete.connect(on_post_delete, sender=Post, dispatch_uid='forums.signals') - - -# Signals for the forums application. -# -# This signal is sent when a topic has had its textual content (title) changed. -# The provided arguments are: -# sender - the topic model instance -# created - True if the topic is new, False if updated - -topic_content_update = django.dispatch.Signal(providing_args=['created']) - -# This signal is sent when a post has had its textual content (body) changed. -# The provided arguments are: -# sender - the post model instance -# created - True if the post is new, False if updated - -post_content_update = django.dispatch.Signal(providing_args=['created']) - - -def notify_new_topic(topic): - """ - Sends the topic_content_update signal for a new topic instance. - - """ - topic_content_update.send_robust(topic, created=True) - - -def notify_updated_topic(topic): - """ - Sends the topic_content_update signal for an updated topic instance. - - """ - topic_content_update.send_robust(topic, created=False) - - -def notify_new_post(post): - """ - Sends the post_content_update signal for a new post instance. - - """ - post_content_update.send_robust(post, created=True) - - -def notify_updated_post(post): - """ - Sends the post_content_update signal for an updated post instance. 
- - """ - post_content_update.send_robust(post, created=False) - - -# Avoid circular imports -import forums.latest diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/static/js/forums.js --- a/gpp/forums/static/js/forums.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,297 +0,0 @@ -$(document).ready(function() { - var postText = $('#id_body'); - var postButton = $('#forums-reply-post'); - postButton.click(function () { - var text = $.trim(postText.val()); - $(this).attr('disabled', 'disabled').val('Posting reply...'); - - var attachments = new Array() - $('#attachment div input').each(function(index) { - attachments[index] = $(this).val(); - }); - - $.ajax({ - url: '/forums/quick-reply/', - type: 'POST', - data: { - body : postText.val(), - topic_id : $('#id_topic_id').val(), - attachment : attachments - }, - traditional: true, - dataType: 'html', - success: function (data, textStatus) { - postText.val(''); - var lastTr = $('#forum-topic tr:last'); - var newClass = lastTr.hasClass('odd') ? 'even' : 'odd'; - lastTr.after(data); - lastTr = $('#forum-topic tr:last'); - lastTr.addClass(newClass); - lastTr.hide(); - lastTr.fadeIn(3000); - postButton.removeAttr('disabled').val('Submit Reply'); - initAttachments(); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - postButton.removeAttr('disabled').val('Submit Reply'); - initAttachments(); - } - }); - return false; - }); - $('a.post-flag').click(function () { - var id = this.id; - if (id.match(/fp-(\d+)/)) { - id = RegExp.$1; - if (confirm('Only flag a post if you feel it is spam, abuse, violates site rules, ' + - 'or is not appropriate. ' + - 'A moderator will be notified and will review the post. ' + - 'Are you sure you want to flag this post?')) { - $.ajax({ - url: '/forums/flag-post/', - type: 'POST', - data: {id: id}, - dataType: 'text', - success: function (response, textStatus) { - alert(response); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); - } - }); - } - } - return false; - }); - $('a.post-delete').click(function () { - var id = this.id; - if (id.match(/dp-(\d+)/)) { - id = RegExp.$1; - if (confirm('Are you sure you want to delete this post?')) { - $.ajax({ - url: '/forums/delete-post/', - type: 'POST', - data: {id: id}, - dataType: 'text', - success: function (response, textStatus) { - alert(response); - $('#post-' + id).fadeOut(3000); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); - } - }); - } - } - return false; - }); - $('#forum-mod-del-topic').click(function () { - return confirm('Are you sure you want to delete this topic?\n' + - 'WARNING: all posts will be lost.'); - }); - - var vid = 0; - var vidDiv = $('#attachment'); - - function clearAttachments() - { - $('#attachment div').remove(); - $('#attach-another').remove(); - } - - function processEmbeds(data, textStatus) - { - vidDiv.find('img').remove(); - $.each(data, function(index, value) { - var html = '
' + value.html + - '' + - 'Remove ' + - 'Remove' + - ''; - '
'; - vidDiv.append(html); - $('#video-' + index + ' a').click(function() { - $('#video-' + index).remove(); - relabelAttachLink(); - return false; - }); - }); - vid = data.length; - $('#video-' + (vid-1)).after('Attach another video'); - $('#attach-another').click(function() { - addVideo(); - relabelAttachLink(); - return false; - }); - } - - function initAttachments() - { - clearAttachments(); - - var post_input = $('#id_post_id'); - var attachments = $("#forums_post_form input:hidden[name='attachment']"); - if (post_input.length == 1) - { - post_id = post_input.val(); - vidDiv.prepend('Busy'); - $.ajax({ - url: '/forums/fetch_attachments/', - type: 'GET', - data: { - pid : post_id - }, - dataType: 'json', - success: processEmbeds, - error: function (xhr, textStatus, ex) { - vidDiv.find('img').remove(); - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - } - }); - } - else if (attachments.length > 0) - { - vidDiv.prepend('Busy'); - var embeds = new Array(); - attachments.each(function(index) { - embeds[index] = $(this).val(); - }); - attachments.remove(); - $.ajax({ - url: '/oembed/fetch_saved/', - type: 'GET', - data: { - embeds: embeds - }, - traditional: true, - dataType: 'json', - success: processEmbeds, - error: function (xhr, textStatus, ex) { - vidDiv.find('img').remove(); - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - } - }); - } - else - { - vid = 0; - var s = '
' + - 'Add ' + - 'Attach Video
'; - vidDiv.prepend(s); - $('#attachment a').click(function () { - $('#init-add').remove(); - addVideo(); - return false; - }); - } - } - - function relabelAttachLink() - { - var another = $('#attach-another'); - var n = $('#attachment div').length; - if (n == 0) - { - another.html("Attach a video"); - } - else - { - another.html("Attach another video"); - } - } - - function addVideo() - { - var id = "video-" + vid; - - var fakeForm = '
' + - 'Attach ' + - ' ' + - 'Remove
'; - - var n = $('#attachment div').length; - - var another = $('#attach-another'); - if (n == 0) - { - if (another.length > 0) - { - another.before(fakeForm); - } - else - { - vidDiv.append(fakeForm); - } - } - else - { - $('#attachment div:last').after(fakeForm); - } - - $('#' + id + ' a').click(function() { - $('#' + id).remove(); - relabelAttachLink(); - return false; - }); - - var vidText = $('#' + id + ' input'); - - $('#' + id + ' button').click(function() { - var button = $(this); - button.attr('disabled', 'disabled'); - $.ajax({ - url: '/oembed/fetch/', - type: 'POST', - data: { - q : vidText.val() - }, - dataType: 'json', - success: function (data, textStatus) { - $('#' + id + " .r").remove(); - var myDiv = $('#' + id); - var html = '' + - 'Remove ' + - 'Remove' + - ''; - myDiv.prepend(html); - myDiv.prepend(data.embed); - $('#' + id + ' a').click(function() { - myDiv.remove(); - relabelAttachLink(); - return false; - }); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - button.removeAttr('disabled'); - } - }); - }); - - if (vid == 0) - { - $('#video-0').after('Attach another video'); - $('#attach-another').click(function() { - addVideo(); - relabelAttachLink(); - return false; - }); - } - ++vid; - } - - initAttachments(); - - $('div.forum-post-body img').fadeIn('fast', function() { - var pic = $(this); - if (pic.width() > 720) { - pic.css('width', '720px'); - } - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/static/js/forums_mod.js --- a/gpp/forums/static/js/forums_mod.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,15 +0,0 @@ -$(document).ready(function() { - var master = $('#forums-master-topic'); - var topics = $('.forums-topic_check'); - master.click(function() { - var state = this.checked; - topics.each(function() { - this.checked = state; - }); - }); - topics.click(function() { - if (master[0].checked && !this.checked) { - master[0].checked = false; - } - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/tasks.py --- a/gpp/forums/tasks.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,25 +0,0 @@ -""" -Celery tasks for the forums application. - -""" -from celery.task import task - -import forums.latest - - -@task -def new_post_task(post_id): - """ - This task performs new post processing on a Celery task. - - """ - forums.latest.process_new_post(post_id) - - -@task -def new_topic_task(topic_id): - """ - This task performs new topic processing on a Celery task. - - """ - forums.latest.process_new_topic(topic_id) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/templatetags/forum_tags.py --- a/gpp/forums/templatetags/forum_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,203 +0,0 @@ -""" -Template tags for the forums application. 
-""" -import datetime - -from pytz import timezone -from django import template -from django.conf import settings -from django.core.cache import cache -from django.contrib.auth.models import User - -from forums.models import Forum -from forums.models import Topic -from forums.models import Post -from forums.models import Category -from forums.latest import get_stats, get_latest_topics - - -register = template.Library() - -TIME_FMT_24 = "%H:%M" -TIME_FMT_12 = "%I:%M %p" - -DATE_FMT = "%b %d %Y" -SHORT_DATE_FMT = "%b %d" - -DATE_FMT_24 = ( - "%s %s" % (DATE_FMT, TIME_FMT_24), # long format - "%s %s" % (TIME_FMT_24, SHORT_DATE_FMT), # short format -) -DATE_FMT_12 = ( - "%s %s" % (DATE_FMT, TIME_FMT_12), # long format - "%s %s" % (TIME_FMT_12, SHORT_DATE_FMT), # short format -) - -SERVER_TZ = timezone(settings.TIME_ZONE) - - -@register.inclusion_tag('forums/last_post_info.html', takes_context=True) -def last_post_info(context, post): - return { - 'post': post, - 'STATIC_URL': context['STATIC_URL'], - 'user': context['user'], - } - - -@register.inclusion_tag('forums/pagination.html') -def forum_page_navigation(page): - return {'page': page} - - -@register.inclusion_tag('forums/post_edit_button.html') -def post_edit_button(post, user, can_moderate): - show_button = post.user.id == user.id or can_moderate - return { - 'post': post, - 'show_button': show_button, - 'STATIC_URL': settings.STATIC_URL, - } - - -def get_time_prefs(user): - """ - Return the supplied user's time preferences in the form of a 2-tuple: - (use_24_time, time_zone_name) - - These preferences are cached to reduce database hits. - - """ - cache_key = '%s_tz_prefs' % user.username - tz_prefs = cache.get(cache_key) - if tz_prefs is None: - profile = user.get_profile() - tz_prefs = profile.use_24_time, profile.time_zone - cache.set(cache_key, tz_prefs) - - return tz_prefs - - -@register.simple_tag -def current_forum_time(user): - """ - This tag displays the current forum time, adjusted by the user's - time zone preferences. - """ - curr_time = SERVER_TZ.localize(datetime.datetime.now()) - - if user.is_authenticated(): - tz_prefs = get_time_prefs(user) - user_tz = timezone(tz_prefs[1]) - curr_time = curr_time.astimezone(user_tz) - fmt = TIME_FMT_24 if tz_prefs[0] else TIME_FMT_12 - else: - fmt = TIME_FMT_12 - - return '
The current time is %s. All times shown are %s.
' % ( - curr_time.strftime(fmt), curr_time.strftime('%Z%z')) - - -@register.simple_tag -def forum_date(date, user, long_format=True): - """ - This tag displays an arbitrary datetime, adjusted by the user's - time zone preferences. - """ - fmt_index = 0 if long_format else 1 - - date = SERVER_TZ.localize(date) - if user.is_authenticated(): - tz_prefs = get_time_prefs(user) - user_tz = timezone(tz_prefs[1]) - date = date.astimezone(user_tz) - fmt = DATE_FMT_24 if tz_prefs[0] else DATE_FMT_12 - else: - fmt = DATE_FMT_12 - - return date.strftime(fmt[fmt_index]) - - -@register.inclusion_tag('forums/show_form.html') -def show_form(legend_text, form, submit_value, is_ajax): - """ - This tag displays the common HTML for a forum form. - """ - return { - 'legend_text': legend_text, - 'form': form, - 'submit_value': submit_value, - 'is_ajax': is_ajax, - 'STATIC_URL': settings.STATIC_URL, - } - - -@register.inclusion_tag('forums/new_posts_tag.html') -def new_posts(): - """ - This tag displays the topics that have the newest posts. - Only the "public" forums are displayed. - """ - return { - 'topics': get_latest_topics(20), - } - - -@register.inclusion_tag('forums/forum_stats_tag.html') -def forum_stats(): - """ - Displays forum statistics. - """ - topic_count, post_count = get_stats() - - return { - 'topic_count': topic_count, - 'post_count': post_count, - } - - -@register.inclusion_tag('forums/topic_icons_tag.html') -def topic_icons(topic): - """Displays the "unread", "sticky", and "locked" icons for a given topic.""" - return { - 'topic': topic, - 'STATIC_URL': settings.STATIC_URL, - } - - -@register.inclusion_tag('forums/topic_page_range_tag.html') -def topic_page_range(topic): - """Displays the page range links for a topic.""" - return { - 'topic': topic, - } - - -@register.inclusion_tag('forums/navigation_tag.html') -def forum_navigation(obj, subtitle=None): - """ - Generates forum navigation links based upon the arguments passed. - If obj is: - * a string: Index >> String Text - * a forum: Index >> Forum Name - * a topic: Index >> Forum Name >> Topic Name - - If the optional subtitle argument is passed, it is assumed to be - a string, and is added as one more "level" in the navigation. - - """ - nav_list = [] - - if isinstance(obj, str) or isinstance(obj, unicode): - nav_list.append(dict(name=obj, url=None)) - elif isinstance(obj, Forum): - nav_list.append(dict(name=obj.name, url=obj.get_absolute_url())) - elif isinstance(obj, Topic): - forum = obj.forum - nav_list.append(dict(name=forum.name, url=forum.get_absolute_url())) - nav_list.append(dict(name=obj.name, url=obj.get_absolute_url())) - - if subtitle: - nav_list.append(dict(name=subtitle, url=None)) - - return dict(nav_list=nav_list, STATIC_URL=settings.STATIC_URL) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/tests/__init__.py --- a/gpp/forums/tests/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -from view_tests import * diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/tests/view_tests.py --- a/gpp/forums/tests/view_tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,135 +0,0 @@ -""" -Tests for the views in the forums application. 
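The current_forum_time and forum_date tags above convert the server's naive local time into the viewer's preferred zone with pytz. A minimal standalone sketch of that conversion, assuming example zone names rather than the site's actual settings:

import datetime
from pytz import timezone

SERVER_TZ = timezone('US/Central')    # assumed example; the real zone comes from settings.TIME_ZONE
user_tz = timezone('Europe/London')   # assumed example of a user's profile preference

# localize() attaches the server zone to a naive datetime without shifting it;
# astimezone() then converts the aware value into the user's zone for display.
naive_now = datetime.datetime(2012, 5, 5, 17, 10)
server_now = SERVER_TZ.localize(naive_now)
user_now = server_now.astimezone(user_tz)

print(user_now.strftime('%I:%M %p %Z%z'))   # e.g. 11:10 PM BST+0100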
- -""" -from django.test import TestCase -from django.contrib.auth.models import User -from django.core.urlresolvers import reverse - -from forums.models import Forum, Topic, Post - - -class ForumPostTestCase(TestCase): - fixtures = ['forums.json'] - - def setUp(self): - self.username = 'test_user' - self.pw = 'password' - self.user = User.objects.create_user(self.username, '', self.pw) - self.user.save() - self.assertTrue(self.client.login(username=self.username, - password=self.pw)) - - def tearDown(self): - self.client.logout() - - def testBasicForumsTest(self): - - forum_slug = 'shallow-end' - topic_name = 'A test topic' - topic_body = 'testing 1, 2, 3...' - - response = self.client.post( - reverse('forums-new_topic', kwargs={'slug': forum_slug}), - {'name': topic_name, 'body': topic_body}, - follow=True) - - self.assertEqual(len(response.redirect_chain), 1) - - if response.redirect_chain: - self.assertEqual(response.redirect_chain[0][0], - 'http://testserver' + reverse('forums-new_topic_thanks', - kwargs={'tid': '1'})) - self.assertEqual(response.redirect_chain[0][1], 302) - - self.assertEqual(response.status_code, 200) - - forum = Forum.objects.get(slug=forum_slug) - try: - topic = Topic.objects.get(pk=1) - except Topic.DoesNotExist: - self.fail("topic doesn't exist") - - self.assertEqual(topic.forum.pk, forum.pk) - self.assertEqual(topic.user.pk, self.user.pk) - self.assertEqual(topic.name, topic_name) - self.assertEqual(topic.post_count, 1) - - post = topic.last_post - self.failIf(post is None) - - if post: - self.assertEqual(post.body, topic_body) - self.assertEqual(post.user.pk, self.user.pk) - - # post to the thread - response = self.client.get( - reverse('forums-topic_index', kwargs={'id': '1'})) - self.assertEqual(response.status_code, 200) - - post2_body = 'test quick post' - response = self.client.post( - reverse('forums-quick_reply'), - {'body': post2_body, 'topic_id': 1}) - self.assertEqual(response.status_code, 200) - try: - topic = Topic.objects.get(pk=1) - except Topic.DoesNotExist: - self.fail("topic doesn't exist") - - post = topic.last_post - self.failIf(post is None) - if post: - self.assertEqual(post.body, post2_body) - self.assertEqual(post.user.pk, self.user.pk) - self.assertEqual(topic.post_count, 2) - - # quote last post - response = self.client.get( - reverse('forums-new_post', kwargs={'topic_id': 1}), - {'quote_id': 2}) - self.assertEqual(response.status_code, 200) - - post3_body = 'new post 3 content' - response = self.client.post( - reverse('forums-new_post', kwargs={'topic_id': 1}), - {'body': post3_body, 'post_id': 2}, - follow=True) - self.assertEqual(response.status_code, 200) - try: - topic = Topic.objects.get(pk=1) - except Topic.DoesNotExist: - self.fail("topic doesn't exist") - - post = topic.last_post - self.failIf(post is None) - if post: - self.assertEqual(post.body, post3_body) - self.assertEqual(post.user.pk, self.user.pk) - self.assertEqual(topic.post_count, 3) - - # edit last post - response = self.client.get( - reverse('forums-edit_post', kwargs={'id': 3})) - self.assertEqual(response.status_code, 200) - - post3_body = 'edited post 3 content' - response = self.client.post( - reverse('forums-edit_post', kwargs={'id': 3}), - {'body': post3_body}, - follow=True) - self.assertEqual(response.status_code, 200) - try: - topic = Topic.objects.get(pk=1) - except Topic.DoesNotExist: - self.fail("topic doesn't exist") - - post = topic.last_post - self.failIf(post is None) - if post: - self.assertEqual(post.body, post3_body) - 
self.assertEqual(post.user.pk, self.user.pk) - self.assertEqual(topic.post_count, 3) - - profile = self.user.get_profile() - self.assertEqual(profile.forum_post_count, 3) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/tools.py --- a/gpp/forums/tools.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,130 +0,0 @@ -""" -This module contains misc. utility functions for forum management. -""" -import logging - -from forums.models import Post, Topic, Forum, ForumLastVisit, TopicLastVisit - - -def delete_user_posts(user): - """ - This function deletes all the posts for a given user. - It also cleans up any last visit database records for the user. - This function adjusts the last post foreign keys before deleting - the posts to avoid the cascading delete behavior. - """ - posts = Post.objects.filter(user=user).select_related() - - # delete attachments - for post in posts: - post.attachments.clear() - - # build a set of topics and forums affected by the post deletions - - topics = set(post.topic for post in posts) - forums = set(topic.forum for topic in topics) - - post_ids = [post.pk for post in posts] - pending_delete = [] - - for topic in topics: - if topic.last_post.pk in post_ids: - topic_posts = Post.objects.filter(topic=topic).exclude( - pk__in=post_ids) - topic.post_count = topic_posts.count() - if topic.post_count > 0: - topic.last_post = topic_posts.latest() - topic.update_date = topic.last_post.creation_date - topic.save() - else: - # Topic should be deleted, it has no posts; - # We can't delete it now as it could cascade and take out a - # forum. Remember it for later deletion. - pending_delete.append(topic) - - for forum in forums: - if forum.last_post.pk in post_ids: - forum_posts = Post.objects.filter(topic__forum=forum).exclude( - pk__in=post_ids) - forum.post_count = forum_posts.count() - if forum.post_count > 0: - forum.last_post = forum_posts.latest() - else: - forum.last_post = None - forum.save() - - # Delete pending topics now because forums have just adjusted their - # foreign keys into Post - if pending_delete: - topic_ids = [topic.pk for topic in pending_delete] - Topic.objects.filter(pk__in=topic_ids).delete() - - # Topics have been deleted, re-compute topic counts for forums - for forum in forums: - forum.topic_count = Topic.objects.filter(forum=forum).count() - forum.save() - - # All foreign keys are accounted for, we can now delete the posts in bulk. - # Since some posts in our original queryset may have been deleted already, - # run a new query (although it may be ok) - Post.objects.filter(pk__in=post_ids).delete() - - # delete all the last visit records for this user - TopicLastVisit.objects.filter(user=user).delete() - ForumLastVisit.objects.filter(user=user).delete() - - -def create_topic(forum_slug, user, topic_name, post_body, ip='', sticky=False, - locked=False): - """Programmatically create a topic & first post in a given forum. - - This function creates a new topic in the forum that has the slug - specified by the 'forum_slug' argument. 
Other arguments are as follows: - 'user' - create the topic and post with this user as the owner - 'topic_name' - topic name (title) - 'post_body' - topic post body (as markup, not HTML) - 'ip' - IP address for the post (as a string) - 'sticky' - if True, the post will be stickied - 'locked' - if True, the post will be locked - - """ - try: - forum = Forum.objects.get(slug=forum_slug) - except Forum.DoesNotExist: - logging.error('could not create_topic for forum_slug=%s', forum_slug) - raise - - topic = Topic(forum=forum, - name=topic_name, - user=user, - sticky=sticky, - locked=locked) - topic.save() - - post = Post(topic=topic, - user=user, - body=post_body, - user_ip=ip) - post.save() - - -def auto_favorite(post): - """ - Given a newly created post, perform an auto-favorite action if the post - creator has that option set in their profile. - - """ - profile = post.user.get_profile() - if profile.auto_favorite: - post.topic.bookmarkers.add(post.user) - - -def auto_subscribe(post): - """ - Given a newly created post, perform an auto-subscribe action if the post - creator has that option set in their profile. - - """ - profile = post.user.get_profile() - if profile.auto_subscribe: - post.topic.subscribers.add(post.user) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/unread.py --- a/gpp/forums/unread.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,257 +0,0 @@ -""" -This file contains routines for implementing the "has unread" feature. -Forums, topics, and posts are displayed with a visual indication if they have -been read or not. -""" -import datetime -import logging - -from django.db import IntegrityError - -from forums.models import ForumLastVisit, TopicLastVisit, Topic, Forum - - -THRESHOLD = datetime.timedelta(days=14) - -####################################################################### - -def get_forum_unread_status(qs, user): - if not user.is_authenticated(): - for forum in qs: - forum.has_unread = False - return - - now = datetime.datetime.now() - min_date = now - THRESHOLD - - # retrieve ForumLastVisit records in one SQL query - forum_ids = [forum.id for forum in qs] - flvs = ForumLastVisit.objects.filter(user=user, - forum__in=forum_ids).select_related() - flvs = dict([(flv.forum.id, flv) for flv in flvs]) - - for forum in qs: - # Edge case: forum has no posts - if forum.last_post is None: - forum.has_unread = False - continue - - # Get the ForumLastVisit record - if forum.id in flvs: - flv = flvs[forum.id] - else: - # One doesn't exist, create a default one for next time, - # mark it as having no unread topics, and bail. - flv = ForumLastVisit(user=user, forum=forum) - flv.begin_date = now - flv.end_date = now - - # There is a race condition and sometimes another thread - # saves a record before we do; just log this if it happens. - try: - flv.save() - except IntegrityError: - logging.exception('get_forum_unread_status') - - forum.has_unread = False - continue - - # If the last visit record was too far in the past, - # catch that user up and mark as no unreads. - if now - flv.end_date > THRESHOLD: - forum.catchup(user, flv) - forum.has_unread = False - continue - - # Check the easy cases first. Check the last_post in the - # forum. If created after the end_date in our window, there - # are new posts. Likewise, if before the begin_date in our window, - # there are no new posts. 
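The comment above reduces the first pass of the unread check to comparing the forum's last post time against the user's visit window (begin_date, end_date). A standalone sketch of that three-way decision, using plain datetimes instead of the ForumLastVisit model:

import datetime

def quick_unread_check(last_post_time, window_begin, window_end):
    """Returns True (unread), False (caught up), or None (inspect topics)."""
    if last_post_time > window_end:
        return True      # posted after the user's last visit
    if last_post_time < window_begin:
        return False     # everything predates the tracking window
    return None          # inside the window: fall back to per-topic records

now = datetime.datetime(2012, 5, 5, 12, 0)
begin, end = now - datetime.timedelta(days=3), now - datetime.timedelta(days=1)
print(quick_unread_check(now, begin, end))                                # True
print(quick_unread_check(now - datetime.timedelta(days=7), begin, end))   # False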
- if forum.last_post.creation_date > flv.end_date: - forum.has_unread = True - elif forum.last_post.creation_date < flv.begin_date: - if not flv.is_caught_up(): - forum.catchup(user, flv) - forum.has_unread = False - else: - # Going to have to examine the topics in our window. - # First adjust our window if it is too old. - if now - flv.begin_date > THRESHOLD: - flv.begin_date = min_date - flv.save() - TopicLastVisit.objects.filter(user=user, topic__forum=forum, - last_visit__lt=min_date).delete() - - topics = Topic.objects.filter(forum=forum, - update_date__gt=flv.begin_date) - tracked_topics = TopicLastVisit.objects.filter( - user=user, - topic__forum=forum, - last_visit__gt=flv.begin_date).select_related('topic') - - # If the number of topics created since our window was started - # is greater than the tracked topic records, then there are new - # posts. - if topics.count() > tracked_topics.count(): - forum.has_unread = True - continue - - tracked_dict = dict((t.topic.id, t) for t in tracked_topics) - - for topic in topics: - if topic.id in tracked_dict: - if topic.update_date > tracked_dict[topic.id].last_visit: - forum.has_unread = True - break - else: - forum.has_unread = True - break - else: - # If we made it through the above loop without breaking out, - # then we are all caught up. - forum.catchup(user, flv) - forum.has_unread = False - -####################################################################### - -def get_topic_unread_status(forum, topics, user): - - # Edge case: no topics - if forum.last_post is None: - return - - # This service isn't provided to unauthenticated users - if not user.is_authenticated(): - for topic in topics: - topic.has_unread = False - return - - now = datetime.datetime.now() - - # Get the ForumLastVisit record - try: - flv = ForumLastVisit.objects.get(forum=forum, user=user) - except ForumLastVisit.DoesNotExist: - # One doesn't exist, create a default one for next time, - # mark it as having no unread topics, and bail. - flv = ForumLastVisit(user=user, forum=forum) - flv.begin_date = now - flv.end_date = now - - # There is a race condition and sometimes another thread - # saves a record before we do; just log this if it happens. - try: - flv.save() - except IntegrityError: - logging.exception('get_topic_unread_status') - - for topic in topics: - topic.has_unread = False - return - - # Are all the posts before our window? If so, all have been read. - if forum.last_post.creation_date < flv.begin_date: - for topic in topics: - topic.has_unread = False - return - - topic_ids = [topic.id for topic in topics] - tlvs = TopicLastVisit.objects.filter(user=user, topic__id__in=topic_ids) - tlvs = dict([(tlv.topic.id, tlv) for tlv in tlvs]) - - # Otherwise we have to go through the topics one by one: - for topic in topics: - if topic.update_date < flv.begin_date: - topic.has_unread = False - elif topic.update_date > flv.end_date: - topic.has_unread = True - elif topic.id in tlvs: - topic.has_unread = topic.update_date > tlvs[topic.id].last_visit - else: - topic.has_unread = True - -####################################################################### - -def get_post_unread_status(topic, posts, user): - # This service isn't provided to unauthenticated users - if not user.is_authenticated(): - for post in posts: - post.unread = False - return - - # Get the ForumLastVisit record - try: - flv = ForumLastVisit.objects.get(forum=topic.forum, user=user) - except ForumLastVisit.DoesNotExist: - # One doesn't exist, all posts are old. 
- for post in posts: - post.unread = False - return - - # Are all the posts before our window? If so, all have been read. - if topic.last_post.creation_date < flv.begin_date: - for post in posts: - post.unread = False - return - - # Do we have a topic last visit record for this topic? - - try: - tlv = TopicLastVisit.objects.get(user=user, topic=topic) - except TopicLastVisit.DoesNotExist: - # No we don't, we could be all caught up, or all are new - for post in posts: - post.unread = post.creation_date > flv.end_date - else: - for post in posts: - post.unread = post.creation_date > tlv.last_visit - -####################################################################### - -def get_unread_topics(user): - """Returns a list of topics the user hasn't read yet.""" - - # This is only available to authenticated users - if not user.is_authenticated(): - return [] - - now = datetime.datetime.now() - - # Obtain list of forums the user can view - forums = Forum.objects.forums_for_user(user) - - # Get forum last visit records for the forum ids - flvs = ForumLastVisit.objects.filter(user=user, - forum__in=forums).select_related() - flvs = dict([(flv.forum.id, flv) for flv in flvs]) - - unread_topics = [] - topics = Topic.objects.none() - for forum in forums: - # if the user hasn't visited the forum, create a last - # visit record set to "now" - if not forum.id in flvs: - flv = ForumLastVisit(user=user, forum=forum, begin_date=now, - end_date=now) - flv.save() - else: - flv = flvs[forum.id] - topics |= Topic.objects.filter(forum=forum, - update_date__gt=flv.begin_date).order_by('-update_date').select_related( - 'forum', 'user', 'last_post', 'last_post__user') - - if topics is not None: - # get all topic last visit records for the topics of interest - - tlvs = TopicLastVisit.objects.filter(user=user, topic__in=topics) - tlvs = dict([(tlv.topic.id, tlv) for tlv in tlvs]) - - for topic in topics: - if topic.id in tlvs: - tlv = tlvs[topic.id] - if topic.update_date > tlv.last_visit: - unread_topics.append(topic) - else: - unread_topics.append(topic) - - return unread_topics diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/urls.py --- a/gpp/forums/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,58 +0,0 @@ -""" -URLs for the forums application. 
-""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('forums.views.main', - url(r'^$', 'index', name='forums-index'), - url(r'^catchup/$', 'catchup_all', name='forums-catchup_all'), - url(r'^new-topic-success/(?P\d+)$', 'new_topic_thanks', name='forums-new_topic_thanks'), - url(r'^topic/(?P\d+)/$', 'topic_index', name='forums-topic_index'), - url(r'^topic/(?P\d+)/unread/$', 'topic_unread', name='forums-topic_unread'), - url(r'^topic/(?P\d+)/latest/$', 'topic_latest', name='forums-topic_latest'), - url(r'^topic/active/(\d+)/$', 'active_topics', name='forums-active_topics'), - url(r'^delete-post/$', 'delete_post', name='forums-delete_post'), - url(r'^edit/(?P\d+)/$', 'edit_post', name='forums-edit_post'), - url(r'^flag-post/$', 'flag_post', name='forums-flag_post'), - url(r'^forum/(?P[\w\d-]+)/$', 'forum_index', name='forums-forum_index'), - url(r'^forum/(?P[\w\d-]+)/catchup/$', 'forum_catchup', name='forums-catchup'), - url(r'^forum/(?P[\w\d-]+)/new-topic/$', 'new_topic', name='forums-new_topic'), - url(r'^mod/forum/(?P[\w\d-]+)/$', 'mod_forum', name='forums-mod_forum'), - url(r'^mod/topic/delete/(\d+)/$', 'mod_topic_delete', name='forums-mod_topic_delete'), - url(r'^mod/topic/lock/(\d+)/$', 'mod_topic_lock', name='forums-mod_topic_lock'), - url(r'^mod/topic/move/(\d+)/$', 'mod_topic_move', name='forums-mod_topic_move'), - url(r'^mod/topic/split/(\d+)/$', 'mod_topic_split', name='forums-mod_topic_split'), - url(r'^mod/topic/stick/(\d+)/$', 'mod_topic_stick', name='forums-mod_topic_stick'), - url(r'^my-posts/$', 'my_posts', name='forums-my_posts'), - url(r'^post/(\d+)/$', 'goto_post', name='forums-goto_post'), - url(r'^post/ip/(\d+)/$', 'post_ip_info', name='forums-post_ip_info'), - url(r'^post/new/(?P\d+)/$', 'new_post', name='forums-new_post'), - url(r'^posts/(?P[\w.@+-]{1,30})/$', 'posts_for_user', name='forums-posts_for_user'), - url(r'^quick-reply/$', 'quick_reply_ajax', name='forums-quick_reply'), - url(r'^unanswered/$', 'unanswered_topics', name='forums-unanswered_topics'), - url(r'^unread/$', 'unread_topics', name='forums-unread_topics'), -) - -urlpatterns += patterns('forums.views.favorites', - url(r'^favorite/(\d+)/$', 'favorite_topic', name='forums-favorite_topic'), - url(r'^favorites/$', 'manage_favorites', name='forums-manage_favorites'), - url(r'^favorites/(\d+)/$', 'favorites_status', name='forums-favorites_status'), - url(r'^unfavorite/(\d+)/$', 'unfavorite_topic', name='forums-unfavorite_topic'), -) - -urlpatterns += patterns('forums.views.subscriptions', - url(r'^subscribe/(\d+)/$', 'subscribe_topic', name='forums-subscribe_topic'), - url(r'^subscriptions/$', 'manage_subscriptions', name='forums-manage_subscriptions'), - url(r'^subscriptions/(\d+)/$', 'subscription_status', name='forums-subscription_status'), - url(r'^unsubscribe/(\d+)/$', 'unsubscribe_topic', name='forums-unsubscribe_topic'), -) - -urlpatterns += patterns('forums.views.spam', - url(r'^spammer/(\d+)/$', 'spammer', name='forums-spammer'), - url(r'^spammer/nailed/(\d+)/$', 'spammer_nailed', name='forums-spammer_nailed'), - url(r'^stranger/(\d+)/$', 'stranger', name='forums-stranger'), -) - -urlpatterns += patterns('forums.views.attachments', - url(r'^fetch_attachments/$', 'fetch_attachments', name='forums-fetch_attachments'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/views/attachments.py --- a/gpp/forums/views/attachments.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,35 +0,0 @@ -""" -This module contains views for working with post 
attachments. -""" -from django.http import HttpResponse -from django.http import HttpResponseForbidden -from django.http import HttpResponseBadRequest -from django.http import HttpResponseNotFound -import django.utils.simplejson as json - -from forums.models import Post - - -def fetch_attachments(request): - """ - This view is the target of an AJAX GET request to retrieve the - attachment embed data for a given forum post. - - """ - if not request.user.is_authenticated(): - return HttpResponseForbidden('Please login or register.') - - post_id = request.GET.get('pid') - if post_id is None: - return HttpResponseBadRequest('Missing post ID.') - - try: - post = Post.objects.get(pk=post_id) - except Post.DoesNotExist: - return HttpResponseNotFound("That post doesn't exist.") - - embeds = post.attachments.all().select_related('embed') - data = [{'id': embed.id, 'html': embed.html} for embed in embeds] - - return HttpResponse(json.dumps(data), content_type='application/json') - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/views/favorites.py --- a/gpp/forums/views/favorites.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,92 +0,0 @@ -""" -This module contains view functions related to forum favorites (bookmarks). -""" -from django.contrib.auth.decorators import login_required -from django.core.urlresolvers import reverse -from django.views.decorators.http import require_POST -from django.shortcuts import get_object_or_404 -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.http import HttpResponseRedirect -from django.http import HttpResponseForbidden -from django.http import Http404 - -from core.paginator import DiggPaginator -from forums.models import Topic -import forums.permissions as perms - - -@login_required -@require_POST -def favorite_topic(request, topic_id): - """ - This function handles the "favoriting" (bookmarking) of a forum topic by a - user. 
- """ - topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) - if perms.can_access(topic.forum.category, request.user): - topic.bookmarkers.add(request.user) - return HttpResponseRedirect( - reverse("forums-favorites_status", args=[topic.id])) - return HttpResponseForbidden() - - -@login_required -def manage_favorites(request): - """Display a user's favorite topics and allow them to be deleted.""" - - user = request.user - if request.method == "POST": - if request.POST.get('delete_all'): - user.favorite_topics.clear() - else: - delete_ids = request.POST.getlist('delete_ids') - try: - delete_ids = [int(id) for id in delete_ids] - except ValueError: - raise Http404 - for topic in user.favorite_topics.filter(id__in=delete_ids): - user.favorite_topics.remove(topic) - - return HttpResponseRedirect(reverse("forums-manage_favorites")) - - page_num = request.GET.get('page', 1) - topics = user.favorite_topics.select_related().order_by('-update_date') - paginator = DiggPaginator(topics, 20, body=5, tail=2, margin=3, padding=2) - try: - page_num = int(page_num) - except ValueError: - page_num = 1 - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - return render_to_response('forums/manage_topics.html', { - 'page_title': 'Favorite Topics', - 'description': 'Your favorite topics are listed below.', - 'page': page, - }, - context_instance=RequestContext(request)) - -@login_required -def favorites_status(request, topic_id): - """Display the favorite status for the given topic.""" - topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) - is_favorite = request.user in topic.bookmarkers.all() - return render_to_response('forums/favorite_status.html', { - 'topic': topic, - 'is_favorite': is_favorite, - }, - context_instance=RequestContext(request)) - -@login_required -@require_POST -def unfavorite_topic(request, topic_id): - """ - Un-favorite the user from the requested topic. - """ - topic = get_object_or_404(Topic, id=topic_id) - topic.bookmarkers.remove(request.user) - return HttpResponseRedirect( - reverse("forums-favorites_status", args=[topic.id])) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/views/main.py --- a/gpp/forums/views/main.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1126 +0,0 @@ -""" -Views for the forums application. 
-""" -import collections -import datetime - -from django.contrib.auth.decorators import login_required -from django.contrib.auth.models import User -from django.http import Http404 -from django.http import HttpResponse -from django.http import HttpResponseBadRequest -from django.http import HttpResponseForbidden -from django.http import HttpResponseRedirect -from django.core.urlresolvers import reverse -from django.core.paginator import InvalidPage -from django.shortcuts import get_object_or_404 -from django.shortcuts import render_to_response -from django.template.loader import render_to_string -from django.template import RequestContext -from django.views.decorators.http import require_POST -from django.db.models import F - -import antispam -import antispam.utils -from bio.models import UserProfile, BadgeOwnership -from core.paginator import DiggPaginator -from core.functions import email_admins, quote_message - -from forums.models import (Forum, Topic, Post, FlaggedPost, TopicLastVisit, - ForumLastVisit, Attachment) -from forums.forms import (NewTopicForm, NewPostForm, PostForm, MoveTopicForm, - SplitTopicForm) -from forums.unread import (get_forum_unread_status, get_topic_unread_status, - get_post_unread_status, get_unread_topics) - -import forums.permissions as perms -from forums.signals import (notify_new_topic, notify_updated_topic, - notify_new_post, notify_updated_post) -from forums.latest import get_latest_topic_ids - -####################################################################### - -TOPICS_PER_PAGE = 50 -POSTS_PER_PAGE = 20 -FEED_BASE = '/feeds/forums/' -FORUM_FEED = FEED_BASE + '%s/' - - -def get_page_num(request): - """Returns the value of the 'page' variable in GET if it exists, or 1 - if it does not.""" - - try: - page_num = int(request.GET.get('page', 1)) - except ValueError: - page_num = 1 - - return page_num - - -def create_topic_paginator(topics): - return DiggPaginator(topics, TOPICS_PER_PAGE, body=5, tail=2, margin=3, padding=2) - -def create_post_paginator(posts): - return DiggPaginator(posts, POSTS_PER_PAGE, body=5, tail=2, margin=3, padding=2) - - -def attach_topic_page_ranges(topics): - """Attaches a page_range attribute to each topic in the supplied list. - This attribute will be None if it is a single page topic. This is used - by the templates to generate "goto page x" links. - """ - for topic in topics: - if topic.post_count > POSTS_PER_PAGE: - pp = DiggPaginator(range(topic.post_count), POSTS_PER_PAGE, - body=2, tail=3, margin=1) - topic.page_range = pp.page(1).page_range - else: - topic.page_range = None - -####################################################################### - -def index(request): - """ - This view displays all the forums available, ordered in each category. 
- """ - public_forums = Forum.objects.public_forums() - feeds = [{'name': 'All Forums', 'feed': FEED_BASE}] - - forums = Forum.objects.forums_for_user(request.user) - get_forum_unread_status(forums, request.user) - cats = {} - for forum in forums: - forum.has_feed = forum in public_forums - if forum.has_feed: - feeds.append({ - 'name': '%s Forum' % forum.name, - 'feed': FORUM_FEED % forum.slug, - }) - - cat = cats.setdefault(forum.category.id, { - 'cat': forum.category, - 'forums': [], - }) - cat['forums'].append(forum) - - cmpdef = lambda a, b: cmp(a['cat'].position, b['cat'].position) - cats = sorted(cats.values(), cmpdef) - - return render_to_response('forums/index.html', { - 'cats': cats, - 'feeds': feeds, - }, - context_instance=RequestContext(request)) - - -def forum_index(request, slug): - """ - Displays all the topics in a forum. - """ - forum = get_object_or_404(Forum.objects.select_related(), slug=slug) - - if not perms.can_access(forum.category, request.user): - return HttpResponseForbidden() - - feed = None - if not forum.category.groups.all(): - feed = { - 'name': '%s Forum' % forum.name, - 'feed': FORUM_FEED % forum.slug, - } - - topics = forum.topics.select_related('user', 'last_post', 'last_post__user') - paginator = create_topic_paginator(topics) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - get_topic_unread_status(forum, page.object_list, request.user) - attach_topic_page_ranges(page.object_list) - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - - can_moderate = perms.can_moderate(forum, request.user) - - return render_to_response('forums/forum_index.html', { - 'forum': forum, - 'feed': feed, - 'page': page, - 'page_nav': page_nav, - 'can_moderate': can_moderate, - }, - context_instance=RequestContext(request)) - - -def topic_index(request, id): - """ - Displays all the posts in a topic. - """ - topic = get_object_or_404(Topic.objects.select_related( - 'forum', 'forum__category', 'last_post'), pk=id) - - if not perms.can_access(topic.forum.category, request.user): - return HttpResponseForbidden() - - topic.view_count = F('view_count') + 1 - topic.save(force_update=True) - - posts = topic.posts.select_related(depth=1) - - paginator = create_post_paginator(posts) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - get_post_unread_status(topic, page.object_list, request.user) - - # Attach user profiles to each post's user to avoid using - # get_user_profile() in the template. 
- users = set(post.user.id for post in page.object_list) - - profiles = UserProfile.objects.filter(user__id__in=users).select_related() - profile_keys = [profile.id for profile in profiles] - user_profiles = dict((profile.user.id, profile) for profile in profiles) - - last_post_on_page = None - for post in page.object_list: - post.user.user_profile = user_profiles[post.user.id] - post.attach_list = [] - last_post_on_page = post - - # Attach badge ownership info to the user profiles to avoid lots - # of database hits in the template: - bos_qs = BadgeOwnership.objects.filter( - profile__id__in=profile_keys).select_related() - bos = collections.defaultdict(list) - for bo in bos_qs: - bos[bo.profile.id].append(bo) - - for user_id, profile in user_profiles.iteritems(): - profile.badge_ownership = bos[profile.id] - - # Attach any attachments - post_ids = [post.pk for post in page.object_list] - attachments = Attachment.objects.filter(post__in=post_ids).select_related( - 'embed').order_by('order') - - post_dict = dict((post.pk, post) for post in page.object_list) - for item in attachments: - post_dict[item.post.id].attach_list.append(item.embed) - - last_page = page_num == paginator.num_pages - - if request.user.is_authenticated(): - if last_page or last_post_on_page is None: - visit_time = datetime.datetime.now() - else: - visit_time = last_post_on_page.creation_date - _update_last_visit(request.user, topic, visit_time) - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - - can_moderate = perms.can_moderate(topic.forum, request.user) - - can_reply = request.user.is_authenticated() and ( - not topic.locked or can_moderate) - - is_favorite = request.user.is_authenticated() and ( - topic in request.user.favorite_topics.all()) - - is_subscribed = request.user.is_authenticated() and ( - topic in request.user.subscriptions.all()) - - return render_to_response('forums/topic.html', { - 'forum': topic.forum, - 'topic': topic, - 'page': page, - 'page_nav': page_nav, - 'last_page': last_page, - 'can_moderate': can_moderate, - 'can_reply': can_reply, - 'form': NewPostForm(initial={'topic_id': topic.id}), - 'is_favorite': is_favorite, - 'is_subscribed': is_subscribed, - }, - context_instance=RequestContext(request)) - - -def topic_unread(request, id): - """ - This view redirects to the first post the user hasn't read, if we can - figure that out. Otherwise we redirect to the topic. - - """ - topic_url = reverse('forums-topic_index', kwargs={'id': id}) - - if request.user.is_authenticated(): - topic = get_object_or_404(Topic.objects.select_related(depth=1), pk=id) - try: - tlv = TopicLastVisit.objects.get(user=request.user, topic=topic) - except TopicLastVisit.DoesNotExist: - try: - flv = ForumLastVisit.objects.get(user=request.user, - forum=topic.forum) - except ForumLastVisit.DoesNotExist: - return HttpResponseRedirect(topic_url) - else: - last_visit = flv.begin_date - else: - last_visit = tlv.last_visit - - posts = Post.objects.filter(topic=topic, creation_date__gt=last_visit) - if posts: - return _goto_post(posts[0]) - else: - # just go to the last post in the topic - return _goto_post(topic.last_post) - - # user isn't authenticated, just go to the topic - return HttpResponseRedirect(topic_url) - - -def topic_latest(request, id): - """ - This view shows the latest (last) post in a given topic. 
- - """ - topic = get_object_or_404(Topic.objects.select_related(depth=1), pk=id) - - if topic.last_post: - return _goto_post(topic.last_post) - - raise Http404 - - -@login_required -def new_topic(request, slug): - """ - This view handles the creation of new topics. - """ - forum = get_object_or_404(Forum.objects.select_related(), slug=slug) - - if not perms.can_access(forum.category, request.user): - return HttpResponseForbidden() - - if request.method == 'POST': - form = NewTopicForm(request.user, forum, request.POST) - if form.is_valid(): - if antispam.utils.spam_check(request, form.cleaned_data['body']): - return HttpResponseRedirect(reverse('antispam-suspended')) - - topic = form.save(request.META.get("REMOTE_ADDR")) - _bump_post_count(request.user) - return HttpResponseRedirect(reverse('forums-new_topic_thanks', - kwargs={'tid': topic.pk})) - else: - form = NewTopicForm(request.user, forum) - - return render_to_response('forums/new_topic.html', { - 'forum': forum, - 'form': form, - }, - context_instance=RequestContext(request)) - - -@login_required -def new_topic_thanks(request, tid): - """ - This view displays the success page for a newly created topic. - """ - topic = get_object_or_404(Topic.objects.select_related(), pk=tid) - return render_to_response('forums/new_topic_thanks.html', { - 'forum': topic.forum, - 'topic': topic, - }, - context_instance=RequestContext(request)) - - -@require_POST -def quick_reply_ajax(request): - """ - This function handles the quick reply to a thread function. This - function is meant to be the target of an AJAX post, and returns - the HTML for the new post, which the client-side script appends - to the document. - """ - if not request.user.is_authenticated(): - return HttpResponseForbidden('Please login or register to post.') - - form = NewPostForm(request.POST) - if form.is_valid(): - if not perms.can_post(form.topic, request.user): - return HttpResponseForbidden("You don't have permission to post in this topic.") - if antispam.utils.spam_check(request, form.cleaned_data['body']): - return HttpResponseForbidden(antispam.BUSTED_MESSAGE) - - post = form.save(request.user, request.META.get("REMOTE_ADDR", "")) - post.unread = True - post.user.user_profile = request.user.get_profile() - post.attach_list = post.attachments.all() - _bump_post_count(request.user) - _update_last_visit(request.user, form.topic, datetime.datetime.now()) - - return render_to_response('forums/display_post.html', { - 'post': post, - 'can_moderate': perms.can_moderate(form.topic.forum, request.user), - 'can_reply': True, - }, - context_instance=RequestContext(request)) - - return HttpResponseBadRequest("Oops, did you forget some text?"); - - -def _goto_post(post): - """ - Calculate what page the given post is on in its parent topic, then - return a redirect to it. - - """ - count = post.topic.posts.filter(creation_date__lt=post.creation_date).count() - page = count / POSTS_PER_PAGE + 1 - url = (reverse('forums-topic_index', kwargs={'id': post.topic.id}) + - '?page=%s#p%s' % (page, post.id)) - return HttpResponseRedirect(url) - - -def goto_post(request, post_id): - """ - This function calculates what page a given post is on, then redirects - to that URL. This function is the target of get_absolute_url() for - Post objects. - """ - post = get_object_or_404(Post.objects.select_related(), pk=post_id) - return _goto_post(post) - - -@require_POST -def flag_post(request): - """ - This function handles the flagging of posts by users. This function should - be the target of an AJAX post. 
- """ - if not request.user.is_authenticated(): - return HttpResponseForbidden('Please login or register to flag a post.') - - id = request.POST.get('id') - if id is None: - return HttpResponseBadRequest('No post id') - - try: - post = Post.objects.get(pk=id) - except Post.DoesNotExist: - return HttpResponseBadRequest('No post with id %s' % id) - - flag = FlaggedPost(user=request.user, post=post) - flag.save() - email_admins('A Post Has Been Flagged', """Hello, - -A user has flagged a forum post for review. -""") - return HttpResponse('The post was flagged. A moderator will review the post shortly. ' \ - 'Thanks for helping to improve the discussions on this site.') - - -@login_required -def edit_post(request, id): - """ - This view function allows authorized users to edit posts. - The superuser, forum moderators, and original author can edit posts. - """ - post = get_object_or_404(Post.objects.select_related(), pk=id) - - can_moderate = perms.can_moderate(post.topic.forum, request.user) - can_edit = can_moderate or request.user == post.user - - if not can_edit: - return HttpResponseForbidden("You don't have permission to edit that post.") - - topic_name = None - first_post = Post.objects.filter(topic=post.topic).order_by('creation_date')[0] - if first_post.id == post.id: - topic_name = post.topic.name - - if request.method == "POST": - form = PostForm(request.POST, instance=post, topic_name=topic_name) - if form.is_valid(): - if antispam.utils.spam_check(request, form.cleaned_data['body']): - return HttpResponseRedirect(reverse('antispam-suspended')) - post = form.save(commit=False) - post.touch() - post.save() - notify_updated_post(post) - - # if we are editing a first post, save the parent topic as well - if topic_name: - post.topic.save() - notify_updated_topic(post.topic) - - # Save any attachments - form.attach_proc.save_attachments(post) - - return HttpResponseRedirect(post.get_absolute_url()) - else: - form = PostForm(instance=post, topic_name=topic_name) - - post.user.user_profile = post.user.get_profile() - - return render_to_response('forums/edit_post.html', { - 'forum': post.topic.forum, - 'topic': post.topic, - 'post': post, - 'form': form, - 'can_moderate': can_moderate, - }, - context_instance=RequestContext(request)) - - -@require_POST -def delete_post(request): - """ - This view function allows superusers and forum moderators to delete posts. - This function is the target of AJAX calls from the client. - """ - if not request.user.is_authenticated(): - return HttpResponseForbidden('Please login to delete a post.') - - id = request.POST.get('id') - if id is None: - return HttpResponseBadRequest('No post id') - - post = get_object_or_404(Post.objects.select_related(), pk=id) - - if not perms.can_moderate(post.topic.forum, request.user): - return HttpResponseForbidden("You don't have permission to delete that post.") - - delete_single_post(post) - return HttpResponse("The post has been deleted.") - - -def delete_single_post(post): - """ - This function deletes a single post. It handles the case of where - a post is the sole post in a topic by deleting the topic also. It - adjusts any foreign keys in Topic or Forum objects that might be pointing - to this post before deleting the post to avoid a cascading delete. - """ - if post.topic.post_count == 1 and post == post.topic.last_post: - _delete_topic(post.topic) - else: - _delete_post(post) - - -def _delete_post(post): - """ - Internal function to delete a single post object. - Decrements the post author's post count. 
- Adjusts the parent topic and forum's last_post as needed. - """ - # Adjust post creator's post count - profile = post.user.get_profile() - if profile.forum_post_count > 0: - profile.forum_post_count -= 1 - profile.save(content_update=False) - - # If this post is the last_post in a topic, we need to update - # both the topic and parent forum's last post fields. If we don't - # the cascading delete will delete them also! - - topic = post.topic - if topic.last_post == post: - topic.last_post_pre_delete() - topic.save() - - forum = topic.forum - if forum.last_post == post: - forum.last_post_pre_delete() - forum.save() - - # delete any attachments - post.attachments.clear() - - # Should be safe to delete the post now: - post.delete() - - -def _delete_topic(topic): - """ - Internal function to delete an entire topic. - Deletes the topic and all posts contained within. - Adjusts the parent forum's last_post as needed. - Note that we don't bother adjusting all the users' - post counts as that doesn't seem to be worth the effort. - """ - parent_forum = topic.forum - if parent_forum.last_post and parent_forum.last_post.topic == topic: - parent_forum.last_post_pre_delete(deleting_topic=True) - parent_forum.save() - - # delete subscriptions to this topic - topic.subscribers.clear() - topic.bookmarkers.clear() - - # delete all attachments - posts = Post.objects.filter(topic=topic) - for post in posts: - post.attachments.clear() - - # Null out the topic's last post so we don't have a foreign key pointing - # to a post when we delete posts. - topic.last_post = None - topic.save() - - # delete all posts in bulk - posts.delete() - - # It should be safe to just delete the topic now. - topic.delete() - - # Resync parent forum's post and topic counts - parent_forum.sync() - parent_forum.save() - - -@login_required -def new_post(request, topic_id): - """ - This function is the view for creating a normal, non-quick reply - to a topic. - """ - topic = get_object_or_404(Topic.objects.select_related(), pk=topic_id) - can_post = perms.can_post(topic, request.user) - - if can_post: - if request.method == 'POST': - form = PostForm(request.POST) - if form.is_valid(): - if antispam.utils.spam_check(request, form.cleaned_data['body']): - return HttpResponseRedirect(reverse('antispam-suspended')) - post = form.save(commit=False) - post.topic = topic - post.user = request.user - post.user_ip = request.META.get("REMOTE_ADDR", "") - post.save() - notify_new_post(post) - - # Save any attachments - form.attach_proc.save_attachments(post) - - _bump_post_count(request.user) - _update_last_visit(request.user, topic, datetime.datetime.now()) - return HttpResponseRedirect(post.get_absolute_url()) - else: - quote_id = request.GET.get('quote') - if quote_id: - quote_post = get_object_or_404(Post.objects.select_related(), - pk=quote_id) - form = PostForm(initial={'body': quote_message(quote_post.user.username, - quote_post.body)}) - else: - form = PostForm() - else: - form = None - - return render_to_response('forums/new_post.html', { - 'forum': topic.forum, - 'topic': topic, - 'form': form, - 'can_post': can_post, - }, - context_instance=RequestContext(request)) - - -@login_required -def mod_topic_stick(request, id): - """ - This view function is for moderators to toggle the sticky status of a topic. 
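_delete_post and _delete_topic above re-point or clear every last_post reference before deleting, so a cascading delete cannot take the parent topic or forum with the post. A plain-Python sketch of that ordering, with dicts standing in for the models:

def delete_post(post, topic):
    if topic['last_post'] is post:
        remaining = [p for p in topic['posts'] if p is not post]
        topic['last_post'] = remaining[-1] if remaining else None  # detach the reference first
    topic['posts'].remove(post)                                    # only then remove the post

topic = {'posts': [], 'last_post': None}
for body in ('first', 'second'):
    post = {'body': body}
    topic['posts'].append(post)
    topic['last_post'] = post

delete_post(topic['last_post'], topic)
assert topic['last_post']['body'] == 'first'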
- """ - topic = get_object_or_404(Topic.objects.select_related(), pk=id) - if perms.can_moderate(topic.forum, request.user): - topic.sticky = not topic.sticky - topic.save() - return HttpResponseRedirect(topic.get_absolute_url()) - - return HttpResponseForbidden() - - -@login_required -def mod_topic_lock(request, id): - """ - This view function is for moderators to toggle the locked status of a topic. - """ - topic = get_object_or_404(Topic.objects.select_related(), pk=id) - if perms.can_moderate(topic.forum, request.user): - topic.locked = not topic.locked - topic.save() - return HttpResponseRedirect(topic.get_absolute_url()) - - return HttpResponseForbidden() - - -@login_required -def mod_topic_delete(request, id): - """ - This view function is for moderators to delete an entire topic. - """ - topic = get_object_or_404(Topic.objects.select_related(), pk=id) - if perms.can_moderate(topic.forum, request.user): - forum_url = topic.forum.get_absolute_url() - _delete_topic(topic) - return HttpResponseRedirect(forum_url) - - return HttpResponseForbidden() - - -@login_required -def mod_topic_move(request, id): - """ - This view function is for moderators to move a topic to a different forum. - """ - topic = get_object_or_404(Topic.objects.select_related(), pk=id) - if not perms.can_moderate(topic.forum, request.user): - return HttpResponseForbidden() - - if request.method == 'POST': - form = MoveTopicForm(request.user, request.POST) - if form.is_valid(): - new_forum = form.cleaned_data['forums'] - old_forum = topic.forum - _move_topic(topic, old_forum, new_forum) - return HttpResponseRedirect(topic.get_absolute_url()) - else: - form = MoveTopicForm(request.user) - - return render_to_response('forums/move_topic.html', { - 'forum': topic.forum, - 'topic': topic, - 'form': form, - }, - context_instance=RequestContext(request)) - - -@login_required -def mod_forum(request, slug): - """ - Displays a view to allow moderators to perform various operations - on topics in a forum in bulk. We currently support mass locking/unlocking, - stickying and unstickying, moving, and deleting topics. 
- """ - forum = get_object_or_404(Forum.objects.select_related(), slug=slug) - if not perms.can_moderate(forum, request.user): - return HttpResponseForbidden() - - topics = forum.topics.select_related('user', 'last_post', 'last_post__user') - paginator = create_topic_paginator(topics) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - form = None - - if request.method == 'POST': - topic_ids = request.POST.getlist('topic_ids') - url = reverse('forums-mod_forum', kwargs={'slug':forum.slug}) - url += '?page=%s' % page_num - - if len(topic_ids): - if request.POST.get('sticky'): - _bulk_sticky(forum, topic_ids) - return HttpResponseRedirect(url) - elif request.POST.get('lock'): - _bulk_lock(forum, topic_ids) - return HttpResponseRedirect(url) - elif request.POST.get('delete'): - _bulk_delete(forum, topic_ids) - return HttpResponseRedirect(url) - elif request.POST.get('move'): - form = MoveTopicForm(request.user, request.POST, hide_label=True) - if form.is_valid(): - _bulk_move(topic_ids, forum, form.cleaned_data['forums']) - return HttpResponseRedirect(url) - - if form is None: - form = MoveTopicForm(request.user, hide_label=True) - - return render_to_response('forums/mod_forum.html', { - 'forum': forum, - 'page': page, - 'page_nav': page_nav, - 'form': form, - }, - context_instance=RequestContext(request)) - - -@login_required -@require_POST -def catchup_all(request): - """ - This view marks all forums as being read. - """ - forum_ids = Forum.objects.forum_ids_for_user(request.user) - - tlvs = TopicLastVisit.objects.filter(user=request.user, - topic__forum__id__in=forum_ids).delete() - - now = datetime.datetime.now() - ForumLastVisit.objects.filter(user=request.user, - forum__in=forum_ids).update(begin_date=now, end_date=now) - - return HttpResponseRedirect(reverse('forums-index')) - - -@login_required -@require_POST -def forum_catchup(request, slug): - """ - This view marks all the topics in the forum as being read. - """ - forum = get_object_or_404(Forum.objects.select_related(), slug=slug) - - if not perms.can_access(forum.category, request.user): - return HttpResponseForbidden() - - forum.catchup(request.user) - return HttpResponseRedirect(forum.get_absolute_url()) - - -@login_required -def mod_topic_split(request, id): - """ - This view function allows moderators to split posts off to a new topic. 
- """ - topic = get_object_or_404(Topic.objects.select_related(), pk=id) - if not perms.can_moderate(topic.forum, request.user): - return HttpResponseRedirect(topic.get_absolute_url()) - - if request.method == "POST": - form = SplitTopicForm(request.user, request.POST) - if form.is_valid(): - if form.split_at: - _split_topic_at(topic, form.post_ids[0], - form.cleaned_data['forums'], - form.cleaned_data['name']) - else: - _split_topic(topic, form.post_ids, - form.cleaned_data['forums'], - form.cleaned_data['name']) - - return HttpResponseRedirect(topic.get_absolute_url()) - else: - form = SplitTopicForm(request.user) - - posts = topic.posts.select_related() - - return render_to_response('forums/mod_split_topic.html', { - 'forum': topic.forum, - 'topic': topic, - 'posts': posts, - 'form': form, - }, - context_instance=RequestContext(request)) - - -@login_required -def unread_topics(request): - """Displays the topics with unread posts for a given user.""" - - topics = get_unread_topics(request.user) - - paginator = create_topic_paginator(topics) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - attach_topic_page_ranges(page.object_list) - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - - return render_to_response('forums/topic_list.html', { - 'title': 'Topics With Unread Posts', - 'page': page, - 'page_nav': page_nav, - 'unread': True, - }, - context_instance=RequestContext(request)) - - -def unanswered_topics(request): - """Displays the topics with no replies.""" - - forum_ids = Forum.objects.forum_ids_for_user(request.user) - topics = Topic.objects.filter(forum__id__in=forum_ids, - post_count=1).select_related( - 'forum', 'user', 'last_post', 'last_post__user') - - paginator = create_topic_paginator(topics) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - attach_topic_page_ranges(page.object_list) - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - - return render_to_response('forums/topic_list.html', { - 'title': 'Unanswered Topics', - 'page': page, - 'page_nav': page_nav, - 'unread': False, - }, - context_instance=RequestContext(request)) - - -def active_topics(request, num): - """Displays the last num topics that have been posted to.""" - - # sanity check num - num = min(50, max(10, int(num))) - - # MySQL didn't do this query very well unfortunately... - # - #public_forum_ids = Forum.objects.public_forum_ids() - #topics = Topic.objects.filter(forum__in=public_forum_ids).select_related( - # 'forum', 'user', 'last_post', 'last_post__user').order_by( - # '-update_date')[:num] - - # Save 1 query by using forums.latest to give us a list of the most recent - # topics; forums.latest doesn't save enough info to give us everything we - # need so we hit the database for the rest. 
- - topic_ids = get_latest_topic_ids(num) - topics = Topic.objects.filter(id__in=topic_ids).select_related( - 'forum', 'user', 'last_post', 'last_post__user').order_by( - '-update_date') - - paginator = create_topic_paginator(topics) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - attach_topic_page_ranges(page.object_list) - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - - title = 'Last %d Active Topics' % num - - return render_to_response('forums/topic_list.html', { - 'title': title, - 'page': page, - 'page_nav': page_nav, - 'unread': False, - }, - context_instance=RequestContext(request)) - - -@login_required -def my_posts(request): - """Displays a list of posts the requesting user made.""" - return _user_posts(request, request.user, request.user, 'My Posts') - - -@login_required -def posts_for_user(request, username): - """Displays a list of posts by the given user. - Only the forums that the requesting user can see are examined. - """ - target_user = get_object_or_404(User, username=username) - return _user_posts(request, target_user, request.user, 'Posts by %s' % username) - - -@login_required -def post_ip_info(request, post_id): - """Displays information about the IP address the post was made from.""" - post = get_object_or_404(Post.objects.select_related(), pk=post_id) - - if not perms.can_moderate(post.topic.forum, request.user): - return HttpResponseForbidden("You don't have permission for this post.") - - ip_users = sorted(set(Post.objects.filter( - user_ip=post.user_ip).values_list('user__username', flat=True))) - - return render_to_response('forums/post_ip.html', { - 'post': post, - 'ip_users': ip_users, - }, - context_instance=RequestContext(request)) - - -def _user_posts(request, target_user, req_user, page_title): - """Displays a list of posts made by the target user. - req_user is the user trying to view the posts. Only the forums - req_user can see are searched. - """ - forum_ids = Forum.objects.forum_ids_for_user(req_user) - posts = Post.objects.filter(user=target_user, - topic__forum__id__in=forum_ids).order_by( - '-creation_date').select_related() - - paginator = create_post_paginator(posts) - page_num = get_page_num(request) - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - # we do this for the template since it is rendered twice - page_nav = render_to_string('forums/pagination.html', {'page': page}) - - return render_to_response('forums/post_list.html', { - 'title': page_title, - 'page': page, - 'page_nav': page_nav, - }, - context_instance=RequestContext(request)) - - -def _bump_post_count(user): - """ - Increments the forum_post_count for the given user. - """ - profile = user.get_profile() - profile.forum_post_count += 1 - profile.save(content_update=False) - - -def _move_topic(topic, old_forum, new_forum): - if new_forum != old_forum: - topic.forum = new_forum - topic.save() - # Have to adjust foreign keys to last_post, denormalized counts, etc.: - old_forum.sync() - old_forum.save() - new_forum.sync() - new_forum.save() - - -def _bulk_sticky(forum, topic_ids): - """ - Performs a toggle on the sticky status for a given list of topic ids. 
- """ - topics = Topic.objects.filter(pk__in=topic_ids) - for topic in topics: - if topic.forum == forum: - topic.sticky = not topic.sticky - topic.save() - - -def _bulk_lock(forum, topic_ids): - """ - Performs a toggle on the locked status for a given list of topic ids. - """ - topics = Topic.objects.filter(pk__in=topic_ids) - for topic in topics: - if topic.forum == forum: - topic.locked = not topic.locked - topic.save() - - -def _bulk_delete(forum, topic_ids): - """ - Deletes the list of topics. - """ - # Because we are deleting stuff, retrieve each topic one at a - # time since we are going to be adjusting de-normalized fields - # during deletes. In particular, we can't do this: - # topics = Topic.objects.filter(pk__in=topic_ids).select_related() - # for topic in topics: - # since topic.forum.last_post can go stale after a delete. - - for id in topic_ids: - try: - topic = Topic.objects.select_related().get(pk=id) - except Topic.DoesNotExist: - continue - _delete_topic(topic) - - -def _bulk_move(topic_ids, old_forum, new_forum): - """ - Moves the list of topics to a new forum. - """ - topics = Topic.objects.filter(pk__in=topic_ids).select_related() - for topic in topics: - if topic.forum == old_forum: - _move_topic(topic, old_forum, new_forum) - - -def _update_last_visit(user, topic, visit_time): - """ - Does the bookkeeping for the last visit status for the user to the - topic/forum. - """ - now = datetime.datetime.now() - try: - flv = ForumLastVisit.objects.get(user=user, forum=topic.forum) - except ForumLastVisit.DoesNotExist: - flv = ForumLastVisit(user=user, forum=topic.forum) - flv.begin_date = now - - flv.end_date = now - flv.save() - - if topic.update_date > flv.begin_date: - try: - tlv = TopicLastVisit.objects.get(user=user, topic=topic) - except TopicLastVisit.DoesNotExist: - tlv = TopicLastVisit(user=user, topic=topic, last_visit=datetime.datetime.min) - - if visit_time > tlv.last_visit: - tlv.last_visit = visit_time - tlv.save() - - -def _split_topic_at(topic, post_id, new_forum, new_name): - """ - This function splits the post given by post_id and all posts that come - after it in the given topic to a new topic in a new forum. - It is assumed the caller has been checked for moderator rights. - """ - post = get_object_or_404(Post, id=post_id) - if post.topic == topic: - post_ids = Post.objects.filter(topic=topic, - creation_date__gte=post.creation_date).values_list('id', flat=True) - _split_topic(topic, post_ids, new_forum, new_name) - - -def _split_topic(topic, post_ids, new_forum, new_name): - """ - This function splits the posts given by the post_ids list in the - given topic to a new topic in a new forum. - It is assumed the caller has been checked for moderator rights. - """ - posts = Post.objects.filter(topic=topic, id__in=post_ids) - if len(posts) > 0: - new_topic = Topic(forum=new_forum, name=new_name, user=posts[0].user) - new_topic.save() - notify_new_topic(new_topic) - for post in posts: - post.topic = new_topic - post.save() - - topic.post_count_update() - topic.save() - new_topic.post_count_update() - new_topic.save() - topic.forum.sync() - topic.forum.save() - new_forum.sync() - new_forum.save() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/views/spam.py --- a/gpp/forums/views/spam.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,119 +0,0 @@ -""" -This module contains views for dealing with spam and spammers. 
-""" -import datetime -import logging -import textwrap - -from django.contrib.auth.decorators import login_required -from django.core.urlresolvers import reverse -from django.http import HttpResponseRedirect -from django.shortcuts import get_object_or_404 -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.contrib.auth.models import User - -from forums.models import Post -import forums.permissions as perms -import bio.models -from core.functions import email_admins -from antispam.utils import deactivate_spammer - - -SPAMMER_NAILED_SUBJECT = "Spammer Nailed: %s" -SPAMMER_NAILED_MSG_BODY = """ -The admin/moderator user %s has just deactivated the account of %s for spam. -""" - - -def promote_stranger(user): - """This function upgrades the user from stranger status to a regular user. - """ - profile = user.get_profile() - if user.is_active and profile.status == bio.models.STA_STRANGER: - profile.status = bio.models.STA_ACTIVE - profile.status_date = datetime.datetime.now() - profile.save(content_update=False) - - -@login_required -def spammer(request, post_id): - """This view allows moderators to deactivate spammer accounts.""" - - post = get_object_or_404(Post.objects.select_related(), pk=post_id) - poster = post.user - poster_profile = poster.get_profile() - - can_moderate = perms.can_moderate(post.topic.forum, request.user) - can_deactivate = (poster_profile.status == bio.models.STA_STRANGER and not - poster.is_superuser) - - if request.method == "POST" and can_moderate and can_deactivate: - deactivate_spammer(poster) - - email_admins(SPAMMER_NAILED_SUBJECT % poster.username, - SPAMMER_NAILED_MSG_BODY % ( - request.user.username, poster.username)) - - logging.info(textwrap.dedent("""\ - SPAMMER DEACTIVATED: %s nailed %s for spam. - IP: %s - Message: - %s - """), - request.user.username, poster.username, post.user_ip, post.body) - - return HttpResponseRedirect(reverse('forums-spammer_nailed', args=[ - poster.id])) - - return render_to_response('forums/spammer.html', { - 'can_moderate': can_moderate, - 'can_deactivate': can_deactivate, - 'post': post, - }, - context_instance=RequestContext(request)) - - -@login_required -def spammer_nailed(request, spammer_id): - """This view presents a confirmation screen that the spammer has been - deactivated. - """ - user = get_object_or_404(User, pk=spammer_id) - profile = user.get_profile() - - success = not user.is_active and profile.status == bio.models.STA_SPAMMER - - return render_to_response('forums/spammer_nailed.html', { - 'spammer': user, - 'success': success, - }, - context_instance=RequestContext(request)) - - -@login_required -def stranger(request, post_id): - """This view allows a forum moderator or super user to promote a user from - stranger status to regular user. 
- """ - post = get_object_or_404(Post.objects.select_related(), pk=post_id) - poster = post.user - poster_profile = poster.get_profile() - - can_moderate = perms.can_moderate(post.topic.forum, request.user) - can_promote = poster_profile.status == bio.models.STA_STRANGER - - if request.method == "POST" and can_moderate and can_promote: - promote_stranger(poster) - - logging.info("STRANGER PROMOTED: %s promoted %s.", - request.user.username, poster.username) - - return HttpResponseRedirect(post.get_absolute_url()) - - return render_to_response('forums/stranger.html', { - 'can_moderate': can_moderate, - 'can_promote': can_promote, - 'post': post, - }, - context_instance=RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/forums/views/subscriptions.py --- a/gpp/forums/views/subscriptions.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,122 +0,0 @@ -"""This module handles the subscriptions of users to forum topics.""" -from django.conf import settings -from django.contrib.auth.decorators import login_required -from django.contrib.sites.models import Site -from django.core.paginator import InvalidPage -from django.core.urlresolvers import reverse -from django.http import HttpResponseRedirect -from django.http import Http404 -from django.template.loader import render_to_string -from django.shortcuts import get_object_or_404 -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.views.decorators.http import require_POST - -from forums.models import Topic -import forums.permissions as perms -from core.functions import send_mail -from core.paginator import DiggPaginator - - -def notify_topic_subscribers(post, defer=True): - """ - The argument post is a newly created post. Send out an email - notification to all subscribers of the post's parent Topic. - - The defer flag is passed to core.functions.send_mail. If True, the mail is - sent on a Celery task. If False, the mail is sent on the caller's thread. 
- """ - topic = post.topic - recipients = topic.subscribers.exclude(id=post.user.id).values_list( - 'email', flat=True) - - if recipients: - site = Site.objects.get_current() - subject = "[%s] Topic Reply: %s" % (site.name, topic.name) - url_prefix = "http://%s" % site.domain - post_url = url_prefix + post.get_absolute_url() - unsubscribe_url = url_prefix + reverse("forums-manage_subscriptions") - msg = render_to_string("forums/topic_notify_email.txt", { - 'poster': post.user.username, - 'topic_name': topic.name, - 'message': post.body, - 'post_url': post_url, - 'unsubscribe_url': unsubscribe_url, - }) - for recipient in recipients: - send_mail(subject, msg, settings.DEFAULT_FROM_EMAIL, [recipient], - defer=defer) - - -@login_required -@require_POST -def subscribe_topic(request, topic_id): - """Subscribe the user to the requested topic.""" - topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) - if perms.can_access(topic.forum.category, request.user): - topic.subscribers.add(request.user) - return HttpResponseRedirect( - reverse("forums-subscription_status", args=[topic.id])) - raise Http404 - - -@login_required -@require_POST -def unsubscribe_topic(request, topic_id): - """Unsubscribe the user to the requested topic.""" - topic = get_object_or_404(Topic, id=topic_id) - topic.subscribers.remove(request.user) - return HttpResponseRedirect( - reverse("forums-subscription_status", args=[topic.id])) - - -@login_required -def subscription_status(request, topic_id): - """Display the subscription status for the given topic.""" - topic = get_object_or_404(Topic.objects.select_related(), id=topic_id) - is_subscribed = request.user in topic.subscribers.all() - return render_to_response('forums/subscription_status.html', { - 'topic': topic, - 'is_subscribed': is_subscribed, - }, - context_instance=RequestContext(request)) - - -@login_required -def manage_subscriptions(request): - """Display a user's topic subscriptions, and allow them to be deleted.""" - - user = request.user - if request.method == "POST": - if request.POST.get('delete_all'): - user.subscriptions.clear() - else: - delete_ids = request.POST.getlist('delete_ids') - try: - delete_ids = [int(id) for id in delete_ids] - except ValueError: - raise Http404 - - for topic in user.subscriptions.filter(id__in=delete_ids): - user.subscriptions.remove(topic) - - return HttpResponseRedirect(reverse("forums-manage_subscriptions")) - - page_num = request.GET.get('page', 1) - topics = user.subscriptions.select_related().order_by('-update_date') - paginator = DiggPaginator(topics, 20, body=5, tail=2, margin=3, padding=2) - try: - page_num = int(page_num) - except ValueError: - page_num = 1 - try: - page = paginator.page(page_num) - except InvalidPage: - raise Http404 - - return render_to_response('forums/manage_topics.html', { - 'page_title': 'Topic Subscriptions', - 'description': 'The forum topics you are currently subscribed to are listed below.', - 'page': page, - }, - context_instance=RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/admin.py --- a/gpp/gcalendar/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,152 +0,0 @@ -""" -This file contains the automatic admin site definitions for the gcalendar application. 
- -""" -from django.conf import settings -from django.conf.urls import patterns, url -from django.contrib import admin -from django.contrib import messages -from django.contrib.sites.models import Site -from django.core.urlresolvers import reverse -from django.http import HttpResponseRedirect -from django.shortcuts import render - -import gdata.client - -from gcalendar.models import Event, AccessToken -from gcalendar.calendar import Calendar, CalendarError -from gcalendar import oauth - -import bio.badges - - -SCOPES = ['https://www.google.com/calendar/feeds/'] - - -class EventAdmin(admin.ModelAdmin): - list_display = ('what', 'user', 'start_date', 'where', 'date_submitted', - 'status', 'is_approved', 'google_html') - list_filter = ('start_date', 'status') - date_hierarchy = 'start_date' - search_fields = ('what', 'where', 'description') - raw_id_fields = ('user', ) - exclude = ('html', 'google_id', 'google_url') - save_on_top = True - actions = ('approve_events', ) - - pending_states = { - Event.NEW: Event.NEW_APRV, - Event.EDIT_REQ: Event.EDIT_APRV, - Event.DEL_REQ: Event.DEL_APRV, - } - - def get_urls(self): - urls = super(EventAdmin, self).get_urls() - my_urls = patterns('', - url(r'^google_sync/$', - self.admin_site.admin_view(self.google_sync), - name="gcalendar-google_sync"), - url(r'^fetch_auth/$', - self.admin_site.admin_view(self.fetch_auth), - name="gcalendar-fetch_auth"), - url(r'^get_access_token/$', - self.admin_site.admin_view(self.get_access_token), - name="gcalendar-get_access_token"), - ) - return my_urls + urls - - def approve_events(self, request, qs): - """ - Ratchets the selected events forward to the approved state. - Ignores events that aren't in the proper state. - """ - count = 0 - for event in qs: - if event.status in self.pending_states: - event.status = self.pending_states[event.status] - event.save() - count += 1 - - if event.status == Event.NEW_APRV: - bio.badges.award_badge(bio.badges.CALENDAR_PIN, event.user) - - msg = "1 event was" if count == 1 else "%d events were" % count - msg += " approved." - self.message_user(request, msg) - - approve_events.short_description = "Approve selected events" - - def google_sync(self, request): - """ - View to synchronize approved event changes with Google calendar. - - """ - # Get pending events - events = Event.pending_events.all() - - # Attempt to get saved access token to the Google calendar - access_token = AccessToken.objects.get_token().access_token() - - messages = [] - err_msg = '' - if request.method == 'POST': - if access_token: - try: - cal = Calendar(source=oauth.USER_AGENT, - calendar_id=settings.GCAL_CALENDAR_ID, - access_token=access_token) - cal.sync_events(events) - except CalendarError, e: - err_msg = e.msg - events = Event.pending_events.all() - else: - messages.append('All events processed successfully.') - events = Event.objects.none() - - return render(request, 'gcalendar/google_sync.html', { - 'current_app': self.admin_site.name, - 'access_token': access_token, - 'messages': messages, - 'err_msg': err_msg, - 'events': events, - }) - - def fetch_auth(self, request): - """ - This view fetches a request token and then redirects the user to - authorize it. 
- - """ - site = Site.objects.get_current() - callback_url = 'http://%s%s' % (site.domain, - reverse('admin:gcalendar-get_access_token')) - try: - auth_url = oauth.fetch_auth(request, SCOPES, callback_url) - except gdata.client.Error, e: - messages.error(request, str(e)) - return HttpResponseRedirect(reverse('admin:gcalendar-google_sync')) - else: - return HttpResponseRedirect(auth_url) - - def get_access_token(self, request): - """ - This view is called by Google after the user has authorized us access to - their data. We call into the oauth module to upgrade the oauth token to - an access token. We then save the access token in the database and - redirect back to our admin Google sync view. - - """ - try: - access_token = oauth.get_access_token(request) - except gdata.client.Error, e: - messages.error(request, str(e)) - else: - token = AccessToken.objects.get_token() - token.update(access_token) - token.save() - - return HttpResponseRedirect(reverse('admin:gcalendar-google_sync')) - - -admin.site.register(Event, EventAdmin) -admin.site.register(AccessToken) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/calendar.py --- a/gpp/gcalendar/calendar.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,148 +0,0 @@ -""" -This file contains the calendar class wich abstracts the Google gdata API for working with -Google Calendars. - -""" -import datetime -import pytz - -from django.utils.tzinfo import FixedOffset -from gdata.calendar.client import CalendarClient -from gdata.calendar.data import (CalendarEventEntry, CalendarEventFeed, - CalendarWhere, When, EventWho) -import atom.data - -from gcalendar.models import Event - - -class CalendarError(Exception): - def __init__(self, msg): - self.msg = msg - - def __str__(self): - return repr(self.msg) - - -class Calendar(object): - DATE_FMT = '%Y-%m-%d' - DATE_TIME_FMT = DATE_FMT + 'T%H:%M:%S' - DATE_TIME_TZ_FMT = DATE_TIME_FMT + '.000Z' - - def __init__(self, source=None, calendar_id='default', access_token=None): - self.client = CalendarClient(source=source, auth_token=access_token) - - self.insert_feed = ('https://www.google.com/calendar/feeds/' - '%s/private/full' % calendar_id) - self.batch_feed = '%s/batch' % self.insert_feed - - def sync_events(self, qs): - request_feed = CalendarEventFeed() - for model in qs: - if model.status == Event.NEW_APRV: - event = CalendarEventEntry() - request_feed.AddInsert(entry=self._populate_event(model, event)) - elif model.status == Event.EDIT_APRV: - event = self._retrieve_event(model) - request_feed.AddUpdate(entry=self._populate_event(model, event)) - elif model.status == Event.DEL_APRV: - event = self._retrieve_event(model) - request_feed.AddDelete(entry=event) - else: - assert False, 'unexpected status in sync_events' - - try: - response_feed = self.client.ExecuteBatch(request_feed, self.batch_feed) - except Exception, e: - raise CalendarError('ExecuteBatch exception: %s' % e) - - err_msgs = [] - for entry in response_feed.entry: - i = int(entry.batch_id.text) - code = int(entry.batch_status.code) - - error = False - if qs[i].status == Event.NEW_APRV: - if code == 201: - qs[i].status = Event.ON_CAL - qs[i].google_id = entry.GetEditLink().href - qs[i].google_url = entry.GetHtmlLink().href - qs[i].save() - qs[i].notify_on_calendar() - else: - error = True - - elif qs[i].status == Event.EDIT_APRV: - if code == 200: - qs[i].status = Event.ON_CAL - qs[i].save() - else: - error = True - - elif qs[i].status == Event.DEL_APRV: - if code == 200: - qs[i].delete() - else: - error = 
True - - if error: - err_msgs.append('%s - (%d) %s' % ( - qs[i].what, code, entry.batch_status.reason)) - - if len(err_msgs) > 0: - raise CalendarError(', '.join(err_msgs)) - - def _retrieve_event(self, model): - try: - event = self.client.GetEventEntry(model.google_id) - except Exception, e: - raise CalendarError('Could not retrieve event from Google: %s, %s' \ - % (model.what, e)) - return event - - def _populate_event(self, model, event): - """Populates a gdata event from an Event model object.""" - event.title = atom.data.Title(text=model.what) - event.content = atom.data.Content(text=model.html) - event.where = [CalendarWhere(value=model.where)] - event.who = [EventWho(email=model.user.email)] - - if model.all_day: - start_time = self._make_time(model.start_date) - if model.start_date == model.end_date: - end_time = None - else: - end_time = self._make_time(model.end_date) - else: - start_time = self._make_time(model.start_date, model.start_time, model.time_zone) - end_time = self._make_time(model.end_date, model.end_time, model.time_zone) - - event.when = [When(start=start_time, end=end_time)] - return event - - def _make_time(self, date, time=None, tz_name=None): - """ - Returns the gdata formatted date/time string given a date, optional time, - and optional time zone name (e.g. 'US/Pacific'). If the time zone name is None, - no time zone info will be added to the string. - """ - - if time is not None: - d = datetime.datetime.combine(date, time) - else: - d = datetime.datetime(date.year, date.month, date.day) - - if time is None: - s = d.strftime(self.DATE_FMT) - elif tz_name is None: - s = d.strftime(self.DATE_TIME_FMT) - else: - try: - tz = pytz.timezone(tz_name) - except pytz.UnknownTimeZoneError: - raise CalendarError('Invalid time zone: %s' % (tz_name,)) - local = tz.localize(d) - zulu = local.astimezone(FixedOffset(0)) - s = zulu.strftime(self.DATE_TIME_TZ_FMT) - - return s - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/forms.py --- a/gpp/gcalendar/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,157 +0,0 @@ -""" -Forms for the gcalendar application. 
-""" -import datetime -import pytz -from django import forms -from django.conf import settings - -from gcalendar.models import Event - - -TIME_CHOICES = ( - ('00:00', '12:00 am (00:00)'), - ('00:30', '12:30 am (00:30)'), - ('01:00', '1:00 am (01:00)'), - ('01:30', '1:30 am (01:30)'), - ('02:00', '2:00 am (02:00)'), - ('02:30', '2:30 am (02:30)'), - ('03:00', '3:00 am (03:00)'), - ('03:30', '3:30 am (03:30)'), - ('04:00', '4:00 am (04:00)'), - ('04:30', '4:30 am (04:30)'), - ('05:00', '5:00 am (05:00)'), - ('05:30', '5:30 am (05:30)'), - ('06:00', '6:00 am (06:00)'), - ('06:30', '6:30 am (06:30)'), - ('07:00', '7:00 am (07:00)'), - ('07:30', '7:30 am (07:30)'), - ('08:00', '8:00 am (08:00)'), - ('08:30', '8:30 am (08:30)'), - ('09:00', '9:00 am (09:00)'), - ('09:30', '9:30 am (09:30)'), - ('10:00', '10:00 am (10:00)'), - ('10:30', '10:30 am (10:30)'), - ('11:00', '11:00 am (11:00)'), - ('11:30', '11:30 am (11:30)'), - ('12:00', '12:00 am (12:00)'), - ('12:30', '12:30 am (12:30)'), - ('13:00', '1:00 pm (13:00)'), - ('13:30', '1:30 pm (13:30)'), - ('14:00', '2:00 pm (14:00)'), - ('14:30', '2:30 pm (14:30)'), - ('15:00', '3:00 pm (15:00)'), - ('15:30', '3:30 pm (15:30)'), - ('16:00', '4:00 pm (16:00)'), - ('16:30', '4:30 pm (16:30)'), - ('17:00', '5:00 pm (17:00)'), - ('17:30', '5:30 pm (17:30)'), - ('18:00', '6:00 pm (18:00)'), - ('18:30', '6:30 pm (18:30)'), - ('19:00', '7:00 pm (19:00)'), - ('19:30', '7:30 pm (19:30)'), - ('20:00', '8:00 pm (20:00)'), - ('20:30', '8:30 pm (20:30)'), - ('21:00', '9:00 pm (21:00)'), - ('21:30', '9:30 pm (21:30)'), - ('22:00', '10:00 pm (22:00)'), - ('22:30', '10:30 pm (22:30)'), - ('23:00', '11:00 pm (23:00)'), - ('23:30', '11:30 pm (23:30)'), -) - - -class EventEntryForm(forms.ModelForm): - what = forms.CharField(widget=forms.TextInput(attrs={'size': 60})) - start_date = forms.DateField(widget=forms.TextInput(attrs={'size': 10})) - start_time = forms.TimeField(required=False, widget=forms.Select(choices=TIME_CHOICES)) - end_date = forms.DateField(widget=forms.TextInput(attrs={'size': 10})) - end_time = forms.TimeField(required=False, widget=forms.Select(choices=TIME_CHOICES)) - time_zone = forms.CharField(required=False, widget=forms.HiddenInput()) - where = forms.CharField(required=False, widget=forms.TextInput(attrs={'size': 60})) - description = forms.CharField(required=False, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - - DATE_FORMAT = '%m/%d/%Y' # must match the jQuery UI datepicker config - TIME_FORMAT = '%H:%M' - DEFAULT_START_TIME = '19:00' - DEFAULT_END_TIME = '20:00' - - class Meta: - model = Event - fields = ('what', 'start_date', 'start_time', 'end_date', 'end_time', - 'all_day', 'time_zone', 'where', 'description', 'create_forum_thread') - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui'] + - ['css/gcalendar.css']) - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui'] + - ['js/timezone.js', 'js/gcalendar.js']) - - def __init__(self, *args, **kwargs): - initial = kwargs.get('initial', {}) - instance = kwargs.get('instance', None) - - if len(args) == 0: # no POST arguments - if instance is None: - init_day = datetime.date.today().strftime(self.DATE_FORMAT) - if 'start_date' not in initial: - initial['start_date'] = init_day - if 'end_date' not in initial: - initial['end_date'] = init_day - if 'start_time' not in initial: - initial['start_time'] = self.DEFAULT_START_TIME - if 'end_time' not in initial: - 
initial['end_time'] = self.DEFAULT_END_TIME - else: - initial['start_date'] = instance.start_date.strftime(self.DATE_FORMAT) - initial['end_date'] = instance.end_date.strftime(self.DATE_FORMAT) - if instance.all_day: - initial['start_time'] = self.DEFAULT_START_TIME - initial['end_time'] = self.DEFAULT_END_TIME - else: - if 'start_time' not in initial: - initial['start_time'] = instance.start_time.strftime(self.TIME_FORMAT) - if 'end_time' not in initial: - initial['end_time'] = instance.end_time.strftime(self.TIME_FORMAT) - - kwargs['initial'] = initial - - super(EventEntryForm, self).__init__(*args, **kwargs) - - # We don't want the user to create a forum thread on an existing event - if instance is not None: - del self.fields['create_forum_thread'] - - def clean(self): - start_date = self.cleaned_data.get('start_date') - start_time = self.cleaned_data.get('start_time') - all_day = self.cleaned_data.get('all_day') - end_date = self.cleaned_data.get('end_date') - end_time = self.cleaned_data.get('end_time') - - if start_date and start_time and (all_day or (end_date and end_time)): - if all_day: - start = start_date - end = end_date - else: - start = datetime.datetime.combine(start_date, start_time) - end = datetime.datetime.combine(end_date, end_time) - if start > end: - raise forms.ValidationError("The start date of the event " - "is after the ending time!") - - return self.cleaned_data - - def clean_time_zone(self): - tz = self.cleaned_data['time_zone'] - try: - pytz.timezone(tz) - except pytz.UnknownTimeZoneError: - raise forms.ValidationError("Invalid timezone.") - return tz - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/models.py --- a/gpp/gcalendar/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,159 +0,0 @@ -""" -Models for the gcalendar application. 
- -""" -import datetime - -from django.db import models -from django.db.models import Q -from django.contrib.auth.models import User - -from core.markup import site_markup -import forums.tools -from gcalendar.oauth import serialize_token, deserialize_token - - -GIG_FORUM_SLUG = "gigs" - -class PendingEventManager(models.Manager): - """A manager for pending events.""" - - def get_query_set(self): - """Returns a queryset of events that have been approved to update - the Google calendar.""" - return super(PendingEventManager, self).get_query_set().filter( - Q(status=Event.NEW_APRV) | - Q(status=Event.EDIT_APRV) | - Q(status=Event.DEL_APRV) - ) - - -class Event(models.Model): - """Model to represent calendar events.""" - - # Event status codes: - (NEW, NEW_APRV, EDIT_REQ, EDIT_APRV, DEL_REQ, DEL_APRV, ON_CAL) = range(7) - - STATUS_CHOICES = ( - (NEW, 'New'), - (NEW_APRV, 'New Approved'), - (EDIT_REQ, 'Edit Request'), - (EDIT_APRV, 'Edit Approved'), - (DEL_REQ, 'Delete Request'), - (DEL_APRV, 'Delete Approved'), - (ON_CAL, 'On Calendar'), - ) - - user = models.ForeignKey(User) - what = models.CharField(max_length=255) - start_date = models.DateField() - start_time = models.TimeField(null=True, blank=True) - end_date = models.DateField() - end_time = models.TimeField(null=True, blank=True) - time_zone = models.CharField(max_length=64, blank=True) - all_day = models.BooleanField(default=False) - where = models.CharField(max_length=255, blank=True) - description = models.TextField(blank=True) - html = models.TextField(blank=True) - date_submitted = models.DateTimeField(auto_now_add=True) - google_id = models.CharField(max_length=255, blank=True) - google_url = models.URLField(max_length=255, blank=True) - status = models.SmallIntegerField(choices=STATUS_CHOICES, default=NEW, - db_index=True) - create_forum_thread = models.BooleanField(default=False) - - objects = models.Manager() - pending_events = PendingEventManager() - - def __unicode__(self): - return self.what - - class Meta: - ordering = ('-date_submitted', ) - - def save(self, *args, **kwargs): - self.html = site_markup(self.description) - super(Event, self).save(*args, **kwargs) - - def is_approved(self): - return self.status not in (self.NEW, self.EDIT_REQ, self.DEL_REQ) - is_approved.boolean = True - - def google_html(self): - """Returns a HTML tag to the event if it exits.""" - if self.google_url: - return u'On Google' % self.google_url - return u'' - google_html.allow_tags = True - google_html.short_description = 'Google Link' - - def notify_on_calendar(self): - """ - This function should be called when the event has been added to the - Google calendar for the first time. This gives us a chance to perform - any first-time processing, like creating a forum thread. - """ - if self.create_forum_thread: - topic_name = '%s: %s' % (self.start_date.strftime('%m/%d/%Y'), - self.what) - post_body = "%s\n\n[Link to event on Google Calendar](%s)" % ( - self.description, self.google_url) - - forums.tools.create_topic( - forum_slug=GIG_FORUM_SLUG, - user=self.user, - topic_name=topic_name, - post_body=post_body) - - self.create_forum_thread = False - self.save() - - -class AccessTokenManager(models.Manager): - """ - A manager for the AccessToken table. Only one access token is saved in the - database. This manager provides a convenience method to either return that - access token or a brand new one. 
- - """ - def get_token(self): - try: - token = self.get(pk=1) - except AccessToken.DoesNotExist: - token = AccessToken() - - return token - - -class AccessToken(models.Model): - """ - This model represents serialized OAuth access tokens for reading and - updating the Google Calendar. - - """ - auth_date = models.DateTimeField() - token = models.TextField() - - objects = AccessTokenManager() - - def __unicode__(self): - return u'Access token created on ' + unicode(self.auth_date) - - def update(self, access_token, auth_date=None): - """ - This function updates the AccessToken object with the input parameters: - access_token - an access token from Google's OAuth dance - auth_date - a datetime or None. If None, now() is used. - - """ - self.auth_date = auth_date if auth_date else datetime.datetime.now() - self.token = serialize_token(access_token) - - def access_token(self): - """ - This function returns a Google OAuth access token by deserializing the - token field from the database. - If the token attribute is empty, None is returned. - - """ - return deserialize_token(self.token) if self.token else None diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/oauth.py --- a/gpp/gcalendar/oauth.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,99 +0,0 @@ -""" -This module handles the OAuth integration with Google. - -""" -from __future__ import with_statement -import logging - -import gdata.gauth -from gdata.calendar_resource.client import CalendarResourceClient - -from django.conf import settings - - -logger = logging.getLogger(__name__) -USER_AGENT = 'surfguitar101-gcalendar-v1' -REQ_TOKEN_SESSION_KEY = 'gcalendar oauth request token' - - -def fetch_auth(request, scopes, callback_url): - """ - This function fetches a request token from Google and stores it in the - session. It then returns the authorization URL as a string. - - request - the HttpRequest object for the user requesting the token. The - token is stored in the session object attached to this request. - - scopes - a list of scope strings that the request token is for. See - http://code.google.com/apis/gdata/faq.html#AuthScopes - - callback_url - a string that is the URL that Google should redirect the user - to after the user has authorized our application access to their data. - - This function only supports RSA-SHA1 authentication. Settings in the Django - settings module determine the consumer key and path to the RSA private key. - """ - logger.info("fetch_auth started; callback url='%s'", callback_url) - client = CalendarResourceClient(None, source=USER_AGENT) - - with open(settings.GOOGLE_OAUTH_PRIVATE_KEY_PATH, 'r') as f: - rsa_key = f.read() - logger.info("read RSA key; now getting request token") - - request_token = client.GetOAuthToken( - scopes, - callback_url, - settings.GOOGLE_OAUTH_CONSUMER_KEY, - rsa_private_key=rsa_key) - - logger.info("received token") - request.session[REQ_TOKEN_SESSION_KEY] = request_token - - auth_url = request_token.generate_authorization_url() - logger.info("generated auth url '%s'", str(auth_url)) - - return str(auth_url) - - -def get_access_token(request): - """ - This function should be called after Google has sent the user back to us - after the user authorized us. We retrieve the oauth token from the request - URL and then upgrade it to an access token. We then return the access token. 
- - """ - logger.info("get_access_token called as '%s'", request.get_full_path()) - - saved_token = request.session.get(REQ_TOKEN_SESSION_KEY) - if saved_token is None: - logger.error("saved request token not found in session!") - return None - - logger.info("extracting token...") - request_token = gdata.gauth.AuthorizeRequestToken(saved_token, - request.build_absolute_uri()) - - logger.info("upgrading to access token...") - - client = CalendarResourceClient(None, source=USER_AGENT) - access_token = client.GetAccessToken(request_token) - - logger.info("upgraded to access token...") - return access_token - - -def serialize_token(token): - """ - This function turns a token into a string and returns it. - - """ - return gdata.gauth.TokenToBlob(token) - - -def deserialize_token(s): - """ - This function turns a string into a token returns it. The string must have - previously been created with serialize_token(). - - """ - return gdata.gauth.TokenFromBlob(s) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/static/css/gcalendar.css --- a/gpp/gcalendar/static/css/gcalendar.css Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,7 +0,0 @@ -.markItUp { - width: 600px; -} -.markItUpEditor { - width:543px; - height:200px; -} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/static/js/gcalendar.js --- a/gpp/gcalendar/static/js/gcalendar.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,33 +0,0 @@ -$(document).ready(function() { - $('#id_start_date').datepicker({constrainInput: true, - dateFormat: 'mm/dd/yy', - onClose: function () { - var end = $('#id_end_date'); - if (this.value > end.val()) - { - end.val(this.value); - } - } - }); - $('#id_end_date').datepicker({constrainInput: true, - dateFormat: 'mm/dd/yy', - onClose: function () { - var start = $('#id_start_date'); - if (this.value < start.val()) - { - start.val(this.value); - } - } - }); - if ($('#id_all_day:checked').length) - { - $('#id_start_time').hide(); - $('#id_end_time').hide(); - $('#id_tz_stuff').hide(); - } - $('#id_all_day').click(function () { - $('#id_start_time').toggle(); - $('#id_end_time').toggle(); - $('#id_tz_stuff').toggle(); - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/static/js/gcalendar_edit.js --- a/gpp/gcalendar/static/js/gcalendar_edit.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,24 +0,0 @@ -$(document).ready(function() { - $('.gcal-del').click(function () { - if (confirm('Really delete this event?')) { - var id = this.id; - if (id.match(/gcal-(\d+)/)) { - $.ajax({ - url: '/calendar/delete/', - type: 'POST', - data: { id : RegExp.$1 }, - dataType: 'text', - success: function (id) { - var id = '#gcal-' + id; - $(id).parents('li').hide('normal'); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + - xhr.responseText); - } - }); - } - } - return false; - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/urls.py --- a/gpp/gcalendar/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,14 +0,0 @@ -""" -URLs for the gcalendar application. 
-""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('gcalendar.views', - url(r'^$', 'index', name='gcalendar-index'), - url(r'^add/$', 'add_event', name='gcalendar-add'), - url(r'^change/$', 'edit_events', name='gcalendar-edit_events'), - url(r'^change/(\d+)/$', 'edit_event', name='gcalendar-edit_event'), - url(r'^delete/$', 'delete_event', name='gcalendar-delete'), - url(r'^thanks/add/$', 'add_thanks', name='gcalendar-add_thanks'), - url(r'^thanks/change/$', 'edit_thanks', name='gcalendar-edit_thanks'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/gcalendar/views.py --- a/gpp/gcalendar/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,122 +0,0 @@ -""" -Views for the gcalendar application. -""" - -from django.contrib.auth.decorators import login_required -from django.core.urlresolvers import reverse -from django.http import HttpResponse -from django.http import HttpResponseBadRequest -from django.http import HttpResponseForbidden -from django.http import HttpResponseRedirect -from django.http import Http404 -from django.shortcuts import render_to_response -from django.shortcuts import get_object_or_404 -from django.template import RequestContext - -from gcalendar.forms import EventEntryForm -from gcalendar.models import Event - - -def index(request): - user = request.user - if user.is_authenticated(): - profile = user.get_profile() - tz = profile.time_zone - else: - tz = 'US/Pacific' - - return render_to_response('gcalendar/index.html', { - 'tz': tz, - }, - context_instance = RequestContext(request)) - - -@login_required -def add_event(request): - if request.method == 'POST': - form = EventEntryForm(request.POST) - if form.is_valid(): - event = form.save(commit=False) - event.user = request.user - event.repeat = 'none' - event.save() - return HttpResponseRedirect(reverse('gcalendar-add_thanks')) - else: - form = EventEntryForm() - - return render_to_response('gcalendar/event.html', { - 'title': 'Add Calendar Event', - 'form': form, - }, - context_instance = RequestContext(request)) - - -@login_required -def add_thanks(request): - return render_to_response('gcalendar/thanks_add.html', { - }, - context_instance = RequestContext(request)) - - -@login_required -def edit_events(request): - events = Event.objects.filter(user=request.user, status=Event.ON_CAL).order_by('start_date') - return render_to_response('gcalendar/edit.html', { - 'events': events, - }, - context_instance = RequestContext(request)) - - -@login_required -def edit_event(request, event_id): - event = get_object_or_404(Event, pk=event_id) - if event.user != request.user: - raise Http404 - - if request.method == 'POST': - form = EventEntryForm(request.POST, instance=event) - if form.is_valid(): - event = form.save(commit=False) - event.user = request.user - event.repeat = 'none' - event.status = Event.EDIT_REQ - event.save() - return HttpResponseRedirect(reverse('gcalendar-edit_thanks')) - else: - form = EventEntryForm(instance=event) - - return render_to_response('gcalendar/event.html', { - 'title': 'Change Calendar Event', - 'form': form, - }, - context_instance = RequestContext(request)) - - -@login_required -def edit_thanks(request): - return render_to_response('gcalendar/thanks_edit.html', { - }, - context_instance = RequestContext(request)) - - -def delete_event(request): - """This view marks an event for deletion. 
It is called via AJAX.""" - if request.user.is_authenticated(): - id = request.POST.get('id', None) - if id is None or not id.isdigit(): - return HttpResponseBadRequest() - try: - event = Event.objects.get(pk=id) - except Event.DoesNotExist: - return HttpResponseBadRequest() - if request.user != event.user: - return HttpResponseForbidden() - - event.status = Event.DEL_REQ - event.save() - return HttpResponse(id) - - return HttpResponseForbidden() - - -# vim: ts=4 sw=4 diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/irc/models.py --- a/gpp/irc/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,15 +0,0 @@ -"""Models for the IRC application. -The IRC application simply reports who is in the site's IRC chatroom. A bot in the channel updates -the table and we read it. -""" -from django.db import models - -class IrcChannel(models.Model): - name = models.CharField(max_length=30) - last_update = models.DateTimeField() - - def __unicode__(self): - return self.name - - class Meta: - ordering = ('name', ) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/irc/templatetags/irc_tags.py --- a/gpp/irc/templatetags/irc_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,14 +0,0 @@ -""" -Template tags for the IRC application. -""" -from django import template -from irc.models import IrcChannel - -register = template.Library() - -@register.inclusion_tag('irc/irc_block.html') -def irc_status(): - nicks = IrcChannel.objects.all() - return { - 'nicks': nicks, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/irc/urls.py --- a/gpp/irc/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,6 +0,0 @@ -"""urls for the IRC application""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('irc.views', - url(r'^$', 'view', name='irc-main'), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/irc/views.py --- a/gpp/irc/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,12 +0,0 @@ -"""views for the IRC application""" - -from django.shortcuts import render_to_response -from django.template import RequestContext - -from irc.models import IrcChannel - -def view(request): - nicks = IrcChannel.objects.all() - return render_to_response('irc/view.html', - {'nicks': nicks}, - context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/data.py --- a/gpp/legacy/data.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,31 +0,0 @@ -""" -Misc data for the legacy management commands. - -""" - -# Over time various users asked me to change their username. The legacy site -# rarely stored foreign keys to users; instead it stored the name of the user -# at the time. This dictionary contains mappings from old usernames to new -# usernames. 
- -KNOWN_USERNAME_CHANGES = { - 'cavefishbutchdelux': 'butchdelux', - 'findicator1': 'WaveOhhh', - 'tikimania': 'Tikitena', - 'sandyfeet': 'RickRhoades', - 'crumb': 'crumble', - 'allenbridgewater': 'Outerwave_Allen', - 'reddtyde': 'Redd_Tyde', - 'fendershowman63': 'Abe', - 'hearteater': 'JoshHeartless', - 'surfdaddy': 'zzero', - 'frisbie': 'zzero', - 'retroactivegammarays': 'Retroactive_Taj', - 'mrrebel': 'Eddie_Bertrand', - 'doublecoil': 'Showman', - 'tsunami_tom': 'TomH', - 'davidj': 'davidphantomatic', - 'svd': 'Bilge_Rat', - 'dave_ledude': 'DaveF', -} - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/html2md.py --- a/gpp/legacy/html2md.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,291 +0,0 @@ -""" -This module contains a class derived from Python's HTMLParser to convert HTML to -Markdown. Currently this class only supports those HTML tags that have counter- -parts in BBCode used by stock phpBB 2.x. - -In other words, this class was created to help convert data from a phpBB -forum to Markdown syntax and its scope is currently limited to that task. - -""" -from HTMLParser import HTMLParser -import htmlentitydefs - - -# Let's call Markdown markup entities "elements" to avoid confusion -# with HTML tags. - -class ElementBase(object): - """ - Base class for all Markdown elements. - - """ - def __init__(self, attrs=None): - self.data = u'' - self.attrs = dict(attrs) if attrs else {} - - def add_data(self, data): - self.data += data - - def markdown(self): - return self.data - - -class TextElement(ElementBase): - """ - TextElements represent text fragments not inside HTML tags. - """ - pass - - -class EmphasisElement(ElementBase): - """ - An EmphasisElement is a Markdown element used to indicate emphasis and is - represented by placing characters around text. E.g. _em_, **bold** - - """ - def __init__(self, tag, attrs): - super(EmphasisElement, self).__init__(attrs) - self.tag = tag - - def markdown(self): - return u'%s%s%s' % (self.tag, self.data, self.tag) - - -def create_emphasis(tag): - """ - Returns a function that creates an EmphasisElement using the supplied - tag. - - """ - def inner(attrs): - return EmphasisElement(tag, attrs) - return inner - - -class HtmlElement(ElementBase): - """ - Markdown also accepts HTML markup. This element represents a HTML tag that - maps to itself in Markdown. - - """ - def __init__(self, tag, attrs): - super(HtmlElement, self).__init__(attrs) - self.tag = tag - - def markdown(self): - return u'<%s>%s</%s>' % (self.tag, self.data, self.tag) - - -def create_html(tag): - """ - Returns a function that creates a HtmlElement using the supplied tag. - - """ - def inner(attrs): - return HtmlElement(tag, attrs) - return inner - - -class QuoteElement(ElementBase): - """ - Class to represent a blockquote in Markdown. - - """ - def markdown(self): - return u'> %s\n\n' % self.data.replace('\n', '\n> ') - - -class BreakElement(ElementBase): - """ - Class to represent a linebreak in Markdown. - - """ - def markdown(self): - return u' \n' - - -class DivElement(ElementBase): - """ - This class maps a HTML <div>
into a block of text surrounded by newlines. - - """ - def markdown(self): - return u'\n%s\n' % self.data - - -class LinkElement(ElementBase): - """ - This class maps HTML <a> tags into Markdown links. - If no data is present, the actual href is used for the link text. - - """ - def markdown(self): - try: - url = self.attrs['href'] - except KeyError: - return self.data if self.data else u'' - - text = self.data if self.data else url - return u'[%s](%s)' % (text, url) - - -class ImageElement(ElementBase): - """ - This class maps HTML <img> tags into Markdown. - This element assumes no alt text is present, and simply uses the word - 'image' for the alt text. - - """ - def markdown(self): - try: - url = self.attrs['src'] - except KeyError: - return u' (missing image) ' - return u'![image](%s)' % url - - -class CodeElement(ElementBase): - """ - This class is used to create code blocks in Markdown. - - """ - def markdown(self): - return u' %s\n' % self.data.replace('\n', '\n ') - - -# List (ordered & unordered) support: - -class ListElement(ElementBase): - """ - This class creates Markdown for unordered lists. The bullet() method can be - overridden to create ordered lists. - - """ - def __init__(self, attrs=None): - super(ListElement, self).__init__(attrs) - self.items = [] - self.list_nesting = 1 - - def add_data(self, data): - self.items.append(data) - - def bullet(self): - return u'*' - - def markdown(self): - bullet_str = self.bullet() - indent = u' ' * (4 * (self.list_nesting - 1)) - s = u'' - for item in self.items: - s += u'\n%s%s %s' % (indent, bullet_str, item) - return s - - -class OrderedListElement(ListElement): - """ - This class creates Markdown for ordered lists. - - """ - def bullet(self): - return '1.' - - -class ItemElement(ElementBase): - """ - This element is used to represent ordered & unordered list items. - - """ - pass - -############################################################################### -############################################################################### - -class MarkdownWriter(HTMLParser): - """ - This class is an HTMLParser that converts a subset of HTML to Markdown. 
- - """ - - elem_factories = { - 'a': LinkElement, - 'blockquote': QuoteElement, - 'br': BreakElement, - 'div': DivElement, - 'em': create_emphasis('_'), - 'img': ImageElement, - 'li': ItemElement, - 'ol': OrderedListElement, - 'pre': CodeElement, - 's': create_html('strike'), - 'strong': create_emphasis('**'), - 'u': create_html('u'), - 'ul': ListElement, - } - - def __init__(self): - HTMLParser.__init__(self) - self.reset() - - def handle_starttag(self, tag, attrs): - if tag in self.elem_factories: - factory = self.elem_factories[tag] - element = factory(attrs) - else: - element = TextElement() - - self._push_elem(element) - - def handle_endtag(self, tag): - self._pop_elem() - - def handle_data(self, data): - if len(self.elem_stack) == 0: - self._push_elem(TextElement()) - self._add_data(data) - - def handle_entityref(self, name): - try: - text = unichr(htmlentitydefs.name2codepoint[name]) - except KeyError: - text = name - self.handle_data(text) - - def handle_charref(self, name): - self.handle_data(unichr(int(name))) - - def reset(self): - HTMLParser.reset(self) - self.elem_stack = [] - self.elements = [] - self.list_nesting = 0 - - def _push_elem(self, tag): - if len(self.elem_stack) and isinstance(self.elem_stack[-1], TextElement): - self._pop_elem() - if isinstance(tag, ListElement): - self.list_nesting += 1 - tag.list_nesting = self.list_nesting - self.elem_stack.append(tag) - - def _pop_elem(self): - try: - element = self.elem_stack.pop() - except IndexError: - # pop from empty list => bad HTML input; ignore it - return - - if isinstance(element, ListElement): - self.list_nesting -= 1 - if len(self.elem_stack): - self.elem_stack[-1].add_data(element.markdown()) - else: - self.elements.append(element) - - def _add_data(self, data): - self.elem_stack[-1].add_data(data) - - def markdown(self): - while len(self.elem_stack): - self._pop_elem() - text_list = [e.markdown() for e in self.elements] - return u''.join(text_list) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/fix_potd_smiles.py --- a/gpp/legacy/management/commands/fix_potd_smiles.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,44 +0,0 @@ -""" -This command fixes the old 1.0 smiley system to match the new scheme. - -""" -from django.core.management.base import NoArgsCommand -from comments.models import Comment - - -SMILEY_MAP = { - ':confused:': ':?', - ':upset:': ':argh:', - ':eek:': ':shock:', - ':rolleyes:': ':whatever:', - ':mad:': 'X-(', - ':shy:': ':oops:', - ':laugh:': ':lol:', - ':dead:': 'x_x', - ':cry:': ':-(', - ';)': ':wink:', - ':|': ':-|', - ';-)': ':wink:', - ':D': ':-D', - ':P': ':-P', - 'B)': '8)', - ':(': ':-(', - ':)': ':-)', -} - - -class Command(NoArgsCommand): - - def handle_noargs(self, **opts): - - comments = Comment.objects.filter(id__gt=3000) - for comment in comments: - save = False - for key, val in SMILEY_MAP.items(): - if key in comment.comment: - comment.comment = comment.comment.replace(key, val) - save = True - - if save: - comment.save() - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_download_comments.py --- a/gpp/legacy/management/commands/import_old_download_comments.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,91 +0,0 @@ -""" -import_old_download_comments.py - For importing download comments from SG101 1.0 -as csv files. 
- -""" -from __future__ import with_statement -import csv -from datetime import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User -from django.contrib.contenttypes.models import ContentType - -from downloads.models import Download, VoteRecord -from comments.models import Comment -from legacy.html2md import MarkdownWriter -import legacy.data - - -class Command(LabelCommand): - args = '' - help = 'Imports download comments from the old database in CSV format' - md_writer = MarkdownWriter() - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new object and saving it to the database. - - """ - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - try: - for row in self.reader: - self.process_row(row) - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create an object for the row - and save it in the database. - - """ - dl_id = int(row['ratinglid']) - if dl_id in (1, 2, 3, 4): - return - - try: - dl = Download.objects.get(pk=dl_id) - except Download.DoesNotExist: - return - - try: - user = User.objects.get(username=row['ratinguser']) - except User.DoesNotExist: - old_name = row['ratinguser'].lower() - try: - user = User.objects.get( - username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) - except (User.DoesNotExist, KeyError): - return - - vote_date = datetime.strptime(row['ratingtimestamp'], "%Y-%m-%d %H:%M:%S") - - comment_text = row['ratingcomments'].decode('latin-1').strip() - if comment_text: - comment = Comment( - content_type=ContentType.objects.get_for_model(dl), - object_id=dl.id, - user=user, - comment=comment_text, - creation_date=vote_date, - ip_address = row['ratinghostname'], - is_public = True, - is_removed = False, - ) - comment.save() - - vr = VoteRecord(download=dl, user=user, vote_date=vote_date) - vr.save() - - def to_markdown(self, s): - self.md_writer.reset() - self.md_writer.feed(s) - return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_downloads.py --- a/gpp/legacy/management/commands/import_old_downloads.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,133 +0,0 @@ -""" -import_old_downloads.py - For importing downloads from SG101 1.0 as csv files. 
-""" -from __future__ import with_statement -import csv -import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User - -from downloads.models import Download, Category -from legacy.html2md import MarkdownWriter - - -# downloads with these lid's will be excluded -EXCLUDE_SET = set([1, 2, 3, 4, 277]) - -# Mapping of old category IDs to new; None means we don't plan on importing -CAT_MAP = { - 4: None, # Misc - 3: None, # Music - 1: None, # Demos - 6: 2, # Gear Samples - 8: 4, # Ringtones - 9: 8, # Tablature - 10: 6, # Interviews - 11: None, # 2008 MP3 Comp - 12: 1, # Backing Tracks - 13: None, # 2009 MP3 Comp -} - -SG101_PREFIX = 'http://surfguitar101.com/' - - -class Command(LabelCommand): - args = '' - help = 'Imports downloads from the old database in CSV format' - md_writer = MarkdownWriter() - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new object and saving it to the database. - - """ - self.cats = {} - try: - self.default_user = User.objects.get(pk=2) - except User.DoesNotExist: - raise CommandError("Need a default user with pk=2") - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - try: - for row in self.reader: - self.process_row(row) - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def get_category(self, old_cat_id): - """ - Return the Category object for the row. - - """ - cat_id = CAT_MAP[old_cat_id] - if cat_id not in self.cats: - try: - cat = Category.objects.get(pk=cat_id) - except Category.DoesNotExist: - raise CommandError("Category does not exist: %s on line %s" % ( - cat_id, self.reader.line_num)) - else: - self.cats[cat_id] = cat - return self.cats[cat_id] - - def get_user(self, username): - """ - Return the user object for the given username. - If the user cannot be found, self.default_user is returned. - - """ - try: - return User.objects.get(username=username) - except User.DoesNotExist: - return self.default_user - - def process_row(self, row): - """ - Process one row from the CSV file: create an object for the row - and save it in the database. 
- - """ - lid = int(row['lid']) - if lid in EXCLUDE_SET: - return # skip - - cat = int(row['cid']) - if CAT_MAP.get(cat) is None: - return # skip this one; we aren't carrying these over - - dl_date = datetime.datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S") - old_url = row['url'].decode('latin-1') - if old_url.startswith(SG101_PREFIX): - old_url = old_url[len(SG101_PREFIX):] - if old_url.startswith('dls/'): - old_url = old_url[4:] - new_url = u'downloads/1.0/%s' % old_url - - dl = Download( - id=lid, - title=row['title'].decode('latin-1'), - category=self.get_category(cat), - description=self.to_markdown(row['description'].decode('latin-1')), - file=new_url, - user=self.get_user(row['submitter']), - date_added=dl_date, - ip_address='127.0.0.1', # not available - hits=int(row['hits']), - average_score=float(row['downloadratingsummary']) / 2.0, - total_votes=int(row['totalvotes']), - is_public=True) - dl.save() - #print "cp %s %s" % (old_url, '/home/var/django-sites/sg101/sg101-trunk/media/' + new_url) - - def to_markdown(self, s): - self.md_writer.reset() - self.md_writer.feed(s) - return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_links.py --- a/gpp/legacy/management/commands/import_old_links.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,84 +0,0 @@ -""" -import_old_links.py - For importing links from SG101 1.0 as csv files. -""" -from __future__ import with_statement -import csv -import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User - -from weblinks.models import Link, Category - - -class Command(LabelCommand): - args = '' - help = 'Imports weblinks from the old database in CSV format' - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new weblink object and saving it to the database. - - """ - self.cats = {} - try: - self.default_user = User.objects.get(pk=2) - except User.DoesNotExist: - raise CommandError("Need a default user with pk=2") - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - try: - for row in self.reader: - self.process_row(row) - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def get_category(self, row): - """ - Return the Category object for the row. - - """ - cat_id = row['cid'] - if cat_id not in self.cats: - try: - cat = Category.objects.get(pk=cat_id) - except Category.DoesNotExist: - raise CommandError("Category does not exist: %s on line %s" % ( - cat_id, self.reader.line_num)) - else: - self.cats[cat_id] = cat - return self.cats[cat_id] - - def get_user(self, username): - """ - Return the user object for the given username. - If the user cannot be found, self.default_user is returned. - - """ - try: - return User.objects.get(username=username) - except User.DoesNotExist: - return self.default_user - - def process_row(self, row): - """ - Process one row from the CSV file: create an object for the row - and save it in the database. 
- - """ - link = Link(category=self.get_category(row), - title=row['title'].decode('latin-1'), - url=row['url'].decode('latin-1'), - description=row['description'].decode('latin-1'), - user=self.get_user(row['submitter']), - date_added=datetime.datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S"), - hits=int(row['hits']), - is_public=True) - link.save() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_news.py --- a/gpp/legacy/management/commands/import_old_news.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,121 +0,0 @@ -""" -import_old_news.py - For importing news stories from SG101 1.0 as csv files. -""" -from __future__ import with_statement -import csv -import optparse -import sys -from datetime import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User - -from news.models import Category, Story -from legacy.phpbb import unescape -import legacy.data - - -class Command(LabelCommand): - args = '' - help = 'Imports news stories from the old database in CSV format' - option_list = LabelCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 20 stories to show progress"), - ) - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new story. - - """ - self.show_progress = options.get('progress') - self.users = {} - - # Create a mapping from the old database's topics to our - # Categories. - self.topics = {} - try: - self.topics[2] = Category.objects.get(slug='site-news') - self.topics[3] = Category.objects.get(slug='bands') - self.topics[4] = Category.objects.get(slug='show-announcements') - self.topics[5] = Category.objects.get(slug='show-reports') - self.topics[6] = Category.objects.get(slug='gear') - self.topics[7] = Category.objects.get(slug='reviews') - self.topics[8] = Category.objects.get(slug='surf-scene-news') - self.topics[9] = Category.objects.get(slug='articles') - self.topics[10] = Category.objects.get(slug='interviews') - self.topics[11] = Category.objects.get(slug='tablature') - self.topics[12] = Category.objects.get(slug='featured-videos') - except Category.DoesNotExist: - sys.exit("Category does not exist; check topic mapping.") - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - num_rows = 0 - try: - for row in self.reader: - self.process_row(row) - num_rows += 1 - if self.show_progress and num_rows % 20 == 0: - sys.stdout.write('.') - sys.stdout.flush() - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - print - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create a Story object for - the row and save it in the database. - - """ - row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) - - try: - submitter = self._get_user(row['informant']) - except User.DoesNotExist: - print "Could not find user %s for story %s; skipping." 
% ( - row['informant'], row['sid']) - return - - story = Story(id=int(row['sid']), - title=unescape(row['title'].decode('latin-1')), - submitter=submitter, - category=self.topics[int(row['topic'])], - short_text=row['hometext'].decode('latin-1'), - long_text=row['bodytext'].decode('latin-1'), - date_submitted=datetime.strptime(row['time'], "%Y-%m-%d %H:%M:%S"), - allow_comments=True) - - story.save() - - def _get_user(self, username): - """ - Returns the user object with the given username. - Throws User.DoesNotExist if not found. - - """ - try: - return self.users[username] - except KeyError: - pass - - try: - user = User.objects.get(username=username) - except User.DoesNotExist: - old_name = username.lower() - try: - user = User.objects.get( - username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) - except KeyError: - raise User.DoesNotExist - - self.users[username] = user - return user diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_news_comments.py --- a/gpp/legacy/management/commands/import_old_news_comments.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,125 +0,0 @@ -""" -import_old_news_comments.py - For importing comments on news stories from SG101 1.0 as csv files. -""" -from __future__ import with_statement -import csv -import optparse -import sys -from datetime import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User -from django.contrib.contenttypes.models import ContentType - -from comments.models import Comment -from news.models import Story -import legacy.data -from legacy.html2md import MarkdownWriter - - -class Command(LabelCommand): - args = '' - help = 'Imports news story comments from the old database in CSV format' - option_list = LabelCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 20 comments to show progress"), - ) - md_writer = MarkdownWriter() - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new story comment. - - """ - self.show_progress = options.get('progress') - self.users = {} - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - num_rows = 0 - try: - for row in self.reader: - self.process_row(row) - num_rows += 1 - if self.show_progress and num_rows % 20 == 0: - sys.stdout.write('.') - sys.stdout.flush() - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - print - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create a Comment object for - the row and save it in the database. - - """ - row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) - - try: - user = self._get_user(row['name']) - except User.DoesNotExist: - print "Could not find user %s for comment %s; skipping." % ( - row['name'], row['tid']) - return - - try: - story = Story.objects.get(id=int(row['sid'])) - except Story.DoesNotExist: - print "Could not find story %s for comment %s; skipping." 
% ( - row['sid'], row['tid']) - return - - comment = Comment( - id=int(row['tid']), - content_type = ContentType.objects.get_for_model(story), - object_id = story.id, - user = user, - comment = self.to_markdown(row['comment']), - creation_date = datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S"), - ip_address = row['host_name'], - is_public = True, - is_removed = False, - ) - - comment.save() - - def _get_user(self, username): - """ - Returns the user object with the given username. - Throws User.DoesNotExist if not found. - - """ - try: - return self.users[username] - except KeyError: - pass - - try: - user = User.objects.get(username=username) - except User.DoesNotExist: - old_name = username.lower() - try: - user = User.objects.get( - username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) - except KeyError: - raise User.DoesNotExist - - self.users[username] = user - return user - - def to_markdown(self, s): - self.md_writer.reset() - - if not isinstance(s, unicode): - s = s.decode('latin-1', 'replace') - - self.md_writer.feed(s) - return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_podcasts.py --- a/gpp/legacy/management/commands/import_old_podcasts.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,63 +0,0 @@ -""" -import_old_podcasts.py - For importing podcasts from SG101 1.0 as csv files. -""" -from __future__ import with_statement -import csv -import datetime - -from django.core.management.base import LabelCommand, CommandError - -from podcast.models import Channel, Item - - -class Command(LabelCommand): - args = '' - help = 'Imports podcasts from the old database in CSV format' - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new weblink object and saving it to the database. - - """ - try: - self.channel = Channel.objects.get(pk=1) - except Channel.DoesNotExist: - raise CommandError("Need a default channel with pk=1") - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - try: - for row in self.reader: - self.process_row(row) - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create an object for the row - and save it in the database. - - """ - item = Item(channel=self.channel, - title=row['title'], - author=row['author'], - subtitle=row['subtitle'], - summary=row['summary'], - enclosure_url=row['enclosure_url'], - alt_enclosure_url='', - enclosure_length=int(row['enclosure_length']), - enclosure_type=row['enclosure_type'], - guid=row['guid'], - pubdate=datetime.datetime.strptime(row['pubdate'], - "%Y-%m-%d %H:%M:%S"), - duration=row['duration'], - keywords=row['keywords'], - explicit=row['explicit']) - - item.save() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_potd.py --- a/gpp/legacy/management/commands/import_old_potd.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,127 +0,0 @@ -""" -import_old_potd.py - For importing POTD's from SG101 1.0 as csv files. 
- -""" -from __future__ import with_statement -import csv -import optparse -import sys -from datetime import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User - -from potd.models import Photo -from legacy.phpbb import unescape -import legacy.data - - -ID_OFFSET = 100 - - -class PathError(Exception): - pass - -def convert_path(old_path): - """ - Converts the old POTD path to a new one. - - """ - if old_path.startswith('images/potd/'): - return "potd/1.0/%s" % old_path[12:] - else: - raise PathError("Unknown path %s" % old_path) - - -class Command(LabelCommand): - args = '' - help = "Imports POTD's from the old database in CSV format" - option_list = LabelCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 20 items to show progress"), - ) - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new Photo - - """ - self.show_progress = options.get('progress') - self.users = {} - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - num_rows = 0 - try: - for row in self.reader: - self.process_row(row) - num_rows += 1 - if self.show_progress and num_rows % 20 == 0: - sys.stdout.write('.') - sys.stdout.flush() - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - print - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create a Photo object for - the row and save it in the database. - - """ - try: - submitter = self._get_user(row['submitted_by'].decode('latin-1')) - except User.DoesNotExist: - print "Could not find user %s for potd %s; skipping." % ( - row['submitted_by'], row['pid']) - return - - desc = row['description'].decode('latin-1').replace('\n', '\n
') - - try: - photo = Photo( - id=int(row['pid']) + ID_OFFSET, - photo=convert_path(row['photo_path']), - thumb=convert_path(row['thumb_path']), - caption=unescape(row['title'].decode('latin-1')), - description=desc, - user=submitter, - date_added=datetime.strptime(row['date_added'], - "%Y-%m-%d %H:%M:%S"), - potd_count=int(row['chosen_count'])) - except PathError, ex: - self.stderr.write("\n%s, skipping\n" % ex) - return - - photo.save() - - def _get_user(self, username): - """ - Returns the user object with the given username. - Throws User.DoesNotExist if not found. - - """ - try: - return self.users[username] - except KeyError: - pass - - try: - user = User.objects.get(username=username) - except User.DoesNotExist: - old_name = username.lower() - try: - user = User.objects.get( - username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) - except KeyError: - raise User.DoesNotExist - - self.users[username] = user - return user diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_potd_comments.py --- a/gpp/legacy/management/commands/import_old_potd_comments.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,141 +0,0 @@ -""" -import_old_potd_comments.py - For importing comments on POTD's from SG101 1.0 -as csv files. - -""" -from __future__ import with_statement -import csv -import optparse -import sys -from datetime import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User -from django.contrib.contenttypes.models import ContentType - -from comments.models import Comment -from potd.models import Photo -import legacy.data -from legacy.html2md import MarkdownWriter - - -PHOTO_ID_OFFSET = 100 -ID_OFFSET = 3000 - - -class Command(LabelCommand): - args = '' - help = 'Imports POTD comments from the old database in CSV format' - option_list = LabelCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 20 items to show progress"), - optparse.make_option("--fix-mode", action="store_true", - help="Only create comments if they don't exist already"), - ) - md_writer = MarkdownWriter() - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new POTD comment. - - """ - self.show_progress = options.get('progress') - self.fix_mode = options.get('fix_mode') - self.users = {} - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - num_rows = 0 - try: - for row in self.reader: - self.process_row(row) - num_rows += 1 - if self.show_progress and num_rows % 20 == 0: - sys.stdout.write('.') - sys.stdout.flush() - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - print - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create a Comment object for - the row and save it in the database. - - """ - comment_id = int(row['cid']) + ID_OFFSET - - if self.fix_mode: - try: - c = Comment.objects.get(pk=comment_id) - except Comment.DoesNotExist: - pass - else: - return - - try: - user = self._get_user(row['username'].decode('latin-1')) - except User.DoesNotExist: - print "Could not find user %s for comment %s; skipping." 
% ( - row['username'], row['cid']) - return - - pid = int(row['pid']) + PHOTO_ID_OFFSET - try: - photo = Photo.objects.get(id=pid) - except Photo.DoesNotExist: - print "Could not find photo %s for comment %s; skipping." % ( - pid, row['cid']) - return - - comment = Comment( - id=comment_id, - content_type=ContentType.objects.get_for_model(photo), - object_id=photo.id, - user=user, - comment=self.to_markdown(row['comment'].decode('latin-1')), - creation_date=datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S"), - ip_address='192.0.2.0', # TEST-NET - is_public=True, - is_removed=False, - ) - - comment.save() - - def _get_user(self, username): - """ - Returns the user object with the given username. - Throws User.DoesNotExist if not found. - - """ - try: - return self.users[username] - except KeyError: - pass - - try: - user = User.objects.get(username=username) - except User.DoesNotExist: - old_name = username.lower() - try: - user = User.objects.get( - username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) - except KeyError: - raise User.DoesNotExist - - self.users[username] = user - return user - - def to_markdown(self, s): - - s = s.replace('\n', '\n
') - self.md_writer.reset() - self.md_writer.feed(s) - return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_topics.py --- a/gpp/legacy/management/commands/import_old_topics.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,117 +0,0 @@ -""" -import_old_topics.py - For importing forum topics (threads) from SG101 1.0 as -csv files. - -""" -from __future__ import with_statement -import csv -import optparse -import sys -from datetime import datetime - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User - -from forums.models import Forum, Topic -from legacy.phpbb import unescape - - -class Command(LabelCommand): - args = '' - help = 'Imports forum topics from the old database in CSV format' - option_list = LabelCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 20 topics to show progress"), - ) - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new topic. - - """ - self.show_progress = options.get('progress') - self.users = {} - - # Create a mapping from the old database's forums to our - # forums - self.forums = {} - try: - self.forums[2] = Forum.objects.get(slug='suggestion-box') - self.forums[3] = Forum.objects.get(slug='surf-music') - self.forums[4] = Forum.objects.get(slug='surf-musician') - self.forums[5] = Forum.objects.get(slug='gear') - self.forums[6] = Forum.objects.get(slug='recording-corner') - self.forums[7] = Forum.objects.get(slug='shallow-end') - self.forums[8] = Forum.objects.get(slug='surfguitar101-website') - self.forums[9] = Forum.objects.get(id=15) - self.forums[10] = Forum.objects.get(slug='for-sale-trade') - self.forums[11] = Forum.objects.get(slug='musicians-gigs-wanted') - self.forums[12] = Forum.objects.get(slug='surf-videos') - self.forums[13] = Forum.objects.get(slug='sg101-podcast') - self.forums[14] = Forum.objects.get(slug='gigs') - self.forums[15] = Forum.objects.get(slug='music-reviews') - self.forums[18] = Forum.objects.get(slug='best-sg101') - except Forum.DoesNotExist: - sys.exit("Forum does not exist; check forum mapping.") - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - num_rows = 0 - try: - for row in self.reader: - self.process_row(row) - num_rows += 1 - if self.show_progress and num_rows % 20 == 0: - sys.stdout.write('.') - sys.stdout.flush() - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - print - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create a Story object for - the row and save it in the database. - - """ - row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) - - if row['topic_moved_id'] != '0': - return - - try: - user = User.objects.get(id=int(row['topic_poster'])) - except User.DoesNotExist: - print "Could not find user %s for topic %s; skipping." 
% ( - row['topic_poster'], row['topic_id']) - return - - creation_date = datetime.fromtimestamp(float(row['topic_time'])) - - title = row['topic_title'].decode('latin-1', 'replace') - - try: - forum = self.forums[int(row['forum_id'])] - except KeyError: - print 'skipping topic "%s"' % title - return - - topic = Topic(id=int(row['topic_id']), - forum=forum, - name=unescape(title), - creation_date=creation_date, - user=user, - view_count=int(row['topic_views']), - sticky=(int(row['topic_type']) != 0), - locked=(int(row['topic_status']) != 0), - update_date=creation_date) - - topic.save() - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/import_old_users.py --- a/gpp/legacy/management/commands/import_old_users.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,163 +0,0 @@ -""" -import_old_users.py - For importing users from SG101 1.0 as csv files. -""" -from __future__ import with_statement -import csv -import optparse -import re -import sys -from datetime import datetime - -import postmarkup - -from django.core.management.base import LabelCommand, CommandError -from django.contrib.auth.models import User - -import bio.models -from legacy.phpbb import unphpbb -from legacy.html2md import MarkdownWriter - -TIME_ZONES = { - '-5': 'US/Eastern', - '-6': 'US/Central', - '-7': 'US/Mountain', - '-8': 'US/Pacific', -} -USERNAME_RE = re.compile(r'^[\w.@+-]+$') -USERNAME_LEN = (1, 30) # min & max length values - - -def _valid_username(username): - """ - Return true if the username is valid. - """ - return (USERNAME_LEN[0] <= len(username) <= USERNAME_LEN[1] and - USERNAME_RE.match(username)) - - -def _break_name(name): - """ - Break name into a first and last name. - Return a 2-tuple of first_name, last_name. - """ - parts = name.split() - n = len(parts) - if n == 0: - t = '', '' - elif n == 1: - t = parts[0], '' - else: - t = ' '.join(parts[:-1]), parts[-1] - return t[0][:USERNAME_LEN[1]], t[1][:USERNAME_LEN[1]] - - -class Command(LabelCommand): - args = '' - help = 'Imports users from the old database in CSV format' - option_list = LabelCommand.option_list + ( - optparse.make_option("-s", "--super-user", - help="Make the user with this name a superuser"), - optparse.make_option("-a", "--anon-user", - help="Make the user with this name the anonymous user " - "[default: Anonymous]"), - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 20 users to show progress"), - ) - bb_parser = postmarkup.create(use_pygments=False, annotate_links=False) - md_writer = MarkdownWriter() - - def handle_label(self, filename, **options): - """ - Process each line in the CSV file given by filename by - creating a new user and profile. 
- - """ - self.superuser = options.get('super_user') - self.anonymous = options.get('anon_user') - if self.anonymous is None: - self.anonymous = 'Anonymous' - self.show_progress = options.get('progress') - - if self.superuser == self.anonymous: - raise CommandError("super-user name should not match anon-user") - - try: - with open(filename, "rb") as f: - self.reader = csv.DictReader(f) - num_rows = 0 - try: - for row in self.reader: - self.process_row(row) - num_rows += 1 - if self.show_progress and num_rows % 20 == 0: - sys.stdout.write('.') - sys.stdout.flush() - except csv.Error, e: - raise CommandError("CSV error: %s %s %s" % ( - filename, self.reader.line_num, e)) - - print - - except IOError: - raise CommandError("Could not open file: %s" % filename) - - def process_row(self, row): - """ - Process one row from the CSV file: create a user and user profile for - the row and save it in the database. - - """ - row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) - - if not _valid_username(row['username']): - print "Skipping import of %s; invalid username" % row['username'] - return - - n = User.objects.filter(username=row['username']).count() - if n > 0: - print "Skipping import of %s; user already exists" % row['username'] - return - - first_name, last_name = _break_name(row['name']) - is_superuser = self.superuser == row['username'] - is_anonymous = self.anonymous == row['username'] - - u = User(id=int(row['user_id']), - username=row['username'], - first_name=first_name, - last_name=last_name, - email=row['user_email'], - password=row['user_password'] if row['user_password'] else None, - is_staff=is_superuser, - is_active=True if not is_anonymous else False, - is_superuser=is_superuser, - last_login=datetime.fromtimestamp(int(row['user_lastvisit'])), - date_joined=datetime.strptime(row['user_regdate'], "%b %d, %Y")) - - if is_anonymous: - u.set_unusable_password() - - u.save() - - p = u.get_profile() - p.location = row['user_from'].decode('latin-1') - p.occupation = row['user_occ'].decode('latin-1') - p.interests = row['user_interests'].decode('latin-1') - p.profile_text = u'' - p.hide_email = True if row['user_viewemail'] != '1' else False - p.signature = self.to_markdown(row['user_sig']) if row['user_sig'] else u'' - p.time_zone = TIME_ZONES.get(row['user_timezone'], 'US/Pacific') - p.use_24_time = False - p.forum_post_count = int(row['user_posts']) - p.status = bio.models.STA_ACTIVE if p.forum_post_count > 10 else bio.models.STA_STRANGER - p.status_date = datetime.now() - p.update_date = p.status_date - p.save() - - def to_html(self, s): - return self.bb_parser.render_to_html(unphpbb(s), cosmetic_replace=False) - - def to_markdown(self, s): - self.md_writer.reset() - self.md_writer.feed(self.to_html(s)) - return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/management/commands/translate_old_posts.py --- a/gpp/legacy/management/commands/translate_old_posts.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,134 +0,0 @@ -""" -translate_old_posts.py - A management command to join the bbposts and -bbposts_text tables together and output as a .csv file, suitable for use as an -input to mysqlimport into the new database. This method bypasses the Django ORM -as it was too slow given the number of old posts to import. 
- -""" -from __future__ import with_statement -import csv -import optparse -from datetime import datetime - -import MySQLdb -import postmarkup - -from django.core.management.base import NoArgsCommand, CommandError - -from legacy.phpbb import unphpbb -from legacy.html2md import MarkdownWriter -from core.markup import SiteMarkup - - -def convert_ip(s): - """ - Converts a hex string representing an IP address into dotted notation. - """ - n = int(s, 16) - return "%d.%d.%d.%d" % ( - ((n >> 24) & 0xff), - ((n >> 16) & 0xff), - ((n >> 8) & 0xff), - n & 0xff) - - -class Command(NoArgsCommand): - help = """\ -This command joins the SG101 1.0 posts to 2.0 format and outputs the -data as a .csv file suitable for importing into the new database scheme with -the mysqlimport utility. -""" - option_list = NoArgsCommand.option_list + ( - optparse.make_option("-s", "--progress", action="store_true", - help="Output a . after every 100 posts to show progress"), - optparse.make_option("-a", "--host", help="set MySQL host name"), - optparse.make_option("-u", "--user", help="set MySQL user name"), - optparse.make_option("-p", "--password", help="set MySQL user password"), - optparse.make_option("-d", "--database", help="set MySQL database name"), - optparse.make_option("-o", "--out-file", help="set output filename"), - ) - bb_parser = postmarkup.create(use_pygments=False, annotate_links=False) - md_writer = MarkdownWriter() - site_markup = SiteMarkup() - - def handle_noargs(self, **opts): - - host = opts.get('host', 'localhost') or 'localhost' - user = opts.get('user', 'root') or 'root' - password = opts.get('password', '') or '' - database = opts.get('database') - out_filename = opts.get('out_file', 'forums_post.csv') or 'forums_post.csv' - - if database is None: - raise CommandError("Please specify a database option") - - out_file = open(out_filename, "wb") - - # database columns (fieldnames) for the output CSV file: - cols = ('id', 'topic_id', 'user_id', 'creation_date', 'update_date', - 'body', 'html', 'user_ip') - self.writer = csv.writer(out_file) - - # Write an initial row of fieldnames to the output file - self.writer.writerow(cols) - - # connect to the legacy database - try: - db = MySQLdb.connect(host=host, - user=user, - passwd=password, - db=database) - except MySQLdb.DatabaseError, e: - raise CommandError(str(e)) - - c = db.cursor(MySQLdb.cursors.DictCursor) - - # query the legacy database - sql = ('SELECT * FROM sln_bbposts as p, sln_bbposts_text as t WHERE ' - 'p.post_id = t.post_id ORDER BY p.post_id') - c.execute(sql) - - # convert the old data and write the output to the file - while True: - row = c.fetchone() - if row is None: - break - - self.process_row(row) - - c.close() - db.close() - out_file.close() - - def to_html(self, s): - return self.bb_parser.render_to_html(unphpbb(s), cosmetic_replace=False) - - def to_markdown(self, s): - self.md_writer.reset() - self.md_writer.feed(self.to_html(s)) - return self.md_writer.markdown() - - def process_row(self, row): - """ - This function accepts one row from the legacy database and converts the - contents to the new database format, and calls the writer to write the new - row to the output file. 
- """ - creation_date = datetime.fromtimestamp(float(row['post_time'])) - - if row['post_edit_time']: - update_date = datetime.fromtimestamp(float(row['post_edit_time'])) - else: - update_date = creation_date - - body = self.to_markdown(row['post_text']) - html = self.site_markup.convert(body) - - self.writer.writerow([row['post_id'], - row['topic_id'], - row['poster_id'], - creation_date, - update_date, - body.encode("utf-8"), - html.encode("utf-8"), - convert_ip(row['poster_ip'])]) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/models.py --- a/gpp/legacy/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,3 +0,0 @@ -from django.db import models - -# Create your models here. diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/phpbb.py --- a/gpp/legacy/phpbb.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,68 +0,0 @@ -""" -This module contains functions for working with data from the legacy phpBB -based website. -""" -import re -import htmlentitydefs - - -# BBCode tags used by the old site -BBCODE_TAGS = "b i u s url quote img list * code color size".split() - -# Regular expressions used to get rid of phpBB's uid inside BBCode tags. -# This is a list of regular expression pairs. Element 0 of each pair -# is for the opening tag & element 1 is for the closing tag. - -BBCODE_RES = [( - re.compile(r"(\[%s):(?:[0-9a-fu]+:)?[0-9a-f]{10}" % tag), - re.compile(r"(\[/%s):(?:[0-9a-fu]+:)?[0-9a-f]{10}\]" % tag) -) for tag in BBCODE_TAGS] - - -## -# Removes HTML or XML character references and entities from a text string. -# -# @param text The HTML (or XML) source text. -# @return The plain text, as a Unicode string, if necessary. -# Source: http://effbot.org/zone/re-sub.htm#unescape-html -# -def unescape(text): - def fixup(m): - text = m.group(0) - if text[:2] == "&#": - # character reference - try: - if text[:3] == "&#x": - return unichr(int(text[3:-1], 16)) - else: - return unichr(int(text[2:-1])) - except ValueError: - pass - else: - # named entity - try: - text = unichr(htmlentitydefs.name2codepoint[text[1:-1]]) - except KeyError: - pass - return text # leave as is - return re.sub("&#?\w+;", fixup, text) - - -def unphpbb(s, encoding='latin-1'): - """Converts BBCode from phpBB database data into 'pure' BBCode. - - phpBB doesn't store plain BBCode in its database. The BBCode tags have - "uids" added to them and the data has already been HTML entity'ized. - This function removes the uid stuff and undoes the entity'ification and - returns the result as a unicode string. - - If the input 's' is not already unicode, it will be decoded using the - supplied encoding. - - """ - if not isinstance(s, unicode): - s = s.decode(encoding, 'replace') - for start, end in BBCODE_RES: - s = re.sub(start, r'\1', s, re.MULTILINE) - s = re.sub(end, r'\1]', s, re.MULTILINE) - return unescape(s) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/tests.py --- a/gpp/legacy/tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -""" -Tests for legacy app functions. -""" - -from django.test import TestCase - -from legacy.phpbb import unphpbb -from legacy.html2md import MarkdownWriter - -class UnPhpBbTest(TestCase): - - def test_unentities(self): - s1 = ""Look! No head!" - Laika & The Cosmonauts" - s2 = unphpbb(s1) - s3 = u'"Look! No head!" 
- Laika & The Cosmonauts' - self.failUnlessEqual(s2, s3) - - def test_rem_uuid1(self): - s1 = ("[url=http://www.thesurfites.com][color=black:3fdb565c83]" - "T H E - S U R F I T E S[/color:3fdb565c83][/url]") - s2 = unphpbb(s1) - s3 = (u'[url=http://www.thesurfites.com][color=black]' - 'T H E - S U R F I T E S[/color][/url]') - self.failUnlessEqual(s2, s3) - - -class Html2MdTest(TestCase): - - def test_sig1(self): - s1 = """

Pollo Del Mar
-Frankie & The Pool Boys
-PDM on FaceBook
-

""" - md_writer = MarkdownWriter() - md_writer.feed(s1) - s2 = md_writer.markdown() - s3 = u'[Pollo Del Mar](http://surfguitar101.com/modules.php?name=Web_Links&l_op=visit&lid=50) \n\n[Frankie & The Pool Boys](http://tinyurl.com/yjfmspj) \n\n[PDM on FaceBook](http://tinyurl.com/cnr27t) \n\n' - self.failUnlessEqual(s2, s3) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/legacy/views.py --- a/gpp/legacy/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -# Create your views here. diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/manage.py --- a/gpp/manage.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,11 +0,0 @@ -#!/usr/bin/env python -from django.core.management import execute_manager -try: - import settings # Assumed to be in the same directory. -except ImportError: - import sys - sys.stderr.write("Error: Can't find the file 'settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run django-admin.py, passing it your settings module.\n(If the file settings.py does indeed exist, it's causing an ImportError somehow.)\n" % __file__) - sys.exit(1) - -if __name__ == "__main__": - execute_manager(settings) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/membermap/admin.py --- a/gpp/membermap/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,18 +0,0 @@ -""" -Admin definitions for the member map application models. -""" - -from django.contrib import admin - -from membermap.models import MapEntry - -class MapEntryAdmin(admin.ModelAdmin): - exclude = ('html', ) - list_display = ('user', 'location', 'lat', 'lon', 'date_updated') - list_filter = ('date_updated', ) - date_hierarchy = 'date_updated' - ordering = ('-date_updated', ) - search_fields = ('user', 'location', 'message') - raw_id_fields = ('user', ) - -admin.site.register(MapEntry, MapEntryAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/membermap/forms.py --- a/gpp/membermap/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,27 +0,0 @@ -""" -Forms for the member map application. -""" -from django import forms -from django.conf import settings - -from membermap.models import MapEntry - - -class MapEntryForm(forms.ModelForm): - location = forms.CharField(required=True, - widget=forms.TextInput(attrs={'size': 64, 'maxlength': 255})) - message = forms.CharField(required=False, - widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) - - class Meta: - model = MapEntry - fields = ('location', 'message') - - class Media: - css = { - 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + - settings.GPP_THIRD_PARTY_CSS['jquery-ui']) - } - js = (settings.GPP_THIRD_PARTY_JS['markitup'] + - settings.GPP_THIRD_PARTY_JS['jquery-ui']) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/membermap/models.py --- a/gpp/membermap/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,32 +0,0 @@ -""" -Models for the member map application. 
-""" -import datetime -from django.db import models -from django.contrib.auth.models import User - -from core.markup import site_markup - - -class MapEntry(models.Model): - """Represents a user's entry on the map.""" - user = models.ForeignKey(User) - location = models.CharField(max_length=255) - lat = models.FloatField() - lon = models.FloatField() - message = models.TextField(blank=True) - html = models.TextField(blank=True) - date_updated = models.DateTimeField() - - def __unicode__(self): - return u'Map entry for %s' % self.user.username - - class Meta: - ordering = ('-date_updated', ) - verbose_name_plural = 'map entries' - - def save(self, *args, **kwargs): - self.html = site_markup(self.message) - self.date_updated = datetime.datetime.now() - super(MapEntry, self).save(*args, **kwargs) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/membermap/static/css/membermap.css --- a/gpp/membermap/static/css/membermap.css Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,20 +0,0 @@ -#member_map_members_column { - float: left; -} -#member_map_map { - width: 720px; - height: 540px; - border: 1px solid black; - margin: 0 auto; -} -#member_map_info { - padding-top: 1em; - clear: left; -} -.markItUp { - width: 600px; -} -.markItUpEditor { - width:543px; - height:200px; -} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/membermap/static/js/membermap.js --- a/gpp/membermap/static/js/membermap.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,173 +0,0 @@ -var mmap = { - map: null, - geocoder: null, - users: Object, - userOnMap: false, - userClick: function() { - var name = $('option:selected', this).text(); - if (name != mmap.selectText) - { - mmap.clickUser(name); - } - }, - clickUser: function(name) { - pt = new GLatLng(mmap.users[name].lat, mmap.users[name].lon); - mmap.map.setCenter(pt); - mmap.users[name].marker.openInfoWindowHtml(mmap.users[name].message); - }, - clear: function() { - mmap.users.length = 0; - }, - selectText: "(select)", - onMapDir: 'You have previously added yourself to the member map. Your information appears below. You may change ' + - 'the information if you wish. To delete yourself from the map, click the Delete button.', - offMapDir: 'Your location is not on the map. If you would like to appear on the map, please fill out the form below ' + - 'and click the Submit button.' -}; -$(document).ready(function() { - if (GBrowserIsCompatible()) - { - $(window).unload(GUnload); - mmap.map = new GMap2($('#member_map_map')[0]); - mmap.map.setCenter(new GLatLng(15.0, -30.0), 2); - mmap.map.enableScrollWheelZoom(); - mmap.map.addControl(new GLargeMapControl()); - mmap.map.addControl(new GMapTypeControl()); - mmap.geocoder = new GClientGeocoder(); - - if (mmapUser.userName) - { - $.getJSON('/member_map/query/', - function(data) { - mmap.map.clearOverlays(); - var sel = $('#member_map_members'); - sel[0].length = 0; - sel.append($('
-
-{{ shout.shout_date|date:"D M d Y H:i:s" }}
-Permalink -Flag -{% ifequal user.id shout.user.id %} -Delete -{% endifequal %} - - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/shoutbox/shoutbox.html --- a/gpp/templates/shoutbox/shoutbox.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,38 +0,0 @@ -{% extends 'side_block.html' %} -{% load url from future %} -{% load core_tags %} -{% block block_title %}Shoutbox{% endblock %} -{% block block_content %} -
- {% for shout in shouts reversed %} -

- {{ shout.user.username }}: - {{ shout.html|safe }}
- {{ shout.shout_date|elapsed }} -

- {% endfor %} -
-
- - Shout History - -
-{% if user.is_authenticated %} -
-
- -
- - -
- -
-{% else %} -

-Please log in or -register to shout. -

-{% endif %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/shoutbox/view.html --- a/gpp/templates/shoutbox/view.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,30 +0,0 @@ -{% extends 'base.html' %} -{% load bio_tags %} -{% load script_tags %} -{% block custom_css %} - - -{% endblock %} -{% block custom_js %} -{% script_tags "jquery-jeditable" %} - -{% endblock %} -{% block title %}Shout History{% endblock %} -{% block content %} -

Shout History

-{% if page.object_list %} -{% include 'core/pagination.html' %} - -
- -{% for shout in page.object_list %} -{% include "shoutbox/shout_detail.html" %} -{% endfor %} -
-
- -{% include 'core/pagination.html' %} -{% else %} -

No shouts at this time.

-{% endif %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/shoutbox/view_shout.html --- a/gpp/templates/shoutbox/view_shout.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,20 +0,0 @@ -{% extends 'base.html' %} -{% load url from future %} -{% load script_tags %} -{% block custom_css %} - -{% endblock %} -{% block custom_js %} -{% script_tags "jquery-jeditable" %} - -{% endblock %} -{% block title %}Shout #{{ shout.id }}{% endblock %} -{% block content %} - -

Shout #{{ shout.id }}

-
- -{% include "shoutbox/shout_detail.html" %} -
-
-{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/side_block.html --- a/gpp/templates/side_block.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,8 +0,0 @@ -
-
-{% block block_title %}{% endblock %} -
-
-{% block block_content %}{% endblock %} -
-
diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/smiley/smiley_farm.html --- a/gpp/templates/smiley/smiley_farm.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,5 +0,0 @@ -
-{% for s in smilies %} -{{ s.code }} -{% endfor %} -
diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/sopa.html --- a/gpp/templates/sopa.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,85 +0,0 @@ - - - -SurfGuitar101.com is offline to protest SOPA / PIPA - - - - - -
-

SurfGuitar101.com Offline to Protest SOPA / PIPA

-

Dear Friends of SurfGuitar101.com:

-

-I am joining many other websites today and closing the site to protest two pieces of legislation in the US Congress: the so-called -Stop Online Piracy Act, or SOPA in the House, and the so-called -Protect IP Act, or PIPA in the Senate. I hope to draw your attention to these acts and -urge each one of you to read up on them. Then, please contact your representatives and ask them to withdraw -their support for these bills. -

-

I too am concerned about protecting copyrights and intellectual property. But these bills have provisions in them -that go too far. They allow media companies to ask the government to remove sites from the Internet without any -due process or oversight. The burden of proving that no copyright violations are present will fall on site operators. -Major tech companies like Google, Facebook, and Twitter oppose these bills. The engineers that built the Internet have -also spoken out, pointing out that the provisions in these bills will not prevent piracy, but in fact will create -security problems and disrupt the operation of the Internet. -

-

-The Internet is quite possibly the greatest invention of my lifetime. It should be a tool for free expression, -democracy, innovation, and entrepreneurship. However, the media companies are failing to innovate and embrace -this new digital age, and instead are asking the US government to essentially let them decide what we can and cannot -view on the Internet. We cannot let the US government -join the ranks of despotic countries like China, Iran, Syria, and North Korea and censor their citizens' use of the Internet. -

-

Here are some links that I ask you to look over. They explain the issues far better than I can.

- -

-Thank you for your patience and understanding. I firmly believe that even small community websites like ours -would be threatened if bills like this were allowed to pass.
--- Brian Neal -

-
- - - - - - - -
Are you a US citizen? Contact your representatives now...
Not in the US? Petition the US State Department...
- -
- - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/add_link.html --- a/gpp/templates/weblinks/add_link.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,20 +0,0 @@ -{% extends 'weblinks/base.html' %} -{% load url from future %} -{% block title %}Web Links: Add Link{% endblock %} -{% block weblinks_content %} -

Add Link

- {% if add_form %} -
{% csrf_token %} - - {{ add_form.as_table }} - -
  -  Cancel -
-
-
- {% else %} -

Thank you for submitting a link!

-

Your link has been submitted to the site staff for review.

- {% endif %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/base.html --- a/gpp/templates/weblinks/base.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,15 +0,0 @@ -{% extends 'base.html' %} -{% load weblinks_tags %} -{% block custom_css %} - -{% block weblinks_css %}{% endblock %} -{% block weblinks_js %}{% endblock %} -{% endblock %} -{% block content %} -

Web Links

-{% include 'weblinks/navigation.html' %} - -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/index.html --- a/gpp/templates/weblinks/index.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,16 +0,0 @@ -{% extends 'weblinks/base.html' %} -{% load url from future %} -{% block title %}Web Links{% endblock %} -{% block weblinks_content %} -

Categories

- {% if categories %} -

We have {{ total_links }} links in {{ categories.count }} categories.

-
- {% for category in categories %} -
{{ category.title }} - ({{ category.count }})
-

{{ category.description }}

- {% endfor %} -
- {% endif %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/latest_tag.html --- a/gpp/templates/weblinks/latest_tag.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,12 +0,0 @@ -{% load core_tags %} -

New Links

-{% if links %} -
    - {% for link in links %} -
  1. {{ link.title }} - - {{ link.date_added|elapsed }}
  2. - {% endfor %} -
-{% else %} -

No links at this time.

-{% endif %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/link.html --- a/gpp/templates/weblinks/link.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,25 +0,0 @@ -{% load url from future %} -{% load bio_tags %} -
-

{{ link.title }}

-
-
-

{{ link.description }}

-
{% csrf_token %} - - - - - - - - - - - - -
-
-
diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/link_detail.html --- a/gpp/templates/weblinks/link_detail.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,8 +0,0 @@ -{% extends 'weblinks/base.html' %} -{% block title %}Web Links: {{ link.title }}{% endblock %} -{% block weblinks_content %} -

Link Details: {{ link.title }}

-
-{% include 'weblinks/link.html' %} -
-{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/link_summary.html --- a/gpp/templates/weblinks/link_summary.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,21 +0,0 @@ -{% extends 'weblinks/base.html' %} -{% block title %}Web Links: {{ title }}{% endblock %} -{% block weblinks_css %} - - -{% endblock %} -{% block weblinks_js %} - -{% endblock %} -{% block weblinks_content %} -

{{ title }}

-{% if page.object_list %} - {% include 'core/pagination.html' %} -
- {% for link in page.object_list %} - {% include 'weblinks/link.html' %} - {% endfor %} -
- {% include 'core/pagination.html' %} -{% endif %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/navigation.html --- a/gpp/templates/weblinks/navigation.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,17 +0,0 @@ -{% load url from future %} - - -
-
{% csrf_token %} - -
-
diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/weblinks/view_links.html --- a/gpp/templates/weblinks/view_links.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,34 +0,0 @@ -{% extends 'weblinks/base.html' %} -{% load url from future %} -{% block title %}Web Links: {{ category.title }}{% endblock %} -{% block weblinks_css %} - - -{% endblock %} -{% block weblinks_js %} - -{% endblock %} -{% block weblinks_content %} -

Category: {{ category.title }}

- -{% if page.object_list %} - - -{% include 'core/pagination.html' %} - -
-{% for link in page.object_list %} - {% include 'weblinks/link.html' %} -{% endfor %} -
- -{% include 'core/pagination.html' %} -{% endif %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/ygroup/pagination.html --- a/gpp/templates/ygroup/pagination.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,26 +0,0 @@ - -
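The weblinks and ygroup templates in this changeset page through their result sets by looping over a "page" or "page_obj" context object and delegating the prev/next links to a shared pagination include. For orientation only, here is a minimal sketch of how a Django 1.4-era view could build that context with django.core.paginator; the view name, the URL argument, and the page size of 10 are assumptions for illustration, not code taken from this repository.

    from django.core.paginator import Paginator, InvalidPage
    from django.shortcuts import render_to_response
    from django.template import RequestContext

    from weblinks.models import Link


    def link_summary(request, page_num=1):
        # Public links only, via the PublicLinkManager declared in
        # gpp/weblinks/models.py (removed later in this patch).
        links = Link.public_objects.all()

        paginator = Paginator(links, 10)    # assumed page size
        try:
            page = paginator.page(page_num)
        except InvalidPage:
            # Fall back to the last page on a bad page number.
            page = paginator.page(paginator.num_pages)

        return render_to_response('weblinks/link_summary.html',
                {'title': 'Newest Links', 'page': page},
                context_instance=RequestContext(request))

A template such as weblinks/link_summary.html then iterates over page.object_list and hands the same page object to core/pagination.html to render the page links.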
diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/ygroup/post_detail.html --- a/gpp/templates/ygroup/post_detail.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,15 +0,0 @@ -{% extends 'base.html' %} -{% load url from future %} -{% block title %}Yahoo Group Archives: {{ post.title }}{% endblock %} -{% block content %} -

Yahoo Group Archives »

-

{{ post.title }} - - permalink -

-
-
{{ post.poster }} - {{ post.creation_date|date:"d M Y H:i:s" }}
-
{{ post.msg|linebreaks }}
-
-

See this post in context.

-{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/ygroup/thread.html --- a/gpp/templates/ygroup/thread.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,32 +0,0 @@ -{% extends 'base.html' %} -{% load url from future %} -{% block title %}Yahoo Group Archives: {{ thread.title }}{% endblock %} -{% block custom_css %} - -{% endblock %} -{% block content %} - -{% if thread.page == 1 %} -

Yahoo Group Archives »

-{% else %} -

Yahoo Group Archives » - Page {{ thread.page }} »

-{% endif %} -

{{ thread.title }} - - permalink -

-{% include "ygroup/pagination.html" %} -
- {% for post in page_obj.object_list %} -
{{ post.poster }} - {{ post.creation_date|date:"d M Y H:i:s" }} - - permalink
-
- {{ post.msg|linebreaks }} -

Top

-
- {% endfor %} -
-{% include "ygroup/pagination.html" %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/templates/ygroup/thread_list.html --- a/gpp/templates/ygroup/thread_list.html Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,25 +0,0 @@ -{% extends 'base.html' %} -{% load url from future %} -{% block title %}Yahoo Group Archives{% endblock %} -{% block custom_css %} - -{% endblock %} -{% block content %} -

Yahoo Group Archives » Page {{ page_obj.number }}

-

-SurfGuitar101.com began as a Yahoo Group on October 31, 2001. It ran until August 2007, when this site officially replaced it. On these pages you'll find the archived messages of our original group. You can also search through these messages via our search page. -

-{% include "ygroup/pagination.html" %} - - - {% for thread in page_obj.object_list %} - - - - - - - {% endfor %} -
TitleAuthorPostsDate
{{ thread.title }}{{ thread.poster }}{{ thread.post_count }}{{ thread.creation_date|date:"d M Y" }}
-{% include "ygroup/pagination.html" %} -{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/urls.py --- a/gpp/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,70 +0,0 @@ -from django.conf.urls import patterns, url, include -from django.conf.urls.static import static -from django.conf import settings -from django.contrib import admin -from django.views.decorators.cache import cache_page - -from haystack.views import search_view_factory - -from news.feeds import LatestNewsFeed -from forums.feeds import ForumsFeed -from custom_search.forms import CustomModelSearchForm - - -admin.autodiscover() - -urlpatterns = patterns('', - url(r'^$', 'views.home', name='home'), - (r'^admin/doc/', include('django.contrib.admindocs.urls')), - - url(r'^admin/password_reset/$', 'django.contrib.auth.views.password_reset', name='admin_password_reset'), - (r'^admin/password_reset/done/$', 'django.contrib.auth.views.password_reset_done'), - (r'^reset/(?P[0-9A-Za-z]+)-(?P.+)/$', 'django.contrib.auth.views.password_reset_confirm'), - (r'^reset/done/$', 'django.contrib.auth.views.password_reset_complete'), - - (r'^admin/', include(admin.site.urls)), - (r'^accounts/', include('accounts.urls')), - (r'^antispam/', include('antispam.urls')), - (r'^calendar/', include('gcalendar.urls')), - (r'^comments/', include('comments.urls')), - (r'^contact/', include('contact.urls')), - (r'^contests/', include('contests.urls')), - (r'^core/', include('core.urls')), - (r'^donations/', include('donations.urls')), - (r'^downloads/', include('downloads.urls')), - url(r'^feeds/news/$', - cache_page(6 * 60 * 60)(LatestNewsFeed()), - name='feeds-news'), - url(r'^feeds/forums/$', - cache_page(5 * 60)(ForumsFeed()), - {'forum_slug': None}, - 'feeds-forum_combined'), - url(r'^feeds/forums/(?P[\w\d-]+)/$', - cache_page(5 * 60)(ForumsFeed()), - name='feeds-forum'), - (r'^forums/', include('forums.urls')), - (r'^irc/', include('irc.urls')), - (r'^links/', include('weblinks.urls')), - (r'^member_map/', include('membermap.urls')), - (r'^messages/', include('messages.urls')), - (r'^news/', include('news.urls')), - (r'^oembed/', include('oembed.urls')), - (r'^pb/', include('phantombrigade.urls')), - (r'^podcast/', include('podcast.urls')), - (r'^polls/', include('polls.urls')), - (r'^potd/', include('potd.urls')), - (r'^profile/', include('bio.urls')), - (r'^shout/', include('shoutbox.urls')), - (r'^smiley/', include('smiley.urls')), - (r'^ygroup/', include('ygroup.urls')), -) - -# Haystack search views -urlpatterns += patterns('haystack.views', - url(r'^search/$', - search_view_factory(form_class=CustomModelSearchForm, load_all=True), - name='haystack_search'), -) - -# For serving media files in development only: -urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/views.py --- a/gpp/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,15 +0,0 @@ -""" -This file contains views that don't belong to any specific application. -In particular, the home page view. -""" -from django.shortcuts import render_to_response -from django.template import RequestContext - - -def home(request): - """ - The home page view of the site. 
- """ - return render_to_response('home.html', { - }, - context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/__init__.py --- a/gpp/weblinks/__init__.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,1 +0,0 @@ -import signals diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/admin.py --- a/gpp/weblinks/admin.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,62 +0,0 @@ -"""This file contains the automatic admin site definitions for the weblinks models""" -import datetime - -from django.contrib import admin -from weblinks.models import Category -from weblinks.models import PendingLink -from weblinks.models import Link -from weblinks.models import FlaggedLink - - -class CategoryAdmin(admin.ModelAdmin): - list_display = ('title', 'slug', 'description', 'count') - prepopulated_fields = {'slug': ('title', )} - readonly_fields = ('count', ) - - -class PendingLinkAdmin(admin.ModelAdmin): - list_display = ('title', 'url', 'user', 'category', 'date_added') - raw_id_fields = ('user', ) - actions = ('approve_links', ) - readonly_fields = ('update_date', ) - - def approve_links(self, request, qs): - for pending_link in qs: - link = Link(category=pending_link.category, - title=pending_link.title, - url=pending_link.url, - description=pending_link.description, - user=pending_link.user, - date_added=datetime.datetime.now(), - hits=0, - is_public=True) - link.save() - pending_link.delete() - - count = len(qs) - msg = "1 link" if count == 1 else "%d links" % count - self.message_user(request, "%s approved." % msg) - - approve_links.short_description = "Approve selected links" - - -class LinkAdmin(admin.ModelAdmin): - list_display = ('title', 'url', 'category', 'date_added', 'hits', 'is_public') - list_filter = ('date_added', 'is_public', 'category') - date_hierarchy = 'date_added' - ordering = ('-date_added', ) - search_fields = ('title', 'description', 'url', 'user__username') - raw_id_fields = ('user', ) - readonly_fields = ('update_date', ) - save_on_top = True - - -class FlaggedLinkAdmin(admin.ModelAdmin): - list_display = ('__unicode__', 'url', 'get_link_url', 'user', 'date_flagged') - date_hierarchy = 'date_flagged' - raw_id_fields = ('user', ) - -admin.site.register(Category, CategoryAdmin) -admin.site.register(PendingLink, PendingLinkAdmin) -admin.site.register(Link, LinkAdmin) -admin.site.register(FlaggedLink, FlaggedLinkAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/fixtures/weblinks_categories.json --- a/gpp/weblinks/fixtures/weblinks_categories.json Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,112 +0,0 @@ -[ - { - "pk": 1, - "model": "weblinks.category", - "fields": { - "count": 215, - "description": "", - "slug": "bands", - "title": "Bands" - } - }, - { - "pk": 5, - "model": "weblinks.category", - "fields": { - "count": 21, - "description": "", - "slug": "fan-sites", - "title": "Fan Sites" - } - }, - { - "pk": 4, - "model": "weblinks.category", - "fields": { - "count": 28, - "description": "", - "slug": "gear", - "title": "Gear" - } - }, - { - "pk": 2, - "model": "weblinks.category", - "fields": { - "count": 7, - "description": "", - "slug": "music-merchants", - "title": "Music Merchants" - } - }, - { - "pk": 8, - "model": "weblinks.category", - "fields": { - "count": 6, - "description": "", - "slug": "other", - "title": "Other" - } - }, - { - "pk": 11, - "model": "weblinks.category", - "fields": { - "count": 17, - "description": "Do you 
have a photo gallery of surf bands somewhere on the web? Why not add a link to it here?", - "slug": "photo-galleries", - "title": "Photo Galleries" - } - }, - { - "pk": 10, - "model": "weblinks.category", - "fields": { - "count": 4, - "description": "", - "slug": "podcasts", - "title": "Podcasts" - } - }, - { - "pk": 6, - "model": "weblinks.category", - "fields": { - "count": 8, - "description": "", - "slug": "radio", - "title": "Radio" - } - }, - { - "pk": 3, - "model": "weblinks.category", - "fields": { - "count": 13, - "description": "", - "slug": "record-labels", - "title": "Record Labels" - } - }, - { - "pk": 7, - "model": "weblinks.category", - "fields": { - "count": 4, - "description": "", - "slug": "tablature", - "title": "Tablature" - } - }, - { - "pk": 9, - "model": "weblinks.category", - "fields": { - "count": 31, - "description": "Links to surf videos on the web", - "slug": "videos", - "title": "Videos" - } - } -] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/forms.py --- a/gpp/weblinks/forms.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,23 +0,0 @@ -""" -Forms for the weblinks application. -""" - -from django import forms -from weblinks.models import PendingLink, Link - - -class AddLinkForm(forms.ModelForm): - title = forms.CharField(widget = forms.TextInput(attrs = {'size': 52})) - url = forms.CharField(widget = forms.TextInput(attrs = {'size': 52})) - - def clean_url(self): - new_url = self.cleaned_data['url'] - try: - Link.objects.get(url__iexact = new_url) - except Link.DoesNotExist: - return new_url - raise forms.ValidationError('That link already exists in our database.') - - class Meta: - model = PendingLink - exclude = ('user', 'date_added', 'update_date') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/models.py --- a/gpp/weblinks/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,139 +0,0 @@ -""" -This module contains the models for the weblinks application. 
-""" -import datetime - -from django.db import models -from django.contrib.auth.models import User - - -class Category(models.Model): - """Links belong to categories""" - title = models.CharField(max_length=64) - slug = models.SlugField(max_length=64) - description = models.TextField(blank=True) - count = models.IntegerField(default=0) - - def __unicode__(self): - return self.title - - class Meta: - verbose_name_plural = 'Categories' - ordering = ('title', ) - - -class PublicLinkManager(models.Manager): - """The manager for all public links.""" - def get_query_set(self): - return super(PublicLinkManager, self).get_query_set().filter( - is_public=True).select_related() - - -class LinkBase(models.Model): - """Abstract model to aggregate common fields of a web link.""" - category = models.ForeignKey(Category) - title = models.CharField(max_length=128) - url = models.URLField(db_index=True) - description = models.TextField(blank=True) - user = models.ForeignKey(User) - date_added = models.DateTimeField(db_index=True) - update_date = models.DateTimeField(db_index=True, blank=True) - - class Meta: - abstract = True - - -class Link(LinkBase): - """Model to represent a web link""" - hits = models.IntegerField(default=0) - is_public = models.BooleanField(default=False, db_index=True) - - # Managers: - objects = models.Manager() - public_objects = PublicLinkManager() - - class Meta: - ordering = ('title', ) - - def __unicode__(self): - return self.title - - def save(self, *args, **kwargs): - if not self.pk: - if not self.date_added: - self.date_added = datetime.datetime.now() - self.update_date = self.date_added - else: - self.update_date = datetime.datetime.now() - - super(Link, self).save(*args, **kwargs) - - @models.permalink - def get_absolute_url(self): - return ('weblinks-link_detail', [str(self.id)]) - - def search_title(self): - return self.title - - def search_summary(self): - return self.description - - -class PendingLink(LinkBase): - """This model represents links that users submit. They must be approved by - an admin before they become visible on the site. 
- """ - class Meta: - ordering = ('date_added', ) - - def __unicode__(self): - return self.title - - def save(self, *args, **kwargs): - if not self.pk: - self.date_added = datetime.datetime.now() - self.update_date = self.date_added - else: - self.update_date = datetime.datetime.now() - - super(PendingLink, self).save(*args, **kwargs) - - -class FlaggedLinkManager(models.Manager): - - def create(self, link, user): - flagged_link = FlaggedLink(link = link, user = user, approved = False) - flagged_link.save() - - -class FlaggedLink(models.Model): - """Model to represent links that have been flagged as broken by users""" - link = models.ForeignKey(Link) - user = models.ForeignKey(User) - date_flagged = models.DateField(auto_now_add = True) - approved = models.BooleanField(default = False, - help_text = 'Check this and save to remove the referenced link from the database') - - objects = FlaggedLinkManager() - - def save(self, *args, **kwargs): - if self.approved: - self.link.delete() - self.delete() - else: - super(FlaggedLink, self).save(*args, **kwargs) - - def url(self): - return self.link.url - - def get_link_url(self): - return 'Link #%d' % (self.link.get_absolute_url(), - self.link.id) - get_link_url.allow_tags = True - get_link_url.short_description = "View Link on Site" - - def __unicode__(self): - return self.link.title - - class Meta: - ordering = ('-date_flagged', ) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/search_indexes.py --- a/gpp/weblinks/search_indexes.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,23 +0,0 @@ -"""Haystack search index for the weblinks application.""" -from haystack.indexes import * -from haystack import site -from custom_search.indexes import CondQueuedSearchIndex - -from weblinks.models import Link - - -class LinkIndex(CondQueuedSearchIndex): - text = CharField(document=True, use_template=True) - author = CharField(model_attr='user') - pub_date = DateTimeField(model_attr='date_added') - - def index_queryset(self): - return Link.public_objects.all() - - def get_updated_field(self): - return 'update_date' - - def can_index(self, instance): - return instance.is_public - -site.register(Link, LinkIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/signals.py --- a/gpp/weblinks/signals.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,41 +0,0 @@ -"""Signals for the weblinks application. -We use signals to compute the denormalized category counts whenever a weblink -is saved.""" -from django.db.models.signals import post_save -from django.db.models.signals import post_delete - -from weblinks.models import Category, Link - - -def on_link_save(sender, **kwargs): - """This function updates the count field for all categories. - It is called whenever a link is saved via a signal. - """ - if kwargs['created']: - # we only have to update the parent category - link = kwargs['instance'] - cat = link.category - cat.count = Link.public_objects.filter(category=cat).count() - cat.save() - else: - # update all categories just to be safe (an existing link could - # have been moved from one category to another - cats = Category.objects.all() - for cat in cats: - cat.count = Link.public_objects.filter(category=cat).count() - cat.save() - - -def on_link_delete(sender, **kwargs): - """This function updates the count field for the link's parent - category. It is called when a link is deleted via a signal. 
- """ - # update the parent category - link = kwargs['instance'] - cat = link.category - cat.count = Link.public_objects.filter(category=cat).count() - cat.save() - - -post_save.connect(on_link_save, sender=Link, dispatch_uid='weblinks.signals') -post_delete.connect(on_link_delete, sender=Link, dispatch_uid='weblinks.signals') diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/static/css/weblinks.css --- a/gpp/weblinks/static/css/weblinks.css Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,14 +0,0 @@ -div.weblinks-link-sort { - padding-bottom: .5em; -} - -ul.weblinks-link-options { - margin: 0; - padding-left: 0; - list-style-type: none; -} - -ul.weblinks-link-options li { - display: inline; - padding: 0 5px; -} diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/static/js/weblinks.js --- a/gpp/weblinks/static/js/weblinks.js Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,24 +0,0 @@ -$(document).ready(function() { - $('a.weblinks-broken').click(function () { - var id = this.id; - if (id.match(/^link-(\d+)$/)) { - id = RegExp.$1; - if (confirm('Do you really want to report this link as broken? ' + - 'This will notify the site staff that the link is dead and that ' + - 'it may need to be deleted or revised.')) { - $.ajax({ - url: '/links/report/' + id + '/', - type: 'POST', - dataType: 'text', - success: function (response, textStatus) { - alert(response); - }, - error: function (xhr, textStatus, ex) { - alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText); - } - }); - } - } - return false; - }); -}); diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/templatetags/weblinks_tags.py --- a/gpp/weblinks/templatetags/weblinks_tags.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,17 +0,0 @@ -""" -Template tags for the weblinks application. -""" -from django import template - -from weblinks.models import Link - - -register = template.Library() - - -@register.inclusion_tag('weblinks/latest_tag.html') -def latest_weblinks(): - links = Link.public_objects.order_by('-date_added')[:10] - return { - 'links': links, - } diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/urls.py --- a/gpp/weblinks/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,19 +0,0 @@ -"""urls for the weblinks application""" -from django.conf.urls import patterns, url - -urlpatterns = patterns('weblinks.views', - url(r'^$', 'link_index', name='weblinks-main'), - url(r'^add/$', 'add_link', name='weblinks-add_link'), - url(r'^add/thanks/$', 'add_thanks', name='weblinks-add_thanks'), - url(r'^category/(?P[\w\d-]+)/(?Ptitle|date|rating|hits)/$', - 'view_links', - name='weblinks-view_links'), - url(r'^detail/(\d+)/$', - 'link_detail', - name='weblinks-link_detail'), - url(r'^new/$', 'new_links', name='weblinks-new_links'), - url(r'^popular/$', 'popular_links', name='weblinks-popular_links'), - url(r'^random/$', 'random_link', name='weblinks-random_link'), - url(r'^report/(\d+)/$', 'report_link', name='weblinks-report_link'), - url(r'^visit/(\d+)/$', 'visit', name="weblinks-visit"), -) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/weblinks/views.py --- a/gpp/weblinks/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,199 +0,0 @@ -""" -Views for the weblinks application. 
-""" - -import datetime -import random -from django.shortcuts import render_to_response -from django.template import RequestContext -from django.core.paginator import InvalidPage -from django.http import HttpResponse -from django.http import HttpResponseBadRequest -from django.http import HttpResponseRedirect -from django.contrib.auth.decorators import login_required -from django.shortcuts import get_object_or_404 -from django.core.urlresolvers import reverse -from django.db.models import Q -from django.http import Http404 -from django.views.decorators.http import require_POST - -from core.paginator import DiggPaginator -from core.functions import email_admins -from core.functions import get_page -from weblinks.models import Category -from weblinks.models import Link -from weblinks.models import FlaggedLink -from weblinks.forms import AddLinkForm - -####################################################################### - -LINKS_PER_PAGE = 10 - -def create_paginator(links): - return DiggPaginator(links, LINKS_PER_PAGE, body=5, tail=3, margin=3, padding=2) - -####################################################################### - -def link_index(request): - categories = Category.objects.all() - total_links = Link.public_objects.all().count() - return render_to_response('weblinks/index.html', { - 'categories': categories, - 'total_links': total_links, - }, - context_instance = RequestContext(request)) - -####################################################################### - -def new_links(request): - links = Link.public_objects.order_by('-date_added') - paginator = create_paginator(links) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - return render_to_response('weblinks/link_summary.html', { - 'page': the_page, - 'title': 'Newest Links', - }, - context_instance = RequestContext(request)) - -####################################################################### - -def popular_links(request): - links = Link.public_objects.order_by('-hits') - paginator = create_paginator(links) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - return render_to_response('weblinks/link_summary.html', { - 'page': the_page, - 'title': 'Popular Links', - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def add_link(request): - if request.method == 'POST': - add_form = AddLinkForm(request.POST) - if add_form.is_valid(): - new_link = add_form.save(commit=False) - new_link.user = request.user - new_link.save() - email_admins('New link for approval', """Hello, - -A user has added a new link for your approval. 
-""") - return HttpResponseRedirect(reverse('weblinks-add_thanks')) - else: - add_form = AddLinkForm() - - return render_to_response('weblinks/add_link.html', { - 'add_form': add_form, - }, - context_instance = RequestContext(request)) - -####################################################################### - -@login_required -def add_thanks(request): - return render_to_response('weblinks/add_link.html', { - }, - context_instance = RequestContext(request)) - -####################################################################### - -# Maps URL component to database field name for the links table: - -LINK_FIELD_MAP = { - 'title': 'title', - 'date': '-date_added', - 'hits': '-hits' -} - -def view_links(request, slug, sort='title'): - try: - cat = Category.objects.get(slug=slug) - except Category.DoesNotExist: - raise Http404 - - if sort in LINK_FIELD_MAP: - order_by = LINK_FIELD_MAP[sort] - else: - sort = 'title' - order_by = LINK_FIELD_MAP['title'] - - links = Link.public_objects.filter(category=cat).order_by(order_by) - paginator = create_paginator(links) - page = get_page(request.GET) - try: - the_page = paginator.page(page) - except InvalidPage: - raise Http404 - - return render_to_response('weblinks/view_links.html', { - 's' : sort, - 'category' : cat, - 'page' : the_page, - }, - context_instance = RequestContext(request)) - -####################################################################### - -def _visit_link(request, link): - link.hits += 1 - link.save() - return HttpResponseRedirect(link.url) - -####################################################################### - -@require_POST -def visit(request, link_id): - link = get_object_or_404(Link, pk = link_id) - return _visit_link(request, link) - -####################################################################### - -@require_POST -def random_link(request): - ids = Link.public_objects.values_list('id', flat=True) - if not ids: - raise Http404 - id = random.choice(ids) - random_link = Link.public_objects.get(pk=id) - return _visit_link(request, random_link) - -####################################################################### - -@require_POST -def report_link(request, link_id): - """ - This function is the target of an AJAX POST to report a link as dead. - """ - if not request.user.is_authenticated(): - return HttpResponse('Please login or register to report a broken link.') - - try: - link = Link.objects.get(pk=link_id) - except Link.DoesNotExist: - return HttpResponseBadRequest("That link doesn't exist.") - - FlaggedLink.objects.create(link, request.user) - return HttpResponse("The link was reported. A moderator will review the " \ - "link shortly. Thanks for helping to improve the content on " \ - "this site.") - -####################################################################### - -def link_detail(request, id): - link = get_object_or_404(Link, pk=id) - return render_to_response('weblinks/link_detail.html', { - 'link': link, - }, - context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/management/commands/sync_ygroup_posts.py --- a/gpp/ygroup/management/commands/sync_ygroup_posts.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,53 +0,0 @@ -""" -sync_ygroup_posts.py - A management command to synchronize the yahoo group -archives by recomputing the de-normalized fields in the post objects. 
- -""" -import optparse - -from django.core.management.base import NoArgsCommand, CommandError -from django.core.urlresolvers import reverse - -from ygroup.models import Thread, Post -import ygroup.views - - -class Command(NoArgsCommand): - help = """\ -This command synchronizes the ygroup application's post objects -by updating their de-normalized fields. -""" - option_list = NoArgsCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 100 posts to show progress"), - ) - - def handle_noargs(self, **opts): - - show_progress = opts.get('progress', False) or False - - threads = {} - self.stdout.write("Processing threads...\n") - for thread in Thread.objects.iterator(): - threads[thread.id] = [reverse('ygroup-thread_view', args=[thread.id]), - list(Post.objects.filter(thread=thread).values_list('id', flat=True))] - - self.stdout.write("Processing posts...\n") - n = 0 - for post in Post.objects.iterator(): - thread = threads[post.thread.id] - pos = thread[1].index(post.id) - page = pos / ygroup.views.POSTS_PER_PAGE + 1 - if page == 1: - post.thread_url = thread[0] + '#p%d' % (post.id, ) - else: - post.thread_url = thread[0] + '?page=%d#p%d' % (page, post.id) - post.save() - - n += 1 - if show_progress and n % 100 == 0: - self.stdout.write('.') - self.stdout.flush() - - self.stdout.write('\n') - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/management/commands/sync_ygroup_threads.py --- a/gpp/ygroup/management/commands/sync_ygroup_threads.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,39 +0,0 @@ -""" -sync_ygroup_threads.py - A management command to synchronize the yahoo group -archives by recomputing the de-normalized fields in the thread objects. - -""" -import optparse - -from django.core.management.base import NoArgsCommand, CommandError - -from ygroup.models import Thread, Post -import ygroup.views - - -class Command(NoArgsCommand): - help = """\ -This command synchronizes the ygroup application's thread objects -by updating their de-normalized fields. -""" - option_list = NoArgsCommand.option_list + ( - optparse.make_option("-p", "--progress", action="store_true", - help="Output a . after every 50 threads to show progress"), - ) - - def handle_noargs(self, **opts): - - show_progress = opts.get('progress', False) or False - - n = 0 - for thread in Thread.objects.iterator(): - thread.post_count = Post.objects.filter(thread=thread).count() - thread.page = n / ygroup.views.THREADS_PER_PAGE + 1 - thread.save() - n += 1 - if n % 50 == 0: - self.stdout.write('.') - self.stdout.flush() - - self.stdout.write('\n') - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/models.py --- a/gpp/ygroup/models.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,55 +0,0 @@ -""" -Models for the ygroup application, which is a read-only archive of messages -from the old Yahoo Group. 
-""" -from django.db import models - - -class Thread(models.Model): - title = models.CharField(max_length=255) - creation_date = models.DateTimeField() - - # denormalized fields to reduce database hits - poster = models.CharField(max_length=128) - post_count = models.IntegerField(blank=True, default=0) - page = models.IntegerField(blank=True, default=1) - - class Meta: - ordering = ('creation_date', ) - - def __unicode__(self): - return u'Thread %d, %s' % (self.pk, self.title) - - @models.permalink - def get_absolute_url(self): - return ('ygroup-thread_view', [self.id]) - - -class Post(models.Model): - thread = models.ForeignKey(Thread, null=True, blank=True, - on_delete=models.SET_NULL, related_name='posts') - title = models.CharField(max_length=255) - creation_date = models.DateTimeField() - poster = models.CharField(max_length=128) - msg = models.TextField() - - # precomputed URL to this post in the parent thread for efficiency - thread_url = models.URLField(blank=True) - - class Meta: - ordering = ('creation_date', ) - verbose_name = 'yahoo group post' - verbose_name_plural = 'yahoo group posts' - - def __unicode__(self): - return u'Post %d, %s' % (self.pk, self.title) - - @models.permalink - def get_absolute_url(self): - return ('ygroup-post_view', [], {'pk': self.id}) - - def search_title(self): - return self.title - - def search_summary(self): - return self.msg diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/search_indexes.py --- a/gpp/ygroup/search_indexes.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,20 +0,0 @@ -""" -Haystack search index for the Yahoo Group archives application. - -""" -from haystack.indexes import * -from haystack import site -from custom_search.indexes import CondQueuedSearchIndex - -from ygroup.models import Post - - -class PostIndex(CondQueuedSearchIndex): - text = CharField(document=True, use_template=True) - pub_date = DateTimeField(model_attr='creation_date') - - def get_updated_field(self): - return 'creation_date' - - -site.register(Post, PostIndex) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/tests.py --- a/gpp/ygroup/tests.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,16 +0,0 @@ -""" -This file demonstrates writing tests using the unittest module. These will pass -when you run "manage.py test". - -Replace this with more appropriate tests for your application. -""" - -from django.test import TestCase - - -class SimpleTest(TestCase): - def test_basic_addition(self): - """ - Tests that 1 + 1 always equals 2. - """ - self.assertEqual(1 + 1, 2) diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/urls.py --- a/gpp/ygroup/urls.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,23 +0,0 @@ -""" -urls.py - URLs for the ygroup application. 
- -""" -from django.conf.urls import patterns, url -from django.views.generic import DetailView - -from ygroup.models import Post -from ygroup.views import ThreadIndexView, ThreadView - - -urlpatterns = patterns('', - url(r'^threads/$', - ThreadIndexView.as_view(), - name='ygroup-thread_index'), - url(r'^thread/(\d+)/$', - ThreadView.as_view(), - name='ygroup-thread_view'), - url(r'^post/(?P\d+)/$', - DetailView.as_view(model=Post, context_object_name='post'), - name='ygroup-post_view'), -) - diff -r c525f3e0b5d0 -r ee87ea74d46b gpp/ygroup/views.py --- a/gpp/ygroup/views.py Sat May 05 15:08:07 2012 -0500 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,55 +0,0 @@ -""" -Views for the ygroup (Yahoo Group Archive) application. - -""" -from django.shortcuts import get_object_or_404 -from django.views.generic import ListView - -from ygroup.models import Thread, Post -from core.paginator import DiggPaginator - - -THREADS_PER_PAGE = 40 -POSTS_PER_PAGE = 20 - - -class ThreadIndexView(ListView): - """ - This generic view displays the list of threads available. - - """ - model = Thread - paginate_by = THREADS_PER_PAGE - - def get_paginator(self, queryset, per_page, **kwargs): - """ - Return an instance of the paginator for this view. - """ - return DiggPaginator(queryset, per_page, body=5, tail=2, - margin=3, padding=2, **kwargs) - - -class ThreadView(ListView): - """ - This generic view displays the posts in a thread. - - """ - context_object_name = "post_list" - template_name = "ygroup/thread.html" - paginate_by = POSTS_PER_PAGE - - def get_queryset(self): - self.thread = get_object_or_404(Thread, pk=self.args[0]) - return Post.objects.filter(thread=self.thread) - - def get_context_data(self, **kwargs): - context = super(ThreadView, self).get_context_data(**kwargs) - context['thread'] = self.thread - return context - - def get_paginator(self, queryset, per_page, **kwargs): - """ - Return an instance of the paginator for this view. - """ - return DiggPaginator(queryset, per_page, body=5, tail=2, - margin=3, padding=2, **kwargs) diff -r c525f3e0b5d0 -r ee87ea74d46b irc/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/irc/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +"""Models for the IRC application. +The IRC application simply reports who is in the site's IRC chatroom. A bot in the channel updates +the table and we read it. +""" +from django.db import models + +class IrcChannel(models.Model): + name = models.CharField(max_length=30) + last_update = models.DateTimeField() + + def __unicode__(self): + return self.name + + class Meta: + ordering = ('name', ) diff -r c525f3e0b5d0 -r ee87ea74d46b irc/templatetags/irc_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/irc/templatetags/irc_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,14 @@ +""" +Template tags for the IRC application. 
+""" +from django import template +from irc.models import IrcChannel + +register = template.Library() + +@register.inclusion_tag('irc/irc_block.html') +def irc_status(): + nicks = IrcChannel.objects.all() + return { + 'nicks': nicks, + } diff -r c525f3e0b5d0 -r ee87ea74d46b irc/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/irc/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,6 @@ +"""urls for the IRC application""" +from django.conf.urls import patterns, url + +urlpatterns = patterns('irc.views', + url(r'^$', 'view', name='irc-main'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b irc/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/irc/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,12 @@ +"""views for the IRC application""" + +from django.shortcuts import render_to_response +from django.template import RequestContext + +from irc.models import IrcChannel + +def view(request): + nicks = IrcChannel.objects.all() + return render_to_response('irc/view.html', + {'nicks': nicks}, + context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/data.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/data.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,31 @@ +""" +Misc data for the legacy management commands. + +""" + +# Over time various users asked me to change their username. The legacy site +# rarely stored foreign keys to users; instead it stored the name of the user +# at the time. This dictionary contains mappings from old usernames to new +# usernames. + +KNOWN_USERNAME_CHANGES = { + 'cavefishbutchdelux': 'butchdelux', + 'findicator1': 'WaveOhhh', + 'tikimania': 'Tikitena', + 'sandyfeet': 'RickRhoades', + 'crumb': 'crumble', + 'allenbridgewater': 'Outerwave_Allen', + 'reddtyde': 'Redd_Tyde', + 'fendershowman63': 'Abe', + 'hearteater': 'JoshHeartless', + 'surfdaddy': 'zzero', + 'frisbie': 'zzero', + 'retroactivegammarays': 'Retroactive_Taj', + 'mrrebel': 'Eddie_Bertrand', + 'doublecoil': 'Showman', + 'tsunami_tom': 'TomH', + 'davidj': 'davidphantomatic', + 'svd': 'Bilge_Rat', + 'dave_ledude': 'DaveF', +} + diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/html2md.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/html2md.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,291 @@ +""" +This module contains a class derived from Python's HTMLParser to convert HTML to +Markdown. Currently this class only supports those HTML tags that have counter- +parts in BBCode used by stock phpBB 2.x. + +In other words, this class was created to help convert data from a phpBB +forum to Markdown syntax and its scope is currently limited to that task. + +""" +from HTMLParser import HTMLParser +import htmlentitydefs + + +# Let's call Markdown markup entities "elements" to avoid confusion +# with HTML tags. + +class ElementBase(object): + """ + Base class for all Markdown elements. + + """ + def __init__(self, attrs=None): + self.data = u'' + self.attrs = dict(attrs) if attrs else {} + + def add_data(self, data): + self.data += data + + def markdown(self): + return self.data + + +class TextElement(ElementBase): + """ + TextElements represent text fragments not inside HTML tags. + """ + pass + + +class EmphasisElement(ElementBase): + """ + An EmphasisElement is a Markdown element used to indicate emphasis and is + represented by placing characters around text. E.g. 
+    _em_, **bold**
+
+    """
+    def __init__(self, tag, attrs):
+        super(EmphasisElement, self).__init__(attrs)
+        self.tag = tag
+
+    def markdown(self):
+        return u'%s%s%s' % (self.tag, self.data, self.tag)
+
+
+def create_emphasis(tag):
+    """
+    Returns a function that creates an EmphasisElement using the supplied
+    tag.
+
+    """
+    def inner(attrs):
+        return EmphasisElement(tag, attrs)
+    return inner
+
+
+class HtmlElement(ElementBase):
+    """
+    Markdown also accepts HTML markup. This element represents a HTML tag that
+    maps to itself in Markdown.
+
+    """
+    def __init__(self, tag, attrs):
+        super(HtmlElement, self).__init__(attrs)
+        self.tag = tag
+
+    def markdown(self):
+        return u'<%s>%s</%s>' % (self.tag, self.data, self.tag)
+
+
+def create_html(tag):
+    """
+    Returns a function that creates a HtmlElement using the supplied tag.
+
+    """
+    def inner(attrs):
+        return HtmlElement(tag, attrs)
+    return inner
+
+
+class QuoteElement(ElementBase):
+    """
+    Class to represent a blockquote in Markdown.
+
+    """
+    def markdown(self):
+        return u'> %s\n\n' % self.data.replace('\n', '\n> ')
+
+
+class BreakElement(ElementBase):
+    """
+    Class to represent a linebreak in Markdown.
+
+    """
+    def markdown(self):
+        return u'  \n'
+
+
+class DivElement(ElementBase):
+    """
+    This class maps a HTML <div>
+    into a block of text surrounded by newlines.
+
+    """
+    def markdown(self):
+        return u'\n%s\n' % self.data
+
+
+class LinkElement(ElementBase):
+    """
+    This class maps HTML <a> tags into Markdown links.
+    If no data is present, the actual href is used for the link text.
+
+    """
+    def markdown(self):
+        try:
+            url = self.attrs['href']
+        except KeyError:
+            return self.data if self.data else u''
+
+        text = self.data if self.data else url
+        return u'[%s](%s)' % (text, url)
+
+
+class ImageElement(ElementBase):
+    """
+    This class maps HTML <img> tags into Markdown.
+    This element assumes no alt text is present, and simply uses the word
+    'image' for the alt text.
+
+    """
+    def markdown(self):
+        try:
+            url = self.attrs['src']
+        except KeyError:
+            return u' (missing image) '
+        return u'![image](%s)' % url
+
+
+class CodeElement(ElementBase):
+    """
+    This class is used to create code blocks in Markdown.
+
+    """
+    def markdown(self):
+        return u'    %s\n' % self.data.replace('\n', '\n    ')
+
+
+# List (ordered & unordered) support:
+
+class ListElement(ElementBase):
+    """
+    This class creates Markdown for unordered lists. The bullet() method can be
+    overridden to create ordered lists.
+
+    """
+    def __init__(self, attrs=None):
+        super(ListElement, self).__init__(attrs)
+        self.items = []
+        self.list_nesting = 1
+
+    def add_data(self, data):
+        self.items.append(data)
+
+    def bullet(self):
+        return u'*'
+
+    def markdown(self):
+        bullet_str = self.bullet()
+        indent = u' ' * (4 * (self.list_nesting - 1))
+        s = u''
+        for item in self.items:
+            s += u'\n%s%s %s' % (indent, bullet_str, item)
+        return s
+
+
+class OrderedListElement(ListElement):
+    """
+    This class creates Markdown for ordered lists.
+
+    """
+    def bullet(self):
+        return '1.'
+
+
+class ItemElement(ElementBase):
+    """
+    This element is used to represent ordered & unordered list items.
+
+    """
+    pass
+
+###############################################################################
+###############################################################################
+
+class MarkdownWriter(HTMLParser):
+    """
+    This class is an HTMLParser that converts a subset of HTML to Markdown.
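+
+    A minimal usage sketch (the sample HTML below is illustrative only):
+
+        writer = MarkdownWriter()
+        writer.feed(u'<strong>Hello</strong> <a href="http://example.com/">link</a>')
+        print writer.markdown()   # **Hello** [link](http://example.com/)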
+ + """ + + elem_factories = { + 'a': LinkElement, + 'blockquote': QuoteElement, + 'br': BreakElement, + 'div': DivElement, + 'em': create_emphasis('_'), + 'img': ImageElement, + 'li': ItemElement, + 'ol': OrderedListElement, + 'pre': CodeElement, + 's': create_html('strike'), + 'strong': create_emphasis('**'), + 'u': create_html('u'), + 'ul': ListElement, + } + + def __init__(self): + HTMLParser.__init__(self) + self.reset() + + def handle_starttag(self, tag, attrs): + if tag in self.elem_factories: + factory = self.elem_factories[tag] + element = factory(attrs) + else: + element = TextElement() + + self._push_elem(element) + + def handle_endtag(self, tag): + self._pop_elem() + + def handle_data(self, data): + if len(self.elem_stack) == 0: + self._push_elem(TextElement()) + self._add_data(data) + + def handle_entityref(self, name): + try: + text = unichr(htmlentitydefs.name2codepoint[name]) + except KeyError: + text = name + self.handle_data(text) + + def handle_charref(self, name): + self.handle_data(unichr(int(name))) + + def reset(self): + HTMLParser.reset(self) + self.elem_stack = [] + self.elements = [] + self.list_nesting = 0 + + def _push_elem(self, tag): + if len(self.elem_stack) and isinstance(self.elem_stack[-1], TextElement): + self._pop_elem() + if isinstance(tag, ListElement): + self.list_nesting += 1 + tag.list_nesting = self.list_nesting + self.elem_stack.append(tag) + + def _pop_elem(self): + try: + element = self.elem_stack.pop() + except IndexError: + # pop from empty list => bad HTML input; ignore it + return + + if isinstance(element, ListElement): + self.list_nesting -= 1 + if len(self.elem_stack): + self.elem_stack[-1].add_data(element.markdown()) + else: + self.elements.append(element) + + def _add_data(self, data): + self.elem_stack[-1].add_data(data) + + def markdown(self): + while len(self.elem_stack): + self._pop_elem() + text_list = [e.markdown() for e in self.elements] + return u''.join(text_list) diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/fix_potd_smiles.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/fix_potd_smiles.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,44 @@ +""" +This command fixes the old 1.0 smiley system to match the new scheme. + +""" +from django.core.management.base import NoArgsCommand +from comments.models import Comment + + +SMILEY_MAP = { + ':confused:': ':?', + ':upset:': ':argh:', + ':eek:': ':shock:', + ':rolleyes:': ':whatever:', + ':mad:': 'X-(', + ':shy:': ':oops:', + ':laugh:': ':lol:', + ':dead:': 'x_x', + ':cry:': ':-(', + ';)': ':wink:', + ':|': ':-|', + ';-)': ':wink:', + ':D': ':-D', + ':P': ':-P', + 'B)': '8)', + ':(': ':-(', + ':)': ':-)', +} + + +class Command(NoArgsCommand): + + def handle_noargs(self, **opts): + + comments = Comment.objects.filter(id__gt=3000) + for comment in comments: + save = False + for key, val in SMILEY_MAP.items(): + if key in comment.comment: + comment.comment = comment.comment.replace(key, val) + save = True + + if save: + comment.save() + diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_download_comments.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_download_comments.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,91 @@ +""" +import_old_download_comments.py - For importing download comments from SG101 1.0 +as csv files. 
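+
+A typical invocation might look like the following (the CSV file name here is
+only an illustration):
+
+    python manage.py import_old_download_comments download_comments.csv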
+ +""" +from __future__ import with_statement +import csv +from datetime import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User +from django.contrib.contenttypes.models import ContentType + +from downloads.models import Download, VoteRecord +from comments.models import Comment +from legacy.html2md import MarkdownWriter +import legacy.data + + +class Command(LabelCommand): + args = '' + help = 'Imports download comments from the old database in CSV format' + md_writer = MarkdownWriter() + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new object and saving it to the database. + + """ + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + try: + for row in self.reader: + self.process_row(row) + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create an object for the row + and save it in the database. + + """ + dl_id = int(row['ratinglid']) + if dl_id in (1, 2, 3, 4): + return + + try: + dl = Download.objects.get(pk=dl_id) + except Download.DoesNotExist: + return + + try: + user = User.objects.get(username=row['ratinguser']) + except User.DoesNotExist: + old_name = row['ratinguser'].lower() + try: + user = User.objects.get( + username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) + except (User.DoesNotExist, KeyError): + return + + vote_date = datetime.strptime(row['ratingtimestamp'], "%Y-%m-%d %H:%M:%S") + + comment_text = row['ratingcomments'].decode('latin-1').strip() + if comment_text: + comment = Comment( + content_type=ContentType.objects.get_for_model(dl), + object_id=dl.id, + user=user, + comment=comment_text, + creation_date=vote_date, + ip_address = row['ratinghostname'], + is_public = True, + is_removed = False, + ) + comment.save() + + vr = VoteRecord(download=dl, user=user, vote_date=vote_date) + vr.save() + + def to_markdown(self, s): + self.md_writer.reset() + self.md_writer.feed(s) + return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_downloads.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_downloads.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,133 @@ +""" +import_old_downloads.py - For importing downloads from SG101 1.0 as csv files. 
+""" +from __future__ import with_statement +import csv +import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User + +from downloads.models import Download, Category +from legacy.html2md import MarkdownWriter + + +# downloads with these lid's will be excluded +EXCLUDE_SET = set([1, 2, 3, 4, 277]) + +# Mapping of old category IDs to new; None means we don't plan on importing +CAT_MAP = { + 4: None, # Misc + 3: None, # Music + 1: None, # Demos + 6: 2, # Gear Samples + 8: 4, # Ringtones + 9: 8, # Tablature + 10: 6, # Interviews + 11: None, # 2008 MP3 Comp + 12: 1, # Backing Tracks + 13: None, # 2009 MP3 Comp +} + +SG101_PREFIX = 'http://surfguitar101.com/' + + +class Command(LabelCommand): + args = '' + help = 'Imports downloads from the old database in CSV format' + md_writer = MarkdownWriter() + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new object and saving it to the database. + + """ + self.cats = {} + try: + self.default_user = User.objects.get(pk=2) + except User.DoesNotExist: + raise CommandError("Need a default user with pk=2") + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + try: + for row in self.reader: + self.process_row(row) + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def get_category(self, old_cat_id): + """ + Return the Category object for the row. + + """ + cat_id = CAT_MAP[old_cat_id] + if cat_id not in self.cats: + try: + cat = Category.objects.get(pk=cat_id) + except Category.DoesNotExist: + raise CommandError("Category does not exist: %s on line %s" % ( + cat_id, self.reader.line_num)) + else: + self.cats[cat_id] = cat + return self.cats[cat_id] + + def get_user(self, username): + """ + Return the user object for the given username. + If the user cannot be found, self.default_user is returned. + + """ + try: + return User.objects.get(username=username) + except User.DoesNotExist: + return self.default_user + + def process_row(self, row): + """ + Process one row from the CSV file: create an object for the row + and save it in the database. 
+ + """ + lid = int(row['lid']) + if lid in EXCLUDE_SET: + return # skip + + cat = int(row['cid']) + if CAT_MAP.get(cat) is None: + return # skip this one; we aren't carrying these over + + dl_date = datetime.datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S") + old_url = row['url'].decode('latin-1') + if old_url.startswith(SG101_PREFIX): + old_url = old_url[len(SG101_PREFIX):] + if old_url.startswith('dls/'): + old_url = old_url[4:] + new_url = u'downloads/1.0/%s' % old_url + + dl = Download( + id=lid, + title=row['title'].decode('latin-1'), + category=self.get_category(cat), + description=self.to_markdown(row['description'].decode('latin-1')), + file=new_url, + user=self.get_user(row['submitter']), + date_added=dl_date, + ip_address='127.0.0.1', # not available + hits=int(row['hits']), + average_score=float(row['downloadratingsummary']) / 2.0, + total_votes=int(row['totalvotes']), + is_public=True) + dl.save() + #print "cp %s %s" % (old_url, '/home/var/django-sites/sg101/sg101-trunk/media/' + new_url) + + def to_markdown(self, s): + self.md_writer.reset() + self.md_writer.feed(s) + return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_links.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_links.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,84 @@ +""" +import_old_links.py - For importing links from SG101 1.0 as csv files. +""" +from __future__ import with_statement +import csv +import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User + +from weblinks.models import Link, Category + + +class Command(LabelCommand): + args = '' + help = 'Imports weblinks from the old database in CSV format' + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new weblink object and saving it to the database. + + """ + self.cats = {} + try: + self.default_user = User.objects.get(pk=2) + except User.DoesNotExist: + raise CommandError("Need a default user with pk=2") + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + try: + for row in self.reader: + self.process_row(row) + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def get_category(self, row): + """ + Return the Category object for the row. + + """ + cat_id = row['cid'] + if cat_id not in self.cats: + try: + cat = Category.objects.get(pk=cat_id) + except Category.DoesNotExist: + raise CommandError("Category does not exist: %s on line %s" % ( + cat_id, self.reader.line_num)) + else: + self.cats[cat_id] = cat + return self.cats[cat_id] + + def get_user(self, username): + """ + Return the user object for the given username. + If the user cannot be found, self.default_user is returned. + + """ + try: + return User.objects.get(username=username) + except User.DoesNotExist: + return self.default_user + + def process_row(self, row): + """ + Process one row from the CSV file: create an object for the row + and save it in the database. 
+ + """ + link = Link(category=self.get_category(row), + title=row['title'].decode('latin-1'), + url=row['url'].decode('latin-1'), + description=row['description'].decode('latin-1'), + user=self.get_user(row['submitter']), + date_added=datetime.datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S"), + hits=int(row['hits']), + is_public=True) + link.save() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_news.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_news.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,121 @@ +""" +import_old_news.py - For importing news stories from SG101 1.0 as csv files. +""" +from __future__ import with_statement +import csv +import optparse +import sys +from datetime import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User + +from news.models import Category, Story +from legacy.phpbb import unescape +import legacy.data + + +class Command(LabelCommand): + args = '' + help = 'Imports news stories from the old database in CSV format' + option_list = LabelCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 20 stories to show progress"), + ) + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new story. + + """ + self.show_progress = options.get('progress') + self.users = {} + + # Create a mapping from the old database's topics to our + # Categories. + self.topics = {} + try: + self.topics[2] = Category.objects.get(slug='site-news') + self.topics[3] = Category.objects.get(slug='bands') + self.topics[4] = Category.objects.get(slug='show-announcements') + self.topics[5] = Category.objects.get(slug='show-reports') + self.topics[6] = Category.objects.get(slug='gear') + self.topics[7] = Category.objects.get(slug='reviews') + self.topics[8] = Category.objects.get(slug='surf-scene-news') + self.topics[9] = Category.objects.get(slug='articles') + self.topics[10] = Category.objects.get(slug='interviews') + self.topics[11] = Category.objects.get(slug='tablature') + self.topics[12] = Category.objects.get(slug='featured-videos') + except Category.DoesNotExist: + sys.exit("Category does not exist; check topic mapping.") + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + num_rows = 0 + try: + for row in self.reader: + self.process_row(row) + num_rows += 1 + if self.show_progress and num_rows % 20 == 0: + sys.stdout.write('.') + sys.stdout.flush() + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + print + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create a Story object for + the row and save it in the database. + + """ + row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) + + try: + submitter = self._get_user(row['informant']) + except User.DoesNotExist: + print "Could not find user %s for story %s; skipping." 
% ( + row['informant'], row['sid']) + return + + story = Story(id=int(row['sid']), + title=unescape(row['title'].decode('latin-1')), + submitter=submitter, + category=self.topics[int(row['topic'])], + short_text=row['hometext'].decode('latin-1'), + long_text=row['bodytext'].decode('latin-1'), + date_submitted=datetime.strptime(row['time'], "%Y-%m-%d %H:%M:%S"), + allow_comments=True) + + story.save() + + def _get_user(self, username): + """ + Returns the user object with the given username. + Throws User.DoesNotExist if not found. + + """ + try: + return self.users[username] + except KeyError: + pass + + try: + user = User.objects.get(username=username) + except User.DoesNotExist: + old_name = username.lower() + try: + user = User.objects.get( + username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) + except KeyError: + raise User.DoesNotExist + + self.users[username] = user + return user diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_news_comments.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_news_comments.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,125 @@ +""" +import_old_news_comments.py - For importing comments on news stories from SG101 1.0 as csv files. +""" +from __future__ import with_statement +import csv +import optparse +import sys +from datetime import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User +from django.contrib.contenttypes.models import ContentType + +from comments.models import Comment +from news.models import Story +import legacy.data +from legacy.html2md import MarkdownWriter + + +class Command(LabelCommand): + args = '' + help = 'Imports news story comments from the old database in CSV format' + option_list = LabelCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 20 comments to show progress"), + ) + md_writer = MarkdownWriter() + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new story comment. + + """ + self.show_progress = options.get('progress') + self.users = {} + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + num_rows = 0 + try: + for row in self.reader: + self.process_row(row) + num_rows += 1 + if self.show_progress and num_rows % 20 == 0: + sys.stdout.write('.') + sys.stdout.flush() + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + print + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create a Comment object for + the row and save it in the database. + + """ + row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) + + try: + user = self._get_user(row['name']) + except User.DoesNotExist: + print "Could not find user %s for comment %s; skipping." % ( + row['name'], row['tid']) + return + + try: + story = Story.objects.get(id=int(row['sid'])) + except Story.DoesNotExist: + print "Could not find story %s for comment %s; skipping." 
% ( + row['sid'], row['tid']) + return + + comment = Comment( + id=int(row['tid']), + content_type = ContentType.objects.get_for_model(story), + object_id = story.id, + user = user, + comment = self.to_markdown(row['comment']), + creation_date = datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S"), + ip_address = row['host_name'], + is_public = True, + is_removed = False, + ) + + comment.save() + + def _get_user(self, username): + """ + Returns the user object with the given username. + Throws User.DoesNotExist if not found. + + """ + try: + return self.users[username] + except KeyError: + pass + + try: + user = User.objects.get(username=username) + except User.DoesNotExist: + old_name = username.lower() + try: + user = User.objects.get( + username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) + except KeyError: + raise User.DoesNotExist + + self.users[username] = user + return user + + def to_markdown(self, s): + self.md_writer.reset() + + if not isinstance(s, unicode): + s = s.decode('latin-1', 'replace') + + self.md_writer.feed(s) + return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_podcasts.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_podcasts.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,63 @@ +""" +import_old_podcasts.py - For importing podcasts from SG101 1.0 as csv files. +""" +from __future__ import with_statement +import csv +import datetime + +from django.core.management.base import LabelCommand, CommandError + +from podcast.models import Channel, Item + + +class Command(LabelCommand): + args = '' + help = 'Imports podcasts from the old database in CSV format' + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new weblink object and saving it to the database. + + """ + try: + self.channel = Channel.objects.get(pk=1) + except Channel.DoesNotExist: + raise CommandError("Need a default channel with pk=1") + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + try: + for row in self.reader: + self.process_row(row) + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create an object for the row + and save it in the database. + + """ + item = Item(channel=self.channel, + title=row['title'], + author=row['author'], + subtitle=row['subtitle'], + summary=row['summary'], + enclosure_url=row['enclosure_url'], + alt_enclosure_url='', + enclosure_length=int(row['enclosure_length']), + enclosure_type=row['enclosure_type'], + guid=row['guid'], + pubdate=datetime.datetime.strptime(row['pubdate'], + "%Y-%m-%d %H:%M:%S"), + duration=row['duration'], + keywords=row['keywords'], + explicit=row['explicit']) + + item.save() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_potd.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_potd.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,127 @@ +""" +import_old_potd.py - For importing POTD's from SG101 1.0 as csv files. 
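+
+Photo and thumbnail paths from the old site are rewritten on import; a sketch
+of the mapping (the file name is illustrative only):
+
+    images/potd/sunset.jpg  ->  potd/1.0/sunset.jpg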
+ +""" +from __future__ import with_statement +import csv +import optparse +import sys +from datetime import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User + +from potd.models import Photo +from legacy.phpbb import unescape +import legacy.data + + +ID_OFFSET = 100 + + +class PathError(Exception): + pass + +def convert_path(old_path): + """ + Converts the old POTD path to a new one. + + """ + if old_path.startswith('images/potd/'): + return "potd/1.0/%s" % old_path[12:] + else: + raise PathError("Unknown path %s" % old_path) + + +class Command(LabelCommand): + args = '' + help = "Imports POTD's from the old database in CSV format" + option_list = LabelCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 20 items to show progress"), + ) + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new Photo + + """ + self.show_progress = options.get('progress') + self.users = {} + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + num_rows = 0 + try: + for row in self.reader: + self.process_row(row) + num_rows += 1 + if self.show_progress and num_rows % 20 == 0: + sys.stdout.write('.') + sys.stdout.flush() + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + print + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create a Photo object for + the row and save it in the database. + + """ + try: + submitter = self._get_user(row['submitted_by'].decode('latin-1')) + except User.DoesNotExist: + print "Could not find user %s for potd %s; skipping." % ( + row['submitted_by'], row['pid']) + return + + desc = row['description'].decode('latin-1').replace('\n', '\n
') + + try: + photo = Photo( + id=int(row['pid']) + ID_OFFSET, + photo=convert_path(row['photo_path']), + thumb=convert_path(row['thumb_path']), + caption=unescape(row['title'].decode('latin-1')), + description=desc, + user=submitter, + date_added=datetime.strptime(row['date_added'], + "%Y-%m-%d %H:%M:%S"), + potd_count=int(row['chosen_count'])) + except PathError, ex: + self.stderr.write("\n%s, skipping\n" % ex) + return + + photo.save() + + def _get_user(self, username): + """ + Returns the user object with the given username. + Throws User.DoesNotExist if not found. + + """ + try: + return self.users[username] + except KeyError: + pass + + try: + user = User.objects.get(username=username) + except User.DoesNotExist: + old_name = username.lower() + try: + user = User.objects.get( + username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) + except KeyError: + raise User.DoesNotExist + + self.users[username] = user + return user diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_potd_comments.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_potd_comments.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,141 @@ +""" +import_old_potd_comments.py - For importing comments on POTD's from SG101 1.0 +as csv files. + +""" +from __future__ import with_statement +import csv +import optparse +import sys +from datetime import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User +from django.contrib.contenttypes.models import ContentType + +from comments.models import Comment +from potd.models import Photo +import legacy.data +from legacy.html2md import MarkdownWriter + + +PHOTO_ID_OFFSET = 100 +ID_OFFSET = 3000 + + +class Command(LabelCommand): + args = '' + help = 'Imports POTD comments from the old database in CSV format' + option_list = LabelCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 20 items to show progress"), + optparse.make_option("--fix-mode", action="store_true", + help="Only create comments if they don't exist already"), + ) + md_writer = MarkdownWriter() + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new POTD comment. + + """ + self.show_progress = options.get('progress') + self.fix_mode = options.get('fix_mode') + self.users = {} + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + num_rows = 0 + try: + for row in self.reader: + self.process_row(row) + num_rows += 1 + if self.show_progress and num_rows % 20 == 0: + sys.stdout.write('.') + sys.stdout.flush() + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + print + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create a Comment object for + the row and save it in the database. + + """ + comment_id = int(row['cid']) + ID_OFFSET + + if self.fix_mode: + try: + c = Comment.objects.get(pk=comment_id) + except Comment.DoesNotExist: + pass + else: + return + + try: + user = self._get_user(row['username'].decode('latin-1')) + except User.DoesNotExist: + print "Could not find user %s for comment %s; skipping." 
% ( + row['username'], row['cid']) + return + + pid = int(row['pid']) + PHOTO_ID_OFFSET + try: + photo = Photo.objects.get(id=pid) + except Photo.DoesNotExist: + print "Could not find photo %s for comment %s; skipping." % ( + pid, row['cid']) + return + + comment = Comment( + id=comment_id, + content_type=ContentType.objects.get_for_model(photo), + object_id=photo.id, + user=user, + comment=self.to_markdown(row['comment'].decode('latin-1')), + creation_date=datetime.strptime(row['date'], "%Y-%m-%d %H:%M:%S"), + ip_address='192.0.2.0', # TEST-NET + is_public=True, + is_removed=False, + ) + + comment.save() + + def _get_user(self, username): + """ + Returns the user object with the given username. + Throws User.DoesNotExist if not found. + + """ + try: + return self.users[username] + except KeyError: + pass + + try: + user = User.objects.get(username=username) + except User.DoesNotExist: + old_name = username.lower() + try: + user = User.objects.get( + username=legacy.data.KNOWN_USERNAME_CHANGES[old_name]) + except KeyError: + raise User.DoesNotExist + + self.users[username] = user + return user + + def to_markdown(self, s): + + s = s.replace('\n', '\n
') + self.md_writer.reset() + self.md_writer.feed(s) + return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_topics.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_topics.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,117 @@ +""" +import_old_topics.py - For importing forum topics (threads) from SG101 1.0 as +csv files. + +""" +from __future__ import with_statement +import csv +import optparse +import sys +from datetime import datetime + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User + +from forums.models import Forum, Topic +from legacy.phpbb import unescape + + +class Command(LabelCommand): + args = '' + help = 'Imports forum topics from the old database in CSV format' + option_list = LabelCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 20 topics to show progress"), + ) + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new topic. + + """ + self.show_progress = options.get('progress') + self.users = {} + + # Create a mapping from the old database's forums to our + # forums + self.forums = {} + try: + self.forums[2] = Forum.objects.get(slug='suggestion-box') + self.forums[3] = Forum.objects.get(slug='surf-music') + self.forums[4] = Forum.objects.get(slug='surf-musician') + self.forums[5] = Forum.objects.get(slug='gear') + self.forums[6] = Forum.objects.get(slug='recording-corner') + self.forums[7] = Forum.objects.get(slug='shallow-end') + self.forums[8] = Forum.objects.get(slug='surfguitar101-website') + self.forums[9] = Forum.objects.get(id=15) + self.forums[10] = Forum.objects.get(slug='for-sale-trade') + self.forums[11] = Forum.objects.get(slug='musicians-gigs-wanted') + self.forums[12] = Forum.objects.get(slug='surf-videos') + self.forums[13] = Forum.objects.get(slug='sg101-podcast') + self.forums[14] = Forum.objects.get(slug='gigs') + self.forums[15] = Forum.objects.get(slug='music-reviews') + self.forums[18] = Forum.objects.get(slug='best-sg101') + except Forum.DoesNotExist: + sys.exit("Forum does not exist; check forum mapping.") + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + num_rows = 0 + try: + for row in self.reader: + self.process_row(row) + num_rows += 1 + if self.show_progress and num_rows % 20 == 0: + sys.stdout.write('.') + sys.stdout.flush() + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + print + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create a Story object for + the row and save it in the database. + + """ + row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) + + if row['topic_moved_id'] != '0': + return + + try: + user = User.objects.get(id=int(row['topic_poster'])) + except User.DoesNotExist: + print "Could not find user %s for topic %s; skipping." 
% ( + row['topic_poster'], row['topic_id']) + return + + creation_date = datetime.fromtimestamp(float(row['topic_time'])) + + title = row['topic_title'].decode('latin-1', 'replace') + + try: + forum = self.forums[int(row['forum_id'])] + except KeyError: + print 'skipping topic "%s"' % title + return + + topic = Topic(id=int(row['topic_id']), + forum=forum, + name=unescape(title), + creation_date=creation_date, + user=user, + view_count=int(row['topic_views']), + sticky=(int(row['topic_type']) != 0), + locked=(int(row['topic_status']) != 0), + update_date=creation_date) + + topic.save() + diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/import_old_users.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/import_old_users.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,163 @@ +""" +import_old_users.py - For importing users from SG101 1.0 as csv files. +""" +from __future__ import with_statement +import csv +import optparse +import re +import sys +from datetime import datetime + +import postmarkup + +from django.core.management.base import LabelCommand, CommandError +from django.contrib.auth.models import User + +import bio.models +from legacy.phpbb import unphpbb +from legacy.html2md import MarkdownWriter + +TIME_ZONES = { + '-5': 'US/Eastern', + '-6': 'US/Central', + '-7': 'US/Mountain', + '-8': 'US/Pacific', +} +USERNAME_RE = re.compile(r'^[\w.@+-]+$') +USERNAME_LEN = (1, 30) # min & max length values + + +def _valid_username(username): + """ + Return true if the username is valid. + """ + return (USERNAME_LEN[0] <= len(username) <= USERNAME_LEN[1] and + USERNAME_RE.match(username)) + + +def _break_name(name): + """ + Break name into a first and last name. + Return a 2-tuple of first_name, last_name. + """ + parts = name.split() + n = len(parts) + if n == 0: + t = '', '' + elif n == 1: + t = parts[0], '' + else: + t = ' '.join(parts[:-1]), parts[-1] + return t[0][:USERNAME_LEN[1]], t[1][:USERNAME_LEN[1]] + + +class Command(LabelCommand): + args = '' + help = 'Imports users from the old database in CSV format' + option_list = LabelCommand.option_list + ( + optparse.make_option("-s", "--super-user", + help="Make the user with this name a superuser"), + optparse.make_option("-a", "--anon-user", + help="Make the user with this name the anonymous user " + "[default: Anonymous]"), + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 20 users to show progress"), + ) + bb_parser = postmarkup.create(use_pygments=False, annotate_links=False) + md_writer = MarkdownWriter() + + def handle_label(self, filename, **options): + """ + Process each line in the CSV file given by filename by + creating a new user and profile. 
+ + """ + self.superuser = options.get('super_user') + self.anonymous = options.get('anon_user') + if self.anonymous is None: + self.anonymous = 'Anonymous' + self.show_progress = options.get('progress') + + if self.superuser == self.anonymous: + raise CommandError("super-user name should not match anon-user") + + try: + with open(filename, "rb") as f: + self.reader = csv.DictReader(f) + num_rows = 0 + try: + for row in self.reader: + self.process_row(row) + num_rows += 1 + if self.show_progress and num_rows % 20 == 0: + sys.stdout.write('.') + sys.stdout.flush() + except csv.Error, e: + raise CommandError("CSV error: %s %s %s" % ( + filename, self.reader.line_num, e)) + + print + + except IOError: + raise CommandError("Could not open file: %s" % filename) + + def process_row(self, row): + """ + Process one row from the CSV file: create a user and user profile for + the row and save it in the database. + + """ + row = dict((k, v if v != 'NULL' else '') for k, v in row.iteritems()) + + if not _valid_username(row['username']): + print "Skipping import of %s; invalid username" % row['username'] + return + + n = User.objects.filter(username=row['username']).count() + if n > 0: + print "Skipping import of %s; user already exists" % row['username'] + return + + first_name, last_name = _break_name(row['name']) + is_superuser = self.superuser == row['username'] + is_anonymous = self.anonymous == row['username'] + + u = User(id=int(row['user_id']), + username=row['username'], + first_name=first_name, + last_name=last_name, + email=row['user_email'], + password=row['user_password'] if row['user_password'] else None, + is_staff=is_superuser, + is_active=True if not is_anonymous else False, + is_superuser=is_superuser, + last_login=datetime.fromtimestamp(int(row['user_lastvisit'])), + date_joined=datetime.strptime(row['user_regdate'], "%b %d, %Y")) + + if is_anonymous: + u.set_unusable_password() + + u.save() + + p = u.get_profile() + p.location = row['user_from'].decode('latin-1') + p.occupation = row['user_occ'].decode('latin-1') + p.interests = row['user_interests'].decode('latin-1') + p.profile_text = u'' + p.hide_email = True if row['user_viewemail'] != '1' else False + p.signature = self.to_markdown(row['user_sig']) if row['user_sig'] else u'' + p.time_zone = TIME_ZONES.get(row['user_timezone'], 'US/Pacific') + p.use_24_time = False + p.forum_post_count = int(row['user_posts']) + p.status = bio.models.STA_ACTIVE if p.forum_post_count > 10 else bio.models.STA_STRANGER + p.status_date = datetime.now() + p.update_date = p.status_date + p.save() + + def to_html(self, s): + return self.bb_parser.render_to_html(unphpbb(s), cosmetic_replace=False) + + def to_markdown(self, s): + self.md_writer.reset() + self.md_writer.feed(self.to_html(s)) + return self.md_writer.markdown() diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/management/commands/translate_old_posts.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/management/commands/translate_old_posts.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,134 @@ +""" +translate_old_posts.py - A management command to join the bbposts and +bbposts_text tables together and output as a .csv file, suitable for use as an +input to mysqlimport into the new database. This method bypasses the Django ORM +as it was too slow given the number of old posts to import. 
+ +""" +from __future__ import with_statement +import csv +import optparse +from datetime import datetime + +import MySQLdb +import postmarkup + +from django.core.management.base import NoArgsCommand, CommandError + +from legacy.phpbb import unphpbb +from legacy.html2md import MarkdownWriter +from core.markup import SiteMarkup + + +def convert_ip(s): + """ + Converts a hex string representing an IP address into dotted notation. + """ + n = int(s, 16) + return "%d.%d.%d.%d" % ( + ((n >> 24) & 0xff), + ((n >> 16) & 0xff), + ((n >> 8) & 0xff), + n & 0xff) + + +class Command(NoArgsCommand): + help = """\ +This command joins the SG101 1.0 posts to 2.0 format and outputs the +data as a .csv file suitable for importing into the new database scheme with +the mysqlimport utility. +""" + option_list = NoArgsCommand.option_list + ( + optparse.make_option("-s", "--progress", action="store_true", + help="Output a . after every 100 posts to show progress"), + optparse.make_option("-a", "--host", help="set MySQL host name"), + optparse.make_option("-u", "--user", help="set MySQL user name"), + optparse.make_option("-p", "--password", help="set MySQL user password"), + optparse.make_option("-d", "--database", help="set MySQL database name"), + optparse.make_option("-o", "--out-file", help="set output filename"), + ) + bb_parser = postmarkup.create(use_pygments=False, annotate_links=False) + md_writer = MarkdownWriter() + site_markup = SiteMarkup() + + def handle_noargs(self, **opts): + + host = opts.get('host', 'localhost') or 'localhost' + user = opts.get('user', 'root') or 'root' + password = opts.get('password', '') or '' + database = opts.get('database') + out_filename = opts.get('out_file', 'forums_post.csv') or 'forums_post.csv' + + if database is None: + raise CommandError("Please specify a database option") + + out_file = open(out_filename, "wb") + + # database columns (fieldnames) for the output CSV file: + cols = ('id', 'topic_id', 'user_id', 'creation_date', 'update_date', + 'body', 'html', 'user_ip') + self.writer = csv.writer(out_file) + + # Write an initial row of fieldnames to the output file + self.writer.writerow(cols) + + # connect to the legacy database + try: + db = MySQLdb.connect(host=host, + user=user, + passwd=password, + db=database) + except MySQLdb.DatabaseError, e: + raise CommandError(str(e)) + + c = db.cursor(MySQLdb.cursors.DictCursor) + + # query the legacy database + sql = ('SELECT * FROM sln_bbposts as p, sln_bbposts_text as t WHERE ' + 'p.post_id = t.post_id ORDER BY p.post_id') + c.execute(sql) + + # convert the old data and write the output to the file + while True: + row = c.fetchone() + if row is None: + break + + self.process_row(row) + + c.close() + db.close() + out_file.close() + + def to_html(self, s): + return self.bb_parser.render_to_html(unphpbb(s), cosmetic_replace=False) + + def to_markdown(self, s): + self.md_writer.reset() + self.md_writer.feed(self.to_html(s)) + return self.md_writer.markdown() + + def process_row(self, row): + """ + This function accepts one row from the legacy database and converts the + contents to the new database format, and calls the writer to write the new + row to the output file. 
+ """ + creation_date = datetime.fromtimestamp(float(row['post_time'])) + + if row['post_edit_time']: + update_date = datetime.fromtimestamp(float(row['post_edit_time'])) + else: + update_date = creation_date + + body = self.to_markdown(row['post_text']) + html = self.site_markup.convert(body) + + self.writer.writerow([row['post_id'], + row['topic_id'], + row['poster_id'], + creation_date, + update_date, + body.encode("utf-8"), + html.encode("utf-8"), + convert_ip(row['poster_ip'])]) diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,3 @@ +from django.db import models + +# Create your models here. diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/phpbb.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/phpbb.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,68 @@ +""" +This module contains functions for working with data from the legacy phpBB +based website. +""" +import re +import htmlentitydefs + + +# BBCode tags used by the old site +BBCODE_TAGS = "b i u s url quote img list * code color size".split() + +# Regular expressions used to get rid of phpBB's uid inside BBCode tags. +# This is a list of regular expression pairs. Element 0 of each pair +# is for the opening tag & element 1 is for the closing tag. + +BBCODE_RES = [( + re.compile(r"(\[%s):(?:[0-9a-fu]+:)?[0-9a-f]{10}" % tag), + re.compile(r"(\[/%s):(?:[0-9a-fu]+:)?[0-9a-f]{10}\]" % tag) +) for tag in BBCODE_TAGS] + + +## +# Removes HTML or XML character references and entities from a text string. +# +# @param text The HTML (or XML) source text. +# @return The plain text, as a Unicode string, if necessary. +# Source: http://effbot.org/zone/re-sub.htm#unescape-html +# +def unescape(text): + def fixup(m): + text = m.group(0) + if text[:2] == "&#": + # character reference + try: + if text[:3] == "&#x": + return unichr(int(text[3:-1], 16)) + else: + return unichr(int(text[2:-1])) + except ValueError: + pass + else: + # named entity + try: + text = unichr(htmlentitydefs.name2codepoint[text[1:-1]]) + except KeyError: + pass + return text # leave as is + return re.sub("&#?\w+;", fixup, text) + + +def unphpbb(s, encoding='latin-1'): + """Converts BBCode from phpBB database data into 'pure' BBCode. + + phpBB doesn't store plain BBCode in its database. The BBCode tags have + "uids" added to them and the data has already been HTML entity'ized. + This function removes the uid stuff and undoes the entity'ification and + returns the result as a unicode string. + + If the input 's' is not already unicode, it will be decoded using the + supplied encoding. + + """ + if not isinstance(s, unicode): + s = s.decode(encoding, 'replace') + for start, end in BBCODE_RES: + s = re.sub(start, r'\1', s, re.MULTILINE) + s = re.sub(end, r'\1]', s, re.MULTILINE) + return unescape(s) diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/tests.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/tests.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +""" +Tests for legacy app functions. +""" + +from django.test import TestCase + +from legacy.phpbb import unphpbb +from legacy.html2md import MarkdownWriter + +class UnPhpBbTest(TestCase): + + def test_unentities(self): + s1 = ""Look! No head!" - Laika & The Cosmonauts" + s2 = unphpbb(s1) + s3 = u'"Look! No head!" 
- Laika & The Cosmonauts' + self.failUnlessEqual(s2, s3) + + def test_rem_uuid1(self): + s1 = ("[url=http://www.thesurfites.com][color=black:3fdb565c83]" + "T H E - S U R F I T E S[/color:3fdb565c83][/url]") + s2 = unphpbb(s1) + s3 = (u'[url=http://www.thesurfites.com][color=black]' + 'T H E - S U R F I T E S[/color][/url]') + self.failUnlessEqual(s2, s3) + + +class Html2MdTest(TestCase): + + def test_sig1(self): + s1 = """

Pollo Del Mar
+Frankie & The Pool Boys
+PDM on FaceBook
+

""" + md_writer = MarkdownWriter() + md_writer.feed(s1) + s2 = md_writer.markdown() + s3 = u'[Pollo Del Mar](http://surfguitar101.com/modules.php?name=Web_Links&l_op=visit&lid=50) \n\n[Frankie & The Pool Boys](http://tinyurl.com/yjfmspj) \n\n[PDM on FaceBook](http://tinyurl.com/cnr27t) \n\n' + self.failUnlessEqual(s2, s3) diff -r c525f3e0b5d0 -r ee87ea74d46b legacy/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/legacy/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1 @@ +# Create your views here. diff -r c525f3e0b5d0 -r ee87ea74d46b manage.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/manage.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,9 @@ +#!/usr/bin/env python +import os, sys + +if __name__ == "__main__": + os.environ.setdefault("DJANGO_SETTINGS_MODULE", "gpp.settings") + + from django.core.management import execute_from_command_line + + execute_from_command_line(sys.argv) diff -r c525f3e0b5d0 -r ee87ea74d46b membermap/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/membermap/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,18 @@ +""" +Admin definitions for the member map application models. +""" + +from django.contrib import admin + +from membermap.models import MapEntry + +class MapEntryAdmin(admin.ModelAdmin): + exclude = ('html', ) + list_display = ('user', 'location', 'lat', 'lon', 'date_updated') + list_filter = ('date_updated', ) + date_hierarchy = 'date_updated' + ordering = ('-date_updated', ) + search_fields = ('user', 'location', 'message') + raw_id_fields = ('user', ) + +admin.site.register(MapEntry, MapEntryAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b membermap/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/membermap/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,27 @@ +""" +Forms for the member map application. +""" +from django import forms +from django.conf import settings + +from membermap.models import MapEntry + + +class MapEntryForm(forms.ModelForm): + location = forms.CharField(required=True, + widget=forms.TextInput(attrs={'size': 64, 'maxlength': 255})) + message = forms.CharField(required=False, + widget=forms.Textarea(attrs={'class': 'markItUp smileyTarget'})) + + class Meta: + model = MapEntry + fields = ('location', 'message') + + class Media: + css = { + 'all': (settings.GPP_THIRD_PARTY_CSS['markitup'] + + settings.GPP_THIRD_PARTY_CSS['jquery-ui']) + } + js = (settings.GPP_THIRD_PARTY_JS['markitup'] + + settings.GPP_THIRD_PARTY_JS['jquery-ui']) + diff -r c525f3e0b5d0 -r ee87ea74d46b membermap/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/membermap/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,32 @@ +""" +Models for the member map application. 
+""" +import datetime +from django.db import models +from django.contrib.auth.models import User + +from core.markup import site_markup + + +class MapEntry(models.Model): + """Represents a user's entry on the map.""" + user = models.ForeignKey(User) + location = models.CharField(max_length=255) + lat = models.FloatField() + lon = models.FloatField() + message = models.TextField(blank=True) + html = models.TextField(blank=True) + date_updated = models.DateTimeField() + + def __unicode__(self): + return u'Map entry for %s' % self.user.username + + class Meta: + ordering = ('-date_updated', ) + verbose_name_plural = 'map entries' + + def save(self, *args, **kwargs): + self.html = site_markup(self.message) + self.date_updated = datetime.datetime.now() + super(MapEntry, self).save(*args, **kwargs) + diff -r c525f3e0b5d0 -r ee87ea74d46b membermap/static/css/membermap.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/membermap/static/css/membermap.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,20 @@ +#member_map_members_column { + float: left; +} +#member_map_map { + width: 720px; + height: 540px; + border: 1px solid black; + margin: 0 auto; +} +#member_map_info { + padding-top: 1em; + clear: left; +} +.markItUp { + width: 600px; +} +.markItUpEditor { + width:543px; + height:200px; +} diff -r c525f3e0b5d0 -r ee87ea74d46b membermap/static/js/membermap.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/membermap/static/js/membermap.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,173 @@ +var mmap = { + map: null, + geocoder: null, + users: Object, + userOnMap: false, + userClick: function() { + var name = $('option:selected', this).text(); + if (name != mmap.selectText) + { + mmap.clickUser(name); + } + }, + clickUser: function(name) { + pt = new GLatLng(mmap.users[name].lat, mmap.users[name].lon); + mmap.map.setCenter(pt); + mmap.users[name].marker.openInfoWindowHtml(mmap.users[name].message); + }, + clear: function() { + mmap.users.length = 0; + }, + selectText: "(select)", + onMapDir: 'You have previously added yourself to the member map. Your information appears below. You may change ' + + 'the information if you wish. To delete yourself from the map, click the Delete button.', + offMapDir: 'Your location is not on the map. If you would like to appear on the map, please fill out the form below ' + + 'and click the Submit button.' +}; +$(document).ready(function() { + if (GBrowserIsCompatible()) + { + $(window).unload(GUnload); + mmap.map = new GMap2($('#member_map_map')[0]); + mmap.map.setCenter(new GLatLng(15.0, -30.0), 2); + mmap.map.enableScrollWheelZoom(); + mmap.map.addControl(new GLargeMapControl()); + mmap.map.addControl(new GMapTypeControl()); + mmap.geocoder = new GClientGeocoder(); + + if (mmapUser.userName) + { + $.getJSON('/member_map/query/', + function(data) { + mmap.map.clearOverlays(); + var sel = $('#member_map_members'); + sel[0].length = 0; + sel.append($('
+
+{{ shout.shout_date|date:"D M d Y H:i:s" }}
+Permalink +Flag +{% ifequal user.id shout.user.id %} +Delete +{% endifequal %} + + diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/shoutbox/shoutbox.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/shoutbox/shoutbox.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,38 @@ +{% extends 'side_block.html' %} +{% load url from future %} +{% load core_tags %} +{% block block_title %}Shoutbox{% endblock %} +{% block block_content %} +
+ {% for shout in shouts reversed %} +

+ {{ shout.user.username }}: + {{ shout.html|safe }}
+ {{ shout.shout_date|elapsed }} +

+ {% endfor %} +
+
+ + Shout History + +
+{% if user.is_authenticated %} +
+
+ +
+ + +
+ +
+{% else %} +

+Please login or +register to shout. +

+{% endif %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/shoutbox/view.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/shoutbox/view.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,30 @@ +{% extends 'base.html' %} +{% load bio_tags %} +{% load script_tags %} +{% block custom_css %} + + +{% endblock %} +{% block custom_js %} +{% script_tags "jquery-jeditable" %} + +{% endblock %} +{% block title %}Shout History{% endblock %} +{% block content %} +

Shout History

+{% if page.object_list %} +{% include 'core/pagination.html' %} + +
+ +{% for shout in page.object_list %} +{% include "shoutbox/shout_detail.html" %} +{% endfor %} +
+
+ +{% include 'core/pagination.html' %} +{% else %} +

No shouts at this time.

+{% endif %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/shoutbox/view_shout.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/shoutbox/view_shout.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,20 @@ +{% extends 'base.html' %} +{% load url from future %} +{% load script_tags %} +{% block custom_css %} + +{% endblock %} +{% block custom_js %} +{% script_tags "jquery-jeditable" %} + +{% endblock %} +{% block title %}Shout #{{ shout.id }}{% endblock %} +{% block content %} + +

Shout #{{ shout.id }}

+
+ +{% include "shoutbox/shout_detail.html" %} +
+
+{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/side_block.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/side_block.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,8 @@ +
+
+{% block block_title %}{% endblock %} +
+
+{% block block_content %}{% endblock %} +
+
diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/smiley/smiley_farm.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/smiley/smiley_farm.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,5 @@ +
+{% for s in smilies %} +{{ s.code }} +{% endfor %} +
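The smiley_farm.html fragment above is what shoutbox.js, later in this changeset, pulls in with $('#shoutbox-smiley-frame').load('/smiley/farm/'). The view behind that URL is not part of this patch, so the following is only a minimal sketch: the view name and the queryset are assumptions, and render_to_response/RequestContext mirror the style used elsewhere in this changeset.

# Sketch only: smiley/views.py and smiley/urls.py are not in this changeset;
# the view name "farm" and the queryset below are assumed.
from django.shortcuts import render_to_response
from django.template import RequestContext

from smiley.models import Smiley


def farm(request):
    # Render the clickable smiley grid that the shoutbox loads via AJAX.
    smilies = Smiley.objects.all()
    return render_to_response('smiley/smiley_farm.html',
                              {'smilies': smilies},
                              context_instance=RequestContext(request))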
diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/sopa.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/sopa.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,85 @@ + + + +SurfGuitar101.com is offline to protest SOPA / PIPA + + + + + +
+

SurfGuitar101.com Offline to Protest SOPA / PIPA

+

Dear Friends of SurfGuitar101.com:

+

+I am joining many other websites today and closing the site to protest two pieces of legislation in the US Congress: the so-called +Stop Online Piracy Act, or SOPA in the House, and the so-called +Protect IP Act, or PIPA in the Senate. I hope to draw your attention to these acts and +urge each one of you to read up on them. Then, please contact your representatives and ask them to withdraw +their support for these bills. +

+

I too am concerned about protecting copyrights and intellectual property. But these bills have provisions in them
+that go too far. They allow media companies to ask the government to remove sites from the Internet without any
+due process or oversight. The burden of proving that no copyright violations are present will fall on site operators.
+Major tech companies like Google, Facebook, and Twitter oppose these bills. The engineers who built the Internet have
+also spoken out, pointing out that the provisions in these bills will not prevent piracy, but in fact will create
+security problems and disrupt the operation of the Internet.
+

+

+The Internet is quite possibly the greatest invention of my lifetime. It should be a tool for free expression,
+democracy, innovation, and entrepreneurship. However, the media companies are failing to innovate and embrace
+this new digital age, and instead are asking the US government to essentially let them decide what we can and cannot
+view on the Internet. We cannot let the US government
+join the ranks of despotic countries like China, Iran, Syria, and North Korea and censor their citizens' use of the Internet.
+

+

Here are some links that I ask you to look over. They explain the issues far better than I can.

+ +

+Thank you for your patience and understanding. I firmly believe that even small community websites like ours +would be threatened if bills like this were allowed to pass.
+-- Brian Neal +

+
+ + + + + + + +
Are you a US citizen? Contact your representatives now...
Not in the US? Petition the US State Department...
+ +
+ + diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/add_link.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/add_link.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,20 @@ +{% extends 'weblinks/base.html' %} +{% load url from future %} +{% block title %}Web Links: Add Link{% endblock %} +{% block weblinks_content %} +

Add Link

+ {% if add_form %} +
{% csrf_token %} + + {{ add_form.as_table }} + +
  +  Cancel +
+
+
+ {% else %} +

Thank you for submitting a link!

+

Your link has been submitted for review to the site staff.

+ {% endif %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/base.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/base.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +{% extends 'base.html' %} +{% load weblinks_tags %} +{% block custom_css %} + +{% block weblinks_css %}{% endblock %} +{% block weblinks_js %}{% endblock %} +{% endblock %} +{% block content %} +

Web Links

+{% include 'weblinks/navigation.html' %} + +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/index.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/index.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,16 @@ +{% extends 'weblinks/base.html' %} +{% load url from future %} +{% block title %}Web Links{% endblock %} +{% block weblinks_content %} +

Categories

+ {% if categories %} +

We have {{ total_links }} links in {{ categories.count }} categories.

+
+ {% for category in categories %} +
{{ category.title }} + ({{ category.count }})
+

{{ category.description }}

+ {% endfor %} +
+ {% endif %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/latest_tag.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/latest_tag.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,12 @@ +{% load core_tags %} +

New Links

+{% if links %} +
    + {% for link in links %} +
  1. {{ link.title }} - + {{ link.date_added|elapsed }}
  2. + {% endfor %} +
+{% else %} +

No links at this time.

+{% endif %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/link.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/link.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,25 @@ +{% load url from future %} +{% load bio_tags %} +
+

{{ link.title }}

+
+
+

{{ link.description }}

+
{% csrf_token %} + + + + + + + + + + + + +
+
+
diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/link_detail.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/link_detail.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,8 @@ +{% extends 'weblinks/base.html' %} +{% block title %}Web Links: {{ link.title }}{% endblock %} +{% block weblinks_content %} +

Link Details: {{ link.title }}

+
+{% include 'weblinks/link.html' %} +
+{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/link_summary.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/link_summary.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,21 @@ +{% extends 'weblinks/base.html' %} +{% block title %}Web Links: {{ title }}{% endblock %} +{% block weblinks_css %} + + +{% endblock %} +{% block weblinks_js %} + +{% endblock %} +{% block weblinks_content %} +

{{ title }}

+{% if page.object_list %} + {% include 'core/pagination.html' %} +
+ {% for link in page.object_list %} + {% include 'weblinks/link.html' %} + {% endfor %} +
+ {% include 'core/pagination.html' %} +{% endif %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/navigation.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/navigation.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,17 @@ +{% load url from future %} + + +
+
{% csrf_token %} + +
+
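weblinks/base.html above loads weblinks_tags, and latest_tag.html renders a links variable, which points to an inclusion tag in the same style as shoutbox/templatetags/shoutbox_tags.py later in this changeset. The weblinks tag module itself is not included here, so this is only a rough sketch under assumptions: the tag name, the Link model, and the query are guesses, not taken from the repository.

# Hypothetical sketch; weblinks/templatetags/weblinks_tags.py is not shown in
# this changeset, and the Link model and field names are assumed.
from django import template

from weblinks.models import Link   # assumed model name

register = template.Library()


@register.inclusion_tag('weblinks/latest_tag.html')
def latest_weblinks(count=5):
    # latest_tag.html iterates over "links", showing title and date_added.
    links = Link.objects.order_by('-date_added')[:count]
    return {'links': links}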
diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/weblinks/view_links.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/weblinks/view_links.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,34 @@ +{% extends 'weblinks/base.html' %} +{% load url from future %} +{% block title %}Web Links: {{ category.title }}{% endblock %} +{% block weblinks_css %} + + +{% endblock %} +{% block weblinks_js %} + +{% endblock %} +{% block weblinks_content %} +

Category: {{ category.title }}

+ +{% if page.object_list %} + + +{% include 'core/pagination.html' %} + +
+{% for link in page.object_list %} + {% include 'weblinks/link.html' %} +{% endfor %} +
+ +{% include 'core/pagination.html' %} +{% endif %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/ygroup/pagination.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/ygroup/pagination.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,26 @@ + +
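The weblinks templates above and the ygroup templates below both hand a paginator page to a shared pagination include. The view-side idiom this site uses for that appears later in this changeset in shoutbox/views.py (DiggPaginator combined with get_page and InvalidPage). As a rough sketch of the same idiom applied to the Yahoo Group thread list, with the ygroup model, view name, and page size assumed rather than taken from the repository:

# Sketch only: ygroup/views.py is not part of this changeset; the Thread model,
# view name, and THREADS_PER_PAGE value are assumptions.
from django.core.paginator import InvalidPage
from django.http import Http404
from django.shortcuts import render_to_response
from django.template import RequestContext

from core.paginator import DiggPaginator
from core.functions import get_page
from ygroup.models import Thread  # assumed model name

THREADS_PER_PAGE = 40  # illustrative value


def thread_list(request):
    paginator = DiggPaginator(Thread.objects.all(), THREADS_PER_PAGE,
                              body=5, tail=3, margin=3, padding=2)
    try:
        page_obj = paginator.page(get_page(request.GET))
    except InvalidPage:
        raise Http404
    return render_to_response('ygroup/thread_list.html',
                              {'page_obj': page_obj},
                              context_instance=RequestContext(request))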
diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/ygroup/post_detail.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/ygroup/post_detail.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +{% extends 'base.html' %} +{% load url from future %} +{% block title %}Yahoo Group Archives: {{ post.title }}{% endblock %} +{% block content %} +

Yahoo Group Archives »

+

{{ post.title }} + + permalink +

+
+
{{ post.poster }} - {{ post.creation_date|date:"d M Y H:i:s" }}
+
{{ post.msg|linebreaks }}
+
+

See this post in context.

+{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/ygroup/thread.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/ygroup/thread.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,32 @@ +{% extends 'base.html' %} +{% load url from future %} +{% block title %}Yahoo Group Archives: {{ thread.title }}{% endblock %} +{% block custom_css %} + +{% endblock %} +{% block content %} + +{% if thread.page == 1 %} +

Yahoo Group Archives »

+{% else %} +

Yahoo Group Archives » + Page {{ thread.page }} »

+{% endif %} +

{{ thread.title }} + + permalink +

+{% include "ygroup/pagination.html" %} +
+ {% for post in page_obj.object_list %} +
{{ post.poster }} - {{ post.creation_date|date:"d M Y H:i:s" }} + + permalink
+
+ {{ post.msg|linebreaks }} +

Top

+
+ {% endfor %} +
+{% include "ygroup/pagination.html" %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/templates/ygroup/thread_list.html --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/templates/ygroup/thread_list.html Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,25 @@ +{% extends 'base.html' %} +{% load url from future %} +{% block title %}Yahoo Group Archives{% endblock %} +{% block custom_css %} + +{% endblock %} +{% block content %} +

Yahoo Group Archives » Page {{ page_obj.number }}

+

+SurfGuitar101.com began as a Yahoo Group on October 31, 2001. It ran until August 2007, when this site officially replaced it. On these pages you'll find the archived messages of our original group. You can also search through these messages via our search page. +

+{% include "ygroup/pagination.html" %} + + + {% for thread in page_obj.object_list %} + + + + + + + {% endfor %} +
TitleAuthorPostsDate
{{ thread.title }}{{ thread.poster }}{{ thread.post_count }}{{ thread.creation_date|date:"d M Y" }}
+{% include "ygroup/pagination.html" %} +{% endblock %} diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,70 @@ +from django.conf.urls import patterns, url, include +from django.conf.urls.static import static +from django.conf import settings +from django.contrib import admin +from django.views.decorators.cache import cache_page + +from haystack.views import search_view_factory + +from news.feeds import LatestNewsFeed +from forums.feeds import ForumsFeed +from custom_search.forms import CustomModelSearchForm + + +admin.autodiscover() + +urlpatterns = patterns('', + url(r'^$', 'sg101.views.home', name='home'), + (r'^admin/doc/', include('django.contrib.admindocs.urls')), + + url(r'^admin/password_reset/$', 'django.contrib.auth.views.password_reset', name='admin_password_reset'), + (r'^admin/password_reset/done/$', 'django.contrib.auth.views.password_reset_done'), + (r'^reset/(?P[0-9A-Za-z]+)-(?P.+)/$', 'django.contrib.auth.views.password_reset_confirm'), + (r'^reset/done/$', 'django.contrib.auth.views.password_reset_complete'), + + (r'^admin/', include(admin.site.urls)), + (r'^accounts/', include('accounts.urls')), + (r'^antispam/', include('antispam.urls')), + (r'^calendar/', include('gcalendar.urls')), + (r'^comments/', include('comments.urls')), + (r'^contact/', include('contact.urls')), + (r'^contests/', include('contests.urls')), + (r'^core/', include('core.urls')), + (r'^donations/', include('donations.urls')), + (r'^downloads/', include('downloads.urls')), + url(r'^feeds/news/$', + cache_page(6 * 60 * 60)(LatestNewsFeed()), + name='feeds-news'), + url(r'^feeds/forums/$', + cache_page(5 * 60)(ForumsFeed()), + {'forum_slug': None}, + 'feeds-forum_combined'), + url(r'^feeds/forums/(?P[\w\d-]+)/$', + cache_page(5 * 60)(ForumsFeed()), + name='feeds-forum'), + (r'^forums/', include('forums.urls')), + (r'^irc/', include('irc.urls')), + (r'^links/', include('weblinks.urls')), + (r'^member_map/', include('membermap.urls')), + (r'^messages/', include('messages.urls')), + (r'^news/', include('news.urls')), + (r'^oembed/', include('oembed.urls')), + (r'^pb/', include('phantombrigade.urls')), + (r'^podcast/', include('podcast.urls')), + (r'^polls/', include('polls.urls')), + (r'^potd/', include('potd.urls')), + (r'^profile/', include('bio.urls')), + (r'^shout/', include('shoutbox.urls')), + (r'^smiley/', include('smiley.urls')), + (r'^ygroup/', include('ygroup.urls')), +) + +# Haystack search views +urlpatterns += patterns('haystack.views', + url(r'^search/$', + search_view_factory(form_class=CustomModelSearchForm, load_all=True), + name='haystack_search'), +) + +# For serving media files in development only: +urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT) diff -r c525f3e0b5d0 -r ee87ea74d46b sg101/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/sg101/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +""" +This file contains views that don't belong to any specific application. +In particular, the home page view. +""" +from django.shortcuts import render_to_response +from django.template import RequestContext + + +def home(request): + """ + The home page view of the site. 
+ """ + return render_to_response('home.html', { + }, + context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,34 @@ +""" +This file contains the automatic admin site definitions for the shoutbox models. +""" +from django.contrib import admin +from shoutbox.models import Shout +from shoutbox.models import ShoutFlag + +class ShoutAdmin(admin.ModelAdmin): + list_display = ('__unicode__', 'user', 'shout_date') + raw_id_fields = ('user', ) + date_hierarchy = 'shout_date' + exclude = ('html', ) + search_fields = ('shout', 'user__username') + list_filter = ('shout_date', ) + + +class ShoutFlagAdmin(admin.ModelAdmin): + list_display = ('__unicode__', 'flag_date', 'shout', 'get_shout_url') + actions = ('delete_shouts', ) + + def delete_shouts(self, request, qs): + """ + Admin action function to delete the shouts associated with the shout + flags. + """ + for flag in qs: + flag.shout.delete() # will delete the flag too + + delete_shouts.short_description = "Delete selected flags & shouts" + + +admin.site.register(Shout, ShoutAdmin) +admin.site.register(ShoutFlag, ShoutFlagAdmin) + diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,24 @@ +""" +Forms for the Shoutbox application. +""" + +import re +from django import forms + +url_re = re.compile('(' + r'^https?://' # http:// or https:// + r'(?:(?:[A-Z0-9-]+\.)+[A-Z]{2,6}|' #domain... + r'localhost|' #localhost... + r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})' # ...or ip + r'(?::\d+)?' # optional port + r'(?:/?|/\S+))', re.IGNORECASE) + + +class ShoutBoxForm(forms.Form): + msg = forms.CharField(label='', max_length=2048, required=True) + + def get_shout(self): + msg = self.cleaned_data['msg'] + msg = re.sub(url_re, r'URL', msg) + return msg + diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,56 @@ +""" +Models for the shoutbox application. +""" +import datetime + +from django.db import models +from django.contrib.auth.models import User +from django.utils.html import escape, urlize + +from smiley import smilify_html + + +class Shout(models.Model): + user = models.ForeignKey(User) + shout_date = models.DateTimeField(blank=True) + shout = models.TextField() + html = models.TextField() + + class Meta: + ordering = ('-shout_date', ) + + def __unicode__(self): + if len(self.shout) > 60: + return self.shout[:60] + "..." 
+ return self.shout + + @models.permalink + def get_absolute_url(self): + return ('shoutbox-view', [str(self.id)]) + + def save(self, *args, **kwargs): + if not self.id: + self.shout_date = datetime.datetime.now() + self.html = urlize(smilify_html(escape(self.shout)), trim_url_limit=15, + nofollow=True) + super(Shout, self).save(*args, **kwargs) + + +class ShoutFlag(models.Model): + """This model represents a user flagging a shout as inappropriate.""" + user = models.ForeignKey(User) + shout = models.ForeignKey(Shout) + flag_date = models.DateTimeField(auto_now_add=True) + + def __unicode__(self): + return u'Shout ID %s flagged by %s' % (self.shout_id, self.user.username) + + class Meta: + ordering = ('flag_date', ) + + def get_shout_url(self): + return 'Shout #%(id)d' % ( + {'id': self.shout.id}) + get_shout_url.allow_tags = True + get_shout_url.short_description = 'Link to Shout' + diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/static/css/shoutbox.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/static/css/shoutbox.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,36 @@ +span.shoutbox-user { + font-weight: bold; + text-decoration: underline; +} +span.shoutbox-shout { +} +span.shoutbox-date { + font-style: italic; +} +div#shoutbox-smiley-frame { + margin: 0.5em 2px; +} +div#shoutbox-smiley-frame img { + padding: 1px 1px; +} +div.smiley_farm img { + border: 0; + cursor: pointer; +} +#shoutbox-shout-container { + margin: auto; + width: 142px; + height: 200px; + background-color: #bdd6d6; + border: 1px solid teal; + padding: 2px; + padding-left: 4px; + margin-bottom: 2px; + -moz-border-radius: 5px; + border-radius: 5px; +} +#shoutbox-shout-container p { + margin-left: 2px; + margin-right: 2px; + margin-top: 0.5em; +} diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/static/css/shoutbox_app.css --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/static/css/shoutbox_app.css Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,32 @@ +div.shoutbox-history table { + border-collapse: collapse; + width: 95%; + margin: 1em auto 1em auto; + border: 1px solid black; +} +div.shoutbox-history th { + border: 1px solid black; + padding: 5px 2px; + text-align: center; + width: 10%; +} +div.shoutbox-history td { + border: 1px solid black; + padding: 5px 5px; + width: 90%; +} +div.shoutbox-history tr.odd { + background-color: #ddd; +} +div.shoutbox-history .date { + font-style: italic; +} + +div.shoutbox-history .edit { + padding: 5px 5px; +} + +div.shoutbox-history .edit:hover { + background-color: #7fffd4; + cursor: pointer; +} diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/static/js/shoutbox.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/static/js/shoutbox.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,107 @@ +$(document).ready(function() { + $.ajaxSetup({ + beforeSend: function(xhr, settings) { + function getCookie(name) { + var cookieValue = null; + if (document.cookie && document.cookie != '') { + var cookies = document.cookie.split(';'); + for (var i = 0; i < cookies.length; i++) { + var cookie = jQuery.trim(cookies[i]); + // Does this cookie string begin with the name we want? + if (cookie.substring(0, name.length + 1) == (name + '=')) { + cookieValue = decodeURIComponent(cookie.substring(name.length + 1)); + break; + } + } + } + return cookieValue; + } + if (!(/^http:.*/.test(settings.url) || /^https:.*/.test(settings.url))) { + // Only send the token to relative URLs i.e. locally. 
+ xhr.setRequestHeader("X-CSRFToken", getCookie('csrftoken')); + } + } + }); + + $("html").bind("ajaxStart", function() { + $(this).addClass('busy'); + }).bind("ajaxStop", function() { + $(this).removeClass('busy'); + }); + + var numShouts = $('#shoutbox-shout-container > p').size(); + var sbBox = $('#shoutbox-shout-container'); + + if (numShouts < 2) + { + sbBox.append('

Welcome to SurfGuitar101.com!

'); + ++numShouts; + } + if (numShouts < 2) + { + sbBox.append('

((((( More Reverb )))))

'); + ++numShouts; + } + + var sbCycleOpts = null; + var sbCycle = sbBox.cycle({ + fx: 'scrollUp', + timeout: 5000, + pause: 1, + next: '#shoutbox-next', + prev: '#shoutbox-prev', + before: function(curr, next, opts) { + if (!opts.addSlide || sbCycleOpts) return; + sbCycleOpts = opts; + } + }); + function addShout(shout) { + ++numShouts; + sbCycleOpts.addSlide(shout); + sbBox.cycle(numShouts - 1); + } + + var submit = $('#shoutbox-submit'); + submit.click(function () { + var input = $('#shoutbox-smiley-input'); + var msg = $.trim(input.val()); + if (msg.length == 0) { + return false; + } + submit.attr('disabled', 'disabled'); + $.ajax({ + url: '/shout/shout/', + type: 'POST', + data: { msg: msg }, + dataType: 'html', + success: function (data, textStatus) { + input.val(''); + if (data != '') { + addShout(data); + } + submit.removeAttr('disabled'); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + return false; + }); + var smilies_loaded = false; + var smiley_frame = $('#shoutbox-smiley-frame'); + $('#shoutbox-smilies').click(function () { + smiley_frame.toggle(); + if (!smilies_loaded) { + smiley_frame.load('/smiley/farm/', function () { + $('#shoutbox-busy-icon').hide(); + var txt = $("#shoutbox-smiley-input")[0]; + $('#shoutbox-smiley-frame img').click(function() { + txt.value += ' ' + this.alt + ' '; + txt.focus(); + }); + smilies_loaded = true; + }); + } + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/static/js/shoutbox_app.js --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/static/js/shoutbox_app.js Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,60 @@ +$(document).ready(function() { + $('div.shoutbox-history table tr:odd').addClass('odd'); + $('.edit').editable('/shout/edit/', { + loadurl : '/shout/text/', + indicator : 'Saving...', + tooltip : 'Click to edit your shout...', + submit : 'OK', + cancel : 'Cancel' + }); + $('a.shout-del').click(function () { + if (confirm('Really delete this shout?')) { + var id = this.id; + if (id.match(/^shout-del-(\d+)/)) { + $.ajax({ + url: '/shout/delete/', + type: 'POST', + data: { id : RegExp.$1 }, + dataType: 'text', + success: function (id) { + var id = '#shout-del-' + id; + $(id).parents('tr').fadeOut(1500, function () { + $('div.shoutbox-history table tr:visible:even').removeClass('odd'); + $('div.shoutbox-history table tr:visible:odd').addClass('odd'); + }); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. ' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + } + } + return false; + }); + $('.shout-flag').click(function () { + var id = this.id; + if (id.match(/^shout-flag-(\d+)/)) { + id = RegExp.$1; + if (confirm('Only flag a shout if you feel it is spam, abuse, violates site rules, ' + + 'or is not appropriate. ' + + 'A moderator will be notified and will review the shout. ' + + 'Are you sure you want to flag this shout?')) { + $.ajax({ + url: '/shout/flag/', + type: 'POST', + data: { id : id }, + dataType: 'text', + success: function(response) { + alert(response); + }, + error: function (xhr, textStatus, ex) { + alert('Oops, an error occurred. 
' + xhr.statusText + ' - ' + + xhr.responseText); + } + }); + } + } + return false; + }); +}); diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/templatetags/shoutbox_tags.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/templatetags/shoutbox_tags.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,17 @@ +""" +Template tags for the shoutbox application. +""" +from django import template + +from shoutbox.models import Shout + +register = template.Library() + +@register.inclusion_tag('shoutbox/shoutbox.html', takes_context=True) +def shoutbox(context): + shouts = Shout.objects.select_related('user')[:10] + return { + 'shouts': shouts, + 'user': context['user'], + 'STATIC_URL': context['STATIC_URL'], + } diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/urls.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,15 @@ +""" +Urls for the Shoutbox application. +""" + +from django.conf.urls import patterns, url + +urlpatterns = patterns('shoutbox.views', + url(r'^delete/$', 'delete', name='shoutbox-delete'), + url(r'^edit/$', 'edit', name='shoutbox-edit'), + url(r'^flag/$', 'flag', name='shoutbox-flag'), + url(r'^shout/$', 'shout', name='shoutbox-shout'), + url(r'^text/$', 'text', name='shoutbox-text'), + url(r'^view/(\d+)/$', 'view_shout', name='shoutbox-view'), + url(r'^view/history/$', 'view_history', name='shoutbox-history'), +) diff -r c525f3e0b5d0 -r ee87ea74d46b shoutbox/views.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/shoutbox/views.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,156 @@ +""" +Views for the Shoutbox application. +""" + +import re +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.core.paginator import InvalidPage +from django.http import HttpResponse +from django.http import HttpResponseBadRequest +from django.http import HttpResponseForbidden +from django.http import HttpResponseRedirect +from django.http import Http404 +from django.contrib.auth.decorators import login_required +from django.views.decorators.http import require_POST + +from core.paginator import DiggPaginator +from core.functions import email_admins +from core.functions import get_page +from shoutbox.forms import ShoutBoxForm +from shoutbox.models import Shout +from shoutbox.models import ShoutFlag + +SHOUTS_PER_PAGE = 10 + +@login_required +@require_POST +def shout(request): + msg = request.POST.get('msg', '').strip() + if msg == '': + return HttpResponse('') + + shout = Shout(user=request.user, shout=msg) + shout.save() + return render_to_response('shoutbox/shout.html', { + 'shout': shout, + }, + context_instance = RequestContext(request)) + + +def view_shout(request, id): + """This view is for viewing an individual shout.""" + try: + shout = Shout.objects.get(pk=id) + except Shout.DoesNotExist: + return render_to_response('shoutbox/missing_shout.html', {}, + context_instance = RequestContext(request)) + + return render_to_response('shoutbox/view_shout.html', { + 'shout': shout, + }, + context_instance = RequestContext(request)) + + +def view_history(request): + """This view allows one to view the shoutbox history.""" + paginator = DiggPaginator(Shout.objects.all().select_related(), + SHOUTS_PER_PAGE, body=5, tail=3, margin=3, padding=2) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('shoutbox/view.html', { + 'page': the_page, + }, + context_instance = RequestContext(request)) + + +shout_id_re 
= re.compile(r'shout-(\d+)') + +def text(request): + """This view function retrieves the text of a shout; it is used in the in-place + editing of shouts on the shoutbox history view.""" + if request.user.is_authenticated(): + m = shout_id_re.match(request.GET.get('id', '')) + if m is None: + return HttpResponseBadRequest() + try: + shout = Shout.objects.get(pk=m.group(1)) + except Shout.DoesNotExist: + return HttpResponseBadRequest() + return HttpResponse(shout.shout) + + return HttpResponseForbidden() + + +def edit(request): + """This view accepts a shoutbox edit from the shoutbox history view.""" + if request.user.is_authenticated(): + m = shout_id_re.match(request.POST.get('id', '')) + if m is None: + return HttpResponseBadRequest() + try: + shout = Shout.objects.get(pk=m.group(1)) + except Shout.DoesNotExist: + return HttpResponseBadRequest() + if request.user != shout.user: + return HttpResponseForbidden() + new_shout = request.POST.get('value', '').strip() + if new_shout == '': + return HttpResponseBadRequest() + shout.shout = new_shout + shout.save() + return HttpResponse(shout.html) + + return HttpResponseForbidden() + + +def delete(request): + """This view deletes a shout. It is called by AJAX from the shoutbox history view.""" + if request.user.is_authenticated(): + id = request.POST.get('id', None) + if id is None or not id.isdigit(): + return HttpResponseBadRequest() + try: + shout = Shout.objects.get(pk=id) + except Shout.DoesNotExist: + return HttpResponseBadRequest() + if request.user != shout.user: + return HttpResponseForbidden() + shout.delete() + return HttpResponse(id) + + return HttpResponseForbidden() + + +@require_POST +def flag(request): + """ + This function handles the flagging of shouts by users. This function should + be the target of an AJAX post. + """ + if not request.user.is_authenticated(): + return HttpResponse('Please login or register to flag a shout.') + + id = request.POST.get('id', None) + if id is None: + return HttpResponseBadRequest('No id') + + try: + shout = Shout.objects.get(pk=id) + except Shout.DoesNotExist: + return HttpResponseBadRequest('No shout with id %s' % id) + + flag = ShoutFlag(user=request.user, shout=shout) + flag.save() + email_admins('A Shout Has Been Flagged', """Hello, + +A user has flagged a shout for review. +""") + return HttpResponse('The shout was flagged. A moderator will review the shout shortly. ' \ + 'Thanks for helping to improve the quality of this site.') + +# vim: ts=4 sw=4 diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/smiley/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,70 @@ +""" +Smiley classes and functions. +""" +import re + +from django.utils.safestring import SafeData +from django.utils.html import conditional_escape + +from smiley.models import Smiley + + +class SmilifyHtml(object): + """ + A class to "smilify" text by replacing text with HTML img tags for smiley + images. + """ + def __init__(self): + self.map = Smiley.objects.get_smiley_map() + + def convert(self, value, autoescape=False): + """ + Converts and returns the supplied text with the HTML version of the + smileys. 
+ """ + if not value: + return u'' + + if not autoescape or isinstance(value, SafeData): + esc = lambda x: x + else: + esc = conditional_escape + + words = value.split() + for i, word in enumerate(words): + if word in self.map: + words[i] = self.map[word] + else: + words[i] = esc(words[i]) + return u' '.join(words) + + +class SmilifyMarkdown(object): + """ + A class to "smilify" text by replacing text with Markdown image syntax for + smiley images. + """ + def __init__(self): + self.regexes = Smiley.objects.get_smiley_regexes() + + def convert(self, s): + """ + Returns a string copy of the input s that has the smiley codes replaced + with Markdown for smiley images. + """ + if not s: + return u'' + + for regex, repl in self.regexes: + s = regex.sub(repl, s) + return s + + +def smilify_html(value, autoescape=False): + """ + A convenience function to "smilify" text by replacing text with HTML + img tags of smilies. + """ + s = SmilifyHtml() + return s.convert(value, autoescape=autoescape) + diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/smiley/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,12 @@ +""" +This file contains the automatic admin site definitions for the Smiley models. +""" + +from django.contrib import admin +from smiley.models import Smiley + +class SmileyAdmin(admin.ModelAdmin): + list_display = ('title', 'code', 'html', 'is_extra') + list_filter = ('is_extra', ) + +admin.site.register(Smiley, SmileyAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/fixtures/smilies.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/smiley/fixtures/smilies.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,832 @@ +[ + { + "pk": 61, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_iagree.gif", + "code": ":agree:", + "title": "Agree" + } + }, + { + "pk": 57, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_angel_1.gif", + "code": ":angel:", + "title": "Angel" + } + }, + { + "pk": 22, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/upset.gif", + "code": ":argh:", + "title": "Argh" + } + }, + { + "pk": 42, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_arrow.gif", + "code": ":arrow:", + "title": "Arrow" + } + }, + { + "pk": 5, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/biggrin.gif", + "code": ":-D", + "title": "Big Grin" + } + }, + { + "pk": 7, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/bigrazz.gif", + "code": ":-P", + "title": "Big Razz" + } + }, + { + "pk": 55, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_blah.gif", + "code": ":blah:", + "title": "Blah Blah" + } + }, + { + "pk": 52, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_beer.gif", + "code": ":cheers:", + "title": "Cheers" + } + }, + { + "pk": 28, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_confused.gif", + "code": ":?", + "title": "Confused" + } + }, + { + "pk": 8, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/confused.gif", + "code": "o_O", + "title": "Confused" + } + }, + { + "pk": 4, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/cool.gif", + "code": "8^)", + "title": "Cool" + } + }, + { + 
"pk": 29, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_cool.gif", + "code": "8)", + "title": "Cool" + } + }, + { + "pk": 9, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/cry.gif", + "code": ":-(", + "title": "Cry" + } + }, + { + "pk": 34, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_cry_1.gif", + "code": ":cry:", + "title": "Crying" + } + }, + { + "pk": 10, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/dead.gif", + "code": "x_x", + "title": "Dead" + } + }, + { + "pk": 77, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_tumbleweed.gif", + "code": ":tumbleweed:", + "title": "Dead Thread" + } + }, + { + "pk": 60, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_drool.gif", + "code": ":drool:", + "title": "Drool" + } + }, + { + "pk": 47, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_augh.gif", + "code": ":bonk:", + "title": "Duh" + } + }, + { + "pk": 33, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_redface.gif", + "code": ":oops:", + "title": "Embarassed" + } + }, + { + "pk": 11, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/embarrassed.gif", + "code": ":-#", + "title": "Embarrassed" + } + }, + { + "pk": 35, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_evil.gif", + "code": ":evil:", + "title": "Evil" + } + }, + { + "pk": 39, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_exclaim.gif", + "code": ":!:", + "title": "Exclamation" + } + }, + { + "pk": 83, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_facepalm1.gif", + "code": ":facepalm:", + "title": "Face Palm" + } + }, + { + "pk": 59, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_fight.gif", + "code": ":fight:", + "title": "Fight" + } + }, + { + "pk": 63, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_groovy.gif", + "code": ":groovy:", + "title": "Groovy" + } + }, + { + "pk": 73, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_guitar.gif", + "code": ":guitar:", + "title": "Guitar" + } + }, + { + "pk": 80, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/headbang_1.gif", + "code": ":headbang:", + "title": "Headbang" + } + }, + { + "pk": 69, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_help.gif", + "code": ":help:", + "title": "Help" + } + }, + { + "pk": 71, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_hmm.gif", + "code": ":hmmm:", + "title": "Hmmm" + } + }, + { + "pk": 41, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_idea.gif", + "code": ":idea:", + "title": "Idea" + } + }, + { + "pk": 62, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_money.gif", + "code": ":$$:", + "title": "Ka-Ching!!!" 
+ } + }, + { + "pk": 72, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_huglove.gif", + "code": ":kiss:", + "title": "Kiss" + } + }, + { + "pk": 45, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_lame.gif", + "code": ":lame:", + "title": "Lame" + } + }, + { + "pk": 30, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_lol.gif", + "code": ":lol:", + "title": "Laughing" + } + }, + { + "pk": 54, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_lmao.gif", + "code": ":lmao:", + "title": "LMAO" + } + }, + { + "pk": 13, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/laugh.gif", + "code": ":lol:", + "title": "LOL" + } + }, + { + "pk": 14, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/mad.gif", + "code": "X-(", + "title": "Mad" + } + }, + { + "pk": 31, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_mad.gif", + "code": ":x", + "title": "Mad" + } + }, + { + "pk": 78, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_monkey.gif", + "code": ":monkey:", + "title": "Monkey" + } + }, + { + "pk": 44, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_mrgreen.gif", + "code": ":mrgreen:", + "title": "Mr. Green" + } + }, + { + "pk": 43, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_neutral.gif", + "code": ":neutral:", + "title": "Neutral" + } + }, + { + "pk": 3, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/no.gif", + "code": ":no:", + "title": "No" + } + }, + { + "pk": 15, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/none.gif", + "code": ":-|", + "title": "None" + } + }, + { + "pk": 53, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_offtopic.gif", + "code": ":ot:", + "title": "Off Topic" + } + }, + { + "pk": 46, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_oops.gif", + "code": ":omg:", + "title": "OMG" + } + }, + { + "pk": 81, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_oyvey.gif", + "code": ":oyvey:", + "title": "Oy Vey" + } + }, + { + "pk": 75, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_paranoid.gif", + "code": ":paranoid:", + "title": "Paranoid" + } + }, + { + "pk": 84, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_facepalm2.gif", + "code": ":facepalm2:", + "title": "Picard Face Palm" + } + }, + { + "pk": 58, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_puke.gif", + "code": ":puke:", + "title": "Puke" + } + }, + { + "pk": 40, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_question.gif", + "code": ":?:", + "title": "Question" + } + }, + { + "pk": 32, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_razz.gif", + "code": ":P", + "title": "Razz" + } + }, + { + "pk": 85, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_rimshot.gif", + "code": ":rimshot:", + "title": "Rimshot" + 
} + }, + { + "pk": 49, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_rock.gif", + "code": ":rock:", + "title": "Rock" + } + }, + { + "pk": 37, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_rolleyes.gif", + "code": ":roll:", + "title": "Rolling Eyes" + } + }, + { + "pk": 51, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_rotfl.gif", + "code": ":rotfl:", + "title": "ROTFL" + } + }, + { + "pk": 74, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_rtfm.gif", + "code": ":rtfm:", + "title": "RTFM" + } + }, + { + "pk": 25, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_sad.gif", + "code": ":(", + "title": "Sad" + } + }, + { + "pk": 67, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_sg101.gif", + "code": ":sg101:", + "title": "SG101!" + } + }, + { + "pk": 1, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/bigeek.gif", + "code": ":shock:", + "title": "Shock" + } + }, + { + "pk": 27, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_eek.gif", + "code": "8O", + "title": "Shocked" + } + }, + { + "pk": 19, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/sigh.gif", + "code": ":sigh:", + "title": "Sigh" + } + }, + { + "pk": 68, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_sleep.gif", + "code": ":zzz:", + "title": "Sleeping" + } + }, + { + "pk": 24, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_smile.gif", + "code": ":)", + "title": "Smile" + } + }, + { + "pk": 20, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/smile.gif", + "code": ":-)", + "title": "Smile" + } + }, + { + "pk": 70, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_sorry.gif", + "code": ":sorry:", + "title": "Sorry" + } + }, + { + "pk": 56, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_stir.gif", + "code": ":stir:", + "title": "Stir the Pot" + } + }, + { + "pk": 66, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_hitsfan.gif", + "code": ":hits-fan:", + "title": "Stuff Hits the Fan" + } + }, + { + "pk": 79, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_suicide.gif", + "code": ":suicide:", + "title": "Suicide" + } + }, + { + "pk": 26, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_surprised.gif", + "code": ":o", + "title": "Surprised" + } + }, + { + "pk": 76, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_hijack.gif", + "code": ":hijack:", + "title": "Thread Hijack" + } + }, + { + "pk": 65, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_thumbsdown.gif", + "code": ":thumbs-down:", + "title": "Thumbs Down" + } + }, + { + "pk": 64, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_thumbsup.gif", + "code": ":thumbs-up:", + "title": "Thumbs Up" + } + }, + { + "pk": 36, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": 
"smiley/images/icon_twisted.gif", + "code": ":twisted:", + "title": "Twisted Evil" + } + }, + { + "pk": 21, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/uhoh.gif", + "code": ":uh-oh:", + "title": "Uh-Oh" + } + }, + { + "pk": 23, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_biggrin.gif", + "code": ":D", + "title": "Very Happy" + } + }, + { + "pk": 50, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_twak.gif", + "code": ":whack:", + "title": "Whack" + } + }, + { + "pk": 17, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/rolleyes.gif", + "code": ":whatever:", + "title": "Whatever" + } + }, + { + "pk": 6, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/smilewinkgrin.gif", + "code": ";-)", + "title": "Wink" + } + }, + { + "pk": 38, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_wink.gif", + "code": ":wink:", + "title": "Wink" + } + }, + { + "pk": 48, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_worship.gif", + "code": ":worship:", + "title": "Worship" + } + }, + { + "pk": 82, + "model": "smiley.smiley", + "fields": { + "is_extra": true, + "image": "smiley/images/icon_wtf.gif", + "code": ":wtf:", + "title": "WTF?" + } + }, + { + "pk": 2, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/yes.gif", + "code": ":yes:", + "title": "Yes" + } + }, + { + "pk": 18, + "model": "smiley.smiley", + "fields": { + "is_extra": false, + "image": "smiley/images/sleep.gif", + "code": ":sleep:", + "title": "Zzzzz" + } + } +] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/smiley/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,88 @@ +""" +Models for the smiley application. +""" +import re + +from django.db import models +from django.core.cache import cache + +CACHE_TIMEOUT = 60 * 60 # seconds + + +class SmileyManager(models.Manager): + + def get_smiley_map(self): + """ + Returns a dictionary, the keys are smiley codes. + The values are the HTML representations of the keys. + The dictionary is cached. + """ + map = cache.get('smiley_map') + if map: + return map + + map = dict((s.code, s.html()) for s in self.all()) + cache.set('smiley_map', map, CACHE_TIMEOUT) + return map + + def get_smilies(self, extra=False): + """ + Returns smiley model instances filtered by the extra flag. + """ + key = 'smileys' if not extra else 'smileys_extra' + smilies = cache.get(key) + if smilies: + return smilies + + smilies = self.filter(is_extra=extra) + cache.set(key, smilies, CACHE_TIMEOUT) + return smilies + + def get_smiley_regexes(self): + """ + Returns a list of 2-tuples of the form: (regex, repl) + where regex is a regular expression for a smiley and + repl is the replacement image in Markdown format. 
+        """
+        regexes = cache.get('smiley_regexes')
+        if regexes:
+            return regexes
+
+        regexes = [(re.compile(r"(^|\s|(?<=\s))%s(\s|$)" % re.escape(s.code)),
+                    r"\1%s\2" % s.markdown()) for s in self.all()]
+        cache.set('smiley_regexes', regexes, CACHE_TIMEOUT)
+        return regexes
+
+
+class Smiley(models.Model):
+    image = models.ImageField(upload_to='smiley/images/')
+    title = models.CharField(max_length=32)
+    code = models.CharField(max_length=32)
+    is_extra = models.BooleanField()
+
+    objects = SmileyManager()
+
+    class Meta:
+        verbose_name_plural = 'Smilies'
+        ordering = ('title', )
+
+    def __unicode__(self):
+        return self.title
+
+    def get_absolute_url(self):
+        return self.image.url
+
+    def html(self):
+        """Returns a HTML img tag representation of the smiley."""
+        if self.image:
+            return (u'<img src="%s" alt="%s" title="%s" />' %
+                    (self.get_absolute_url(), self.title, self.title))
+        return u''
+    html.allow_tags = True
+
+    def markdown(self):
+        """Returns a markdown representation of the smiley."""
+        if self.image:
+            return (u'![%s](%s "%s")' %
+                    (self.title, self.get_absolute_url(), self.title))
+        return u''
diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/templatetags/smiley_tags.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/smiley/templatetags/smiley_tags.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,26 @@
+"""
+Template tags for the smiley application.
+"""
+from django import template
+from django.template.defaultfilters import stringfilter
+from django.utils.safestring import mark_safe
+
+from smiley.models import Smiley
+
+register = template.Library()
+
+
+@register.filter
+@stringfilter
+def smiley_html(value, autoescape=False):
+    """A filter to "smilify" text by replacing text with HTML img tags of smilies."""
+    from smiley import smilify_html
+    return mark_safe(smilify_html(value, autoescape=autoescape))
+smiley_html.needs_autoescape = True
+
+
+@register.inclusion_tag('smiley/smiley_farm.html')
+def smiley_farm():
+    """An inclusion tag that displays all of the smilies in clickable form."""
+    return {'smilies': Smiley.objects.get_smilies(), }
+
diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/urls.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/smiley/urls.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,10 @@
+"""
+Urls for the Smiley application.
+"""
+
+from django.conf.urls import patterns, url
+
+urlpatterns = patterns('smiley.views',
+    url(r'^farm/$', 'farm', name='smiley-farm'),
+    url(r'^farm/extra/$', 'farm', kwargs={'extra': True}, name='smiley-farm_extra'),
+)
diff -r c525f3e0b5d0 -r ee87ea74d46b smiley/views.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/smiley/views.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,18 @@
+"""
+Views for the Smiley application.
+""" +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.contrib.auth.decorators import login_required +from django.views.decorators.http import require_GET + +from smiley.models import Smiley + +@login_required +@require_GET +def farm(request, extra=False): + return render_to_response('smiley/smiley_farm.html', { + 'smilies': Smiley.objects.get_smilies(extra), + }, + context_instance = RequestContext(request)) + diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/weblinks/__init__.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,1 @@ +import signals diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/weblinks/admin.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,62 @@ +"""This file contains the automatic admin site definitions for the weblinks models""" +import datetime + +from django.contrib import admin +from weblinks.models import Category +from weblinks.models import PendingLink +from weblinks.models import Link +from weblinks.models import FlaggedLink + + +class CategoryAdmin(admin.ModelAdmin): + list_display = ('title', 'slug', 'description', 'count') + prepopulated_fields = {'slug': ('title', )} + readonly_fields = ('count', ) + + +class PendingLinkAdmin(admin.ModelAdmin): + list_display = ('title', 'url', 'user', 'category', 'date_added') + raw_id_fields = ('user', ) + actions = ('approve_links', ) + readonly_fields = ('update_date', ) + + def approve_links(self, request, qs): + for pending_link in qs: + link = Link(category=pending_link.category, + title=pending_link.title, + url=pending_link.url, + description=pending_link.description, + user=pending_link.user, + date_added=datetime.datetime.now(), + hits=0, + is_public=True) + link.save() + pending_link.delete() + + count = len(qs) + msg = "1 link" if count == 1 else "%d links" % count + self.message_user(request, "%s approved." 
% msg) + + approve_links.short_description = "Approve selected links" + + +class LinkAdmin(admin.ModelAdmin): + list_display = ('title', 'url', 'category', 'date_added', 'hits', 'is_public') + list_filter = ('date_added', 'is_public', 'category') + date_hierarchy = 'date_added' + ordering = ('-date_added', ) + search_fields = ('title', 'description', 'url', 'user__username') + raw_id_fields = ('user', ) + readonly_fields = ('update_date', ) + save_on_top = True + + +class FlaggedLinkAdmin(admin.ModelAdmin): + list_display = ('__unicode__', 'url', 'get_link_url', 'user', 'date_flagged') + date_hierarchy = 'date_flagged' + raw_id_fields = ('user', ) + +admin.site.register(Category, CategoryAdmin) +admin.site.register(PendingLink, PendingLinkAdmin) +admin.site.register(Link, LinkAdmin) +admin.site.register(FlaggedLink, FlaggedLinkAdmin) diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/fixtures/weblinks_categories.json --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/weblinks/fixtures/weblinks_categories.json Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,112 @@ +[ + { + "pk": 1, + "model": "weblinks.category", + "fields": { + "count": 215, + "description": "", + "slug": "bands", + "title": "Bands" + } + }, + { + "pk": 5, + "model": "weblinks.category", + "fields": { + "count": 21, + "description": "", + "slug": "fan-sites", + "title": "Fan Sites" + } + }, + { + "pk": 4, + "model": "weblinks.category", + "fields": { + "count": 28, + "description": "", + "slug": "gear", + "title": "Gear" + } + }, + { + "pk": 2, + "model": "weblinks.category", + "fields": { + "count": 7, + "description": "", + "slug": "music-merchants", + "title": "Music Merchants" + } + }, + { + "pk": 8, + "model": "weblinks.category", + "fields": { + "count": 6, + "description": "", + "slug": "other", + "title": "Other" + } + }, + { + "pk": 11, + "model": "weblinks.category", + "fields": { + "count": 17, + "description": "Do you have a photo gallery of surf bands somewhere on the web? Why not add a link to it here?", + "slug": "photo-galleries", + "title": "Photo Galleries" + } + }, + { + "pk": 10, + "model": "weblinks.category", + "fields": { + "count": 4, + "description": "", + "slug": "podcasts", + "title": "Podcasts" + } + }, + { + "pk": 6, + "model": "weblinks.category", + "fields": { + "count": 8, + "description": "", + "slug": "radio", + "title": "Radio" + } + }, + { + "pk": 3, + "model": "weblinks.category", + "fields": { + "count": 13, + "description": "", + "slug": "record-labels", + "title": "Record Labels" + } + }, + { + "pk": 7, + "model": "weblinks.category", + "fields": { + "count": 4, + "description": "", + "slug": "tablature", + "title": "Tablature" + } + }, + { + "pk": 9, + "model": "weblinks.category", + "fields": { + "count": 31, + "description": "Links to surf videos on the web", + "slug": "videos", + "title": "Videos" + } + } +] \ No newline at end of file diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/forms.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/weblinks/forms.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,23 @@ +""" +Forms for the weblinks application. 
+""" + +from django import forms +from weblinks.models import PendingLink, Link + + +class AddLinkForm(forms.ModelForm): + title = forms.CharField(widget = forms.TextInput(attrs = {'size': 52})) + url = forms.CharField(widget = forms.TextInput(attrs = {'size': 52})) + + def clean_url(self): + new_url = self.cleaned_data['url'] + try: + Link.objects.get(url__iexact = new_url) + except Link.DoesNotExist: + return new_url + raise forms.ValidationError('That link already exists in our database.') + + class Meta: + model = PendingLink + exclude = ('user', 'date_added', 'update_date') diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/weblinks/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,139 @@ +""" +This module contains the models for the weblinks application. +""" +import datetime + +from django.db import models +from django.contrib.auth.models import User + + +class Category(models.Model): + """Links belong to categories""" + title = models.CharField(max_length=64) + slug = models.SlugField(max_length=64) + description = models.TextField(blank=True) + count = models.IntegerField(default=0) + + def __unicode__(self): + return self.title + + class Meta: + verbose_name_plural = 'Categories' + ordering = ('title', ) + + +class PublicLinkManager(models.Manager): + """The manager for all public links.""" + def get_query_set(self): + return super(PublicLinkManager, self).get_query_set().filter( + is_public=True).select_related() + + +class LinkBase(models.Model): + """Abstract model to aggregate common fields of a web link.""" + category = models.ForeignKey(Category) + title = models.CharField(max_length=128) + url = models.URLField(db_index=True) + description = models.TextField(blank=True) + user = models.ForeignKey(User) + date_added = models.DateTimeField(db_index=True) + update_date = models.DateTimeField(db_index=True, blank=True) + + class Meta: + abstract = True + + +class Link(LinkBase): + """Model to represent a web link""" + hits = models.IntegerField(default=0) + is_public = models.BooleanField(default=False, db_index=True) + + # Managers: + objects = models.Manager() + public_objects = PublicLinkManager() + + class Meta: + ordering = ('title', ) + + def __unicode__(self): + return self.title + + def save(self, *args, **kwargs): + if not self.pk: + if not self.date_added: + self.date_added = datetime.datetime.now() + self.update_date = self.date_added + else: + self.update_date = datetime.datetime.now() + + super(Link, self).save(*args, **kwargs) + + @models.permalink + def get_absolute_url(self): + return ('weblinks-link_detail', [str(self.id)]) + + def search_title(self): + return self.title + + def search_summary(self): + return self.description + + +class PendingLink(LinkBase): + """This model represents links that users submit. They must be approved by + an admin before they become visible on the site. 
+    """
+    class Meta:
+        ordering = ('date_added', )
+
+    def __unicode__(self):
+        return self.title
+
+    def save(self, *args, **kwargs):
+        if not self.pk:
+            self.date_added = datetime.datetime.now()
+            self.update_date = self.date_added
+        else:
+            self.update_date = datetime.datetime.now()
+
+        super(PendingLink, self).save(*args, **kwargs)
+
+
+class FlaggedLinkManager(models.Manager):
+
+    def create(self, link, user):
+        flagged_link = FlaggedLink(link = link, user = user, approved = False)
+        flagged_link.save()
+
+
+class FlaggedLink(models.Model):
+    """Model to represent links that have been flagged as broken by users"""
+    link = models.ForeignKey(Link)
+    user = models.ForeignKey(User)
+    date_flagged = models.DateField(auto_now_add = True)
+    approved = models.BooleanField(default = False,
+        help_text = 'Check this and save to remove the referenced link from the database')
+
+    objects = FlaggedLinkManager()
+
+    def save(self, *args, **kwargs):
+        if self.approved:
+            self.link.delete()
+            self.delete()
+        else:
+            super(FlaggedLink, self).save(*args, **kwargs)
+
+    def url(self):
+        return self.link.url
+
+    def get_link_url(self):
+        return '<a href="%s">Link #%d</a>' % (self.link.get_absolute_url(),
+                self.link.id)
+    get_link_url.allow_tags = True
+    get_link_url.short_description = "View Link on Site"
+
+    def __unicode__(self):
+        return self.link.title
+
+    class Meta:
+        ordering = ('-date_flagged', )
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/search_indexes.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/search_indexes.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,23 @@
+"""Haystack search index for the weblinks application."""
+from haystack.indexes import *
+from haystack import site
+from custom_search.indexes import CondQueuedSearchIndex
+
+from weblinks.models import Link
+
+
+class LinkIndex(CondQueuedSearchIndex):
+    text = CharField(document=True, use_template=True)
+    author = CharField(model_attr='user')
+    pub_date = DateTimeField(model_attr='date_added')
+
+    def index_queryset(self):
+        return Link.public_objects.all()
+
+    def get_updated_field(self):
+        return 'update_date'
+
+    def can_index(self, instance):
+        return instance.is_public
+
+site.register(Link, LinkIndex)
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/signals.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/signals.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,41 @@
+"""Signals for the weblinks application.
+We use signals to compute the denormalized category counts whenever a weblink
+is saved."""
+from django.db.models.signals import post_save
+from django.db.models.signals import post_delete
+
+from weblinks.models import Category, Link
+
+
+def on_link_save(sender, **kwargs):
+    """This function updates the count field for all categories.
+    It is called whenever a link is saved via a signal.
+    """
+    if kwargs['created']:
+        # we only have to update the parent category
+        link = kwargs['instance']
+        cat = link.category
+        cat.count = Link.public_objects.filter(category=cat).count()
+        cat.save()
+    else:
+        # update all categories just to be safe (an existing link could
+        # have been moved from one category to another)
+        cats = Category.objects.all()
+        for cat in cats:
+            cat.count = Link.public_objects.filter(category=cat).count()
+            cat.save()
+
+
+def on_link_delete(sender, **kwargs):
+    """This function updates the count field for the link's parent
+    category. It is called when a link is deleted via a signal.
+    """
+    # update the parent category
+    link = kwargs['instance']
+    cat = link.category
+    cat.count = Link.public_objects.filter(category=cat).count()
+    cat.save()
+
+
+post_save.connect(on_link_save, sender=Link, dispatch_uid='weblinks.signals')
+post_delete.connect(on_link_delete, sender=Link, dispatch_uid='weblinks.signals')
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/static/css/weblinks.css
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/static/css/weblinks.css Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,14 @@
+div.weblinks-link-sort {
+    padding-bottom: .5em;
+}
+
+ul.weblinks-link-options {
+    margin: 0;
+    padding-left: 0;
+    list-style-type: none;
+}
+
+ul.weblinks-link-options li {
+    display: inline;
+    padding: 0 5px;
+}
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/static/js/weblinks.js
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/static/js/weblinks.js Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,24 @@
+$(document).ready(function() {
+    $('a.weblinks-broken').click(function () {
+        var id = this.id;
+        if (id.match(/^link-(\d+)$/)) {
+            id = RegExp.$1;
+            if (confirm('Do you really want to report this link as broken? ' +
+                    'This will notify the site staff that the link is dead and that ' +
+                    'it may need to be deleted or revised.')) {
+                $.ajax({
+                    url: '/links/report/' + id + '/',
+                    type: 'POST',
+                    dataType: 'text',
+                    success: function (response, textStatus) {
+                        alert(response);
+                    },
+                    error: function (xhr, textStatus, ex) {
+                        alert('Oops, an error occurred: ' + xhr.statusText + ' - ' + xhr.responseText);
+                    }
+                });
+            }
+        }
+        return false;
+    });
+});
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/templatetags/weblinks_tags.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/templatetags/weblinks_tags.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,17 @@
+"""
+Template tags for the weblinks application.
+"""
+from django import template
+
+from weblinks.models import Link
+
+
+register = template.Library()
+
+
+@register.inclusion_tag('weblinks/latest_tag.html')
+def latest_weblinks():
+    links = Link.public_objects.order_by('-date_added')[:10]
+    return {
+        'links': links,
+    }
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/urls.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/urls.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,19 @@
+"""urls for the weblinks application"""
+from django.conf.urls import patterns, url
+
+urlpatterns = patterns('weblinks.views',
+    url(r'^$', 'link_index', name='weblinks-main'),
+    url(r'^add/$', 'add_link', name='weblinks-add_link'),
+    url(r'^add/thanks/$', 'add_thanks', name='weblinks-add_thanks'),
+    url(r'^category/(?P<slug>[\w\d-]+)/(?P<sort>title|date|rating|hits)/$',
+        'view_links',
+        name='weblinks-view_links'),
+    url(r'^detail/(\d+)/$',
+        'link_detail',
+        name='weblinks-link_detail'),
+    url(r'^new/$', 'new_links', name='weblinks-new_links'),
+    url(r'^popular/$', 'popular_links', name='weblinks-popular_links'),
+    url(r'^random/$', 'random_link', name='weblinks-random_link'),
+    url(r'^report/(\d+)/$', 'report_link', name='weblinks-report_link'),
+    url(r'^visit/(\d+)/$', 'visit', name="weblinks-visit"),
+)
diff -r c525f3e0b5d0 -r ee87ea74d46b weblinks/views.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/weblinks/views.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,199 @@
+"""
+Views for the weblinks application.
+""" + +import datetime +import random +from django.shortcuts import render_to_response +from django.template import RequestContext +from django.core.paginator import InvalidPage +from django.http import HttpResponse +from django.http import HttpResponseBadRequest +from django.http import HttpResponseRedirect +from django.contrib.auth.decorators import login_required +from django.shortcuts import get_object_or_404 +from django.core.urlresolvers import reverse +from django.db.models import Q +from django.http import Http404 +from django.views.decorators.http import require_POST + +from core.paginator import DiggPaginator +from core.functions import email_admins +from core.functions import get_page +from weblinks.models import Category +from weblinks.models import Link +from weblinks.models import FlaggedLink +from weblinks.forms import AddLinkForm + +####################################################################### + +LINKS_PER_PAGE = 10 + +def create_paginator(links): + return DiggPaginator(links, LINKS_PER_PAGE, body=5, tail=3, margin=3, padding=2) + +####################################################################### + +def link_index(request): + categories = Category.objects.all() + total_links = Link.public_objects.all().count() + return render_to_response('weblinks/index.html', { + 'categories': categories, + 'total_links': total_links, + }, + context_instance = RequestContext(request)) + +####################################################################### + +def new_links(request): + links = Link.public_objects.order_by('-date_added') + paginator = create_paginator(links) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('weblinks/link_summary.html', { + 'page': the_page, + 'title': 'Newest Links', + }, + context_instance = RequestContext(request)) + +####################################################################### + +def popular_links(request): + links = Link.public_objects.order_by('-hits') + paginator = create_paginator(links) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + return render_to_response('weblinks/link_summary.html', { + 'page': the_page, + 'title': 'Popular Links', + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def add_link(request): + if request.method == 'POST': + add_form = AddLinkForm(request.POST) + if add_form.is_valid(): + new_link = add_form.save(commit=False) + new_link.user = request.user + new_link.save() + email_admins('New link for approval', """Hello, + +A user has added a new link for your approval. 
+""") + return HttpResponseRedirect(reverse('weblinks-add_thanks')) + else: + add_form = AddLinkForm() + + return render_to_response('weblinks/add_link.html', { + 'add_form': add_form, + }, + context_instance = RequestContext(request)) + +####################################################################### + +@login_required +def add_thanks(request): + return render_to_response('weblinks/add_link.html', { + }, + context_instance = RequestContext(request)) + +####################################################################### + +# Maps URL component to database field name for the links table: + +LINK_FIELD_MAP = { + 'title': 'title', + 'date': '-date_added', + 'hits': '-hits' +} + +def view_links(request, slug, sort='title'): + try: + cat = Category.objects.get(slug=slug) + except Category.DoesNotExist: + raise Http404 + + if sort in LINK_FIELD_MAP: + order_by = LINK_FIELD_MAP[sort] + else: + sort = 'title' + order_by = LINK_FIELD_MAP['title'] + + links = Link.public_objects.filter(category=cat).order_by(order_by) + paginator = create_paginator(links) + page = get_page(request.GET) + try: + the_page = paginator.page(page) + except InvalidPage: + raise Http404 + + return render_to_response('weblinks/view_links.html', { + 's' : sort, + 'category' : cat, + 'page' : the_page, + }, + context_instance = RequestContext(request)) + +####################################################################### + +def _visit_link(request, link): + link.hits += 1 + link.save() + return HttpResponseRedirect(link.url) + +####################################################################### + +@require_POST +def visit(request, link_id): + link = get_object_or_404(Link, pk = link_id) + return _visit_link(request, link) + +####################################################################### + +@require_POST +def random_link(request): + ids = Link.public_objects.values_list('id', flat=True) + if not ids: + raise Http404 + id = random.choice(ids) + random_link = Link.public_objects.get(pk=id) + return _visit_link(request, random_link) + +####################################################################### + +@require_POST +def report_link(request, link_id): + """ + This function is the target of an AJAX POST to report a link as dead. + """ + if not request.user.is_authenticated(): + return HttpResponse('Please login or register to report a broken link.') + + try: + link = Link.objects.get(pk=link_id) + except Link.DoesNotExist: + return HttpResponseBadRequest("That link doesn't exist.") + + FlaggedLink.objects.create(link, request.user) + return HttpResponse("The link was reported. A moderator will review the " \ + "link shortly. Thanks for helping to improve the content on " \ + "this site.") + +####################################################################### + +def link_detail(request, id): + link = get_object_or_404(Link, pk=id) + return render_to_response('weblinks/link_detail.html', { + 'link': link, + }, + context_instance = RequestContext(request)) diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/management/commands/sync_ygroup_posts.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/ygroup/management/commands/sync_ygroup_posts.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,53 @@ +""" +sync_ygroup_posts.py - A management command to synchronize the yahoo group +archives by recomputing the de-normalized fields in the post objects. 
+ +""" +import optparse + +from django.core.management.base import NoArgsCommand, CommandError +from django.core.urlresolvers import reverse + +from ygroup.models import Thread, Post +import ygroup.views + + +class Command(NoArgsCommand): + help = """\ +This command synchronizes the ygroup application's post objects +by updating their de-normalized fields. +""" + option_list = NoArgsCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 100 posts to show progress"), + ) + + def handle_noargs(self, **opts): + + show_progress = opts.get('progress', False) or False + + threads = {} + self.stdout.write("Processing threads...\n") + for thread in Thread.objects.iterator(): + threads[thread.id] = [reverse('ygroup-thread_view', args=[thread.id]), + list(Post.objects.filter(thread=thread).values_list('id', flat=True))] + + self.stdout.write("Processing posts...\n") + n = 0 + for post in Post.objects.iterator(): + thread = threads[post.thread.id] + pos = thread[1].index(post.id) + page = pos / ygroup.views.POSTS_PER_PAGE + 1 + if page == 1: + post.thread_url = thread[0] + '#p%d' % (post.id, ) + else: + post.thread_url = thread[0] + '?page=%d#p%d' % (page, post.id) + post.save() + + n += 1 + if show_progress and n % 100 == 0: + self.stdout.write('.') + self.stdout.flush() + + self.stdout.write('\n') + diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/management/commands/sync_ygroup_threads.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/ygroup/management/commands/sync_ygroup_threads.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,39 @@ +""" +sync_ygroup_threads.py - A management command to synchronize the yahoo group +archives by recomputing the de-normalized fields in the thread objects. + +""" +import optparse + +from django.core.management.base import NoArgsCommand, CommandError + +from ygroup.models import Thread, Post +import ygroup.views + + +class Command(NoArgsCommand): + help = """\ +This command synchronizes the ygroup application's thread objects +by updating their de-normalized fields. +""" + option_list = NoArgsCommand.option_list + ( + optparse.make_option("-p", "--progress", action="store_true", + help="Output a . after every 50 threads to show progress"), + ) + + def handle_noargs(self, **opts): + + show_progress = opts.get('progress', False) or False + + n = 0 + for thread in Thread.objects.iterator(): + thread.post_count = Post.objects.filter(thread=thread).count() + thread.page = n / ygroup.views.THREADS_PER_PAGE + 1 + thread.save() + n += 1 + if n % 50 == 0: + self.stdout.write('.') + self.stdout.flush() + + self.stdout.write('\n') + diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/ygroup/models.py Sat May 05 17:10:48 2012 -0500 @@ -0,0 +1,55 @@ +""" +Models for the ygroup application, which is a read-only archive of messages +from the old Yahoo Group. 
+"""
+from django.db import models
+
+
+class Thread(models.Model):
+    title = models.CharField(max_length=255)
+    creation_date = models.DateTimeField()
+
+    # denormalized fields to reduce database hits
+    poster = models.CharField(max_length=128)
+    post_count = models.IntegerField(blank=True, default=0)
+    page = models.IntegerField(blank=True, default=1)
+
+    class Meta:
+        ordering = ('creation_date', )
+
+    def __unicode__(self):
+        return u'Thread %d, %s' % (self.pk, self.title)
+
+    @models.permalink
+    def get_absolute_url(self):
+        return ('ygroup-thread_view', [self.id])
+
+
+class Post(models.Model):
+    thread = models.ForeignKey(Thread, null=True, blank=True,
+        on_delete=models.SET_NULL, related_name='posts')
+    title = models.CharField(max_length=255)
+    creation_date = models.DateTimeField()
+    poster = models.CharField(max_length=128)
+    msg = models.TextField()
+
+    # precomputed URL to this post in the parent thread for efficiency
+    thread_url = models.URLField(blank=True)
+
+    class Meta:
+        ordering = ('creation_date', )
+        verbose_name = 'yahoo group post'
+        verbose_name_plural = 'yahoo group posts'
+
+    def __unicode__(self):
+        return u'Post %d, %s' % (self.pk, self.title)
+
+    @models.permalink
+    def get_absolute_url(self):
+        return ('ygroup-post_view', [], {'pk': self.id})
+
+    def search_title(self):
+        return self.title
+
+    def search_summary(self):
+        return self.msg
diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/search_indexes.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/ygroup/search_indexes.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,20 @@
+"""
+Haystack search index for the Yahoo Group archives application.
+
+"""
+from haystack.indexes import *
+from haystack import site
+from custom_search.indexes import CondQueuedSearchIndex
+
+from ygroup.models import Post
+
+
+class PostIndex(CondQueuedSearchIndex):
+    text = CharField(document=True, use_template=True)
+    pub_date = DateTimeField(model_attr='creation_date')
+
+    def get_updated_field(self):
+        return 'creation_date'
+
+
+site.register(Post, PostIndex)
diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/tests.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/ygroup/tests.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,16 @@
+"""
+This file demonstrates writing tests using the unittest module. These will pass
+when you run "manage.py test".
+
+Replace this with more appropriate tests for your application.
+"""
+
+from django.test import TestCase
+
+
+class SimpleTest(TestCase):
+    def test_basic_addition(self):
+        """
+        Tests that 1 + 1 always equals 2.
+        """
+        self.assertEqual(1 + 1, 2)
diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/urls.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/ygroup/urls.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,23 @@
+"""
+urls.py - URLs for the ygroup application.
+
+"""
+from django.conf.urls import patterns, url
+from django.views.generic import DetailView
+
+from ygroup.models import Post
+from ygroup.views import ThreadIndexView, ThreadView
+
+
+urlpatterns = patterns('',
+    url(r'^threads/$',
+        ThreadIndexView.as_view(),
+        name='ygroup-thread_index'),
+    url(r'^thread/(\d+)/$',
+        ThreadView.as_view(),
+        name='ygroup-thread_view'),
+    url(r'^post/(?P<pk>\d+)/$',
+        DetailView.as_view(model=Post, context_object_name='post'),
+        name='ygroup-post_view'),
+)
+
diff -r c525f3e0b5d0 -r ee87ea74d46b ygroup/views.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/ygroup/views.py Sat May 05 17:10:48 2012 -0500
@@ -0,0 +1,55 @@
+"""
+Views for the ygroup (Yahoo Group Archive) application.
+ +""" +from django.shortcuts import get_object_or_404 +from django.views.generic import ListView + +from ygroup.models import Thread, Post +from core.paginator import DiggPaginator + + +THREADS_PER_PAGE = 40 +POSTS_PER_PAGE = 20 + + +class ThreadIndexView(ListView): + """ + This generic view displays the list of threads available. + + """ + model = Thread + paginate_by = THREADS_PER_PAGE + + def get_paginator(self, queryset, per_page, **kwargs): + """ + Return an instance of the paginator for this view. + """ + return DiggPaginator(queryset, per_page, body=5, tail=2, + margin=3, padding=2, **kwargs) + + +class ThreadView(ListView): + """ + This generic view displays the posts in a thread. + + """ + context_object_name = "post_list" + template_name = "ygroup/thread.html" + paginate_by = POSTS_PER_PAGE + + def get_queryset(self): + self.thread = get_object_or_404(Thread, pk=self.args[0]) + return Post.objects.filter(thread=self.thread) + + def get_context_data(self, **kwargs): + context = super(ThreadView, self).get_context_data(**kwargs) + context['thread'] = self.thread + return context + + def get_paginator(self, queryset, per_page, **kwargs): + """ + Return an instance of the paginator for this view. + """ + return DiggPaginator(queryset, per_page, body=5, tail=2, + margin=3, padding=2, **kwargs)
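
Reviewer note (not part of the changeset above): the smiley conversion added in smiley/models.py and the SmilifyMarkdown class boils down to building one (regex, Markdown replacement) pair per smiley code and applying each pair in turn. The standalone sketch below illustrates that technique in plain Python, with a hypothetical SMILIES list standing in for Smiley.objects.all() and no Django model or cache layer; names and sample data here are assumptions for illustration only.

import re

# Hypothetical sample data standing in for Smiley.objects.all():
# (code, image path, title) per smiley.
SMILIES = [
    (':)', 'smiley/images/icon_smile.gif', 'Smile'),
    (':lol:', 'smiley/images/icon_lol.gif', 'Laughing'),
]


def build_regexes(smilies):
    """Mirror of get_smiley_regexes(): one (pattern, replacement) per code."""
    regexes = []
    for code, image, title in smilies:
        # Match the code only when surrounded by whitespace or line boundaries,
        # exactly as the pattern in SmileyManager.get_smiley_regexes() does.
        pattern = re.compile(r"(^|\s|(?<=\s))%s(\s|$)" % re.escape(code))
        markdown = '![%s](%s "%s")' % (title, image, title)
        regexes.append((pattern, r"\1%s\2" % markdown))
    return regexes


def smilify(text, regexes):
    """Mirror of SmilifyMarkdown.convert(): apply each regex in turn."""
    for pattern, repl in regexes:
        text = pattern.sub(repl, text)
    return text


if __name__ == '__main__':
    regexes = build_regexes(SMILIES)
    print(smilify('that was funny :lol: really', regexes))
    # -> that was funny ![Laughing](smiley/images/icon_lol.gif "Laughing") really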