Compare commits

...

44 Commits

Author SHA1 Message Date
3e51584b3c 0.6.0.4 2005-09-01 20:27:35 +00:00
4ff8a53084 2005-09-01 jrandom
* Don't send out a netDb store of a router if it is more than a few hours
      old, even if someone asked us for it.
2005-09-01 06:55:00 +00:00
ccb73437c4 2005-08-31 jrandom
* Don't publish leaseSets to the netDb if they will never be looked for -
      namely, if they are for destinations that only establish outbound
      streams.  I2PTunnel's 'client' and 'httpclient' proxies have been
      modified to tell the router that it doesn't need to publish their
      leaseSet (by setting the I2CP config option 'i2cp.dontPublishLeaseSet'
      to 'true').
    * Don't publish the top 10 peer rankings of each router in the netdb, as
      it isn't being watched right now.
2005-09-01 00:26:20 +00:00
b43114f61b 2005-08-31 jrandom
* Don't publish leaseSets to the netDb if they will never be looked for -
      namely, if they are for destinations that only establish outbound
      streams.  I2PTunnel's 'client' and 'httpclient' proxies have been
      modified to tell the router that it doesn't need to publish their
      leaseSet (by setting the I2CP config option 'i2cp.dontPublishLeaseSet'
      to 'true').
    * Don't publish the top 10 peer rankings of each router in the netdb, as
      it isn't being watched right now.
2005-09-01 00:20:16 +00:00
9bd87ab511 make it work with any host charset or content charset 2005-08-31 09:50:23 +00:00
b6ea55f7ef more error handling (thanks frosk) 2005-08-30 02:39:37 +00:00
5f18cec97d 2005-08-29 jrandom
* Added the new test Floodfill netDb
2005-08-30 02:04:17 +00:00
3ba921ec0e 2005-08-29 jrandom
* Added the new test Floodfill netDb
2005-08-30 01:59:11 +00:00
e313da254c 2005-08-27 jrandom
* Minor logging and optimization tweaks in the router and SDK
    * Use ISO-8859-1 in the XML files (thanks redzara!)
    * The consolePassword config property can now be used to bypass the router
      console's nonce checking, allowing CLI restarts
2005-08-27 22:46:22 +00:00
8660cf0d74 2005-08-27 jrandom
* Minor logging and optimization tweaks in the router and SDK
    * Use ISO-8859-1 in the XML files (thanks redzara!)
    * The consolePassword config property can now be used to bypass the router
      console's nonce checking, allowing CLI restarts
2005-08-27 22:15:35 +00:00
e0bfdff152 TZ asap 2005-08-25 21:08:13 +00:00
c27aed3603 fix up the entryId calc 2005-08-25 21:07:18 +00:00
cdc6002f0e no message 2005-08-25 21:01:15 +00:00
4cf3d9c1a2 HTTP file upload (rfc 1867) helper 2005-08-25 21:00:09 +00:00
0473e08e21 remote w0rks 2005-08-25 20:59:46 +00:00
346faa3de2 2005-08-24 jrandom
* Catch errors with corrupt tunnel messages more gracefully (no need to
      kill the thread and cause an OOM...)
    * Don't skip shitlisted peers for netDb store messages, as they aren't
      necessarily shitlisted by other people (though they probably are).
    * Adjust the netDb store per-peer timeout based on each particular peer's
      profile (timeout = 4x their average netDb store response time)
    * Don't republish leaseSets to *failed* peers - send them to peers who
      replied but just didn't know the value.
    * Set a 5 second timeout on the I2PTunnelHTTPServer reading the client's
      HTTP headers, rather than blocking indefinitely.  HTTP headers should be
      sent entirely within the first streaming packet anyway, so this won't be
      a problem.
    * Don't use the I2PTunnel*Server handler thread pool by default, as it may
      prevent any clients from accessing the server if the handlers get
      blocked by the streaming lib or other issues.
    * Don't overwrite a known status (OK/ERR-Reject/ERR-SymmetricNAT) with
      Unknown.
2005-08-24 22:55:25 +00:00
5ec6dca64d 2005-08-23 jrandom
* Removed the concept of "no bandwidth limit" - if none is specified, it's
      16KBps in/out.
    * Include ack packets in the per-peer cwin throttle (they were part of the
      bandwidth limit though).
    * Tweak the SSU cwin operation to get more accurate estimates under
      congestion.
    * SSU improvements to resend more efficiently.
    * Added a basic scheduler to eepget to fetch multiple files sequentially.
2005-08-23 22:43:51 +00:00
1a6b49cfb8 2005-08-23 jrandom
* Removed the concept of "no bandwidth limit" - if none is specified, it's
      16KBps in/out.
    * Include ack packets in the per-peer cwin throttle (they were part of the
      bandwidth limit though).
    * Tweak the SSU cwin operation to get more accurate estimates under
      congestion.
    * SSU improvements to resend more efficiently.
    * Added a basic scheduler to eepget to fetch multiple files sequentially.
2005-08-23 21:25:49 +00:00
c7b75df390 Added announcement about the new Irc2P server at irc.freshcoffee.i2p 2005-08-22 13:03:11 +00:00
f97c09291b 0.6.0.3 2005-08-21 19:21:50 +00:00
8f2a5b403c * 2005-08-21 0.6.0.3 released
2005-08-21  jrandom
    * If we already have an established SSU session with the Charlie helping
      test us, cancel the test with the status of "unknown".
2005-08-21 18:39:05 +00:00
ea41a90eae sanity checking 2005-08-21 18:37:57 +00:00
b1dd29e64d added syndie.i2p and syndiemedia.i2p 2005-08-21 18:33:58 +00:00
46e47c47ac ewps 2005-08-21 18:19:22 +00:00
b7bf431f0d [these are not the droids you are looking for] 2005-08-21 18:08:05 +00:00
7f432122d9 added irc.freshcoffee.i2p (new IRC server on the irc2p network) 2005-08-20 01:19:51 +00:00
e7be8c6097 Added references to the new irc2p server: irc.freshcoffee.i2p 2005-08-20 01:18:38 +00:00
adf56a16e1 2005-08-17 jrandom
* Revise the SSU peer testing protocol so that Bob verifies Charlie's
      viability before agreeing to Alice's request.  This doesn't work with
      older SSU peer test builds, but is backwards compatible (older nodes
      won't ask newer nodes to participate in tests, and newer nodes won't
      ask older nodes to either).
2005-08-17 20:16:27 +00:00
11204b8a2b 2005-08-17 jrandom
* Revise the SSU peer testing protocol so that Bob verifies Charlie's
      viability before agreeing to Alice's request.  This doesn't work with
      older SSU peer test builds, but is backwards compatible (older nodes
      won't ask newer nodes to participate in tests, and newer nodes won't
      ask older nodes to either).
2005-08-17 20:05:01 +00:00
cade27dceb added surrender.adab.i2p 2005-08-17 00:42:15 +00:00
5597d28e59 Removing references to irc.duck.i2p, adding references to irc.arcturus.i2p, and replacing current ircProxy default destination string with "irc.postman.i2p,irc.arcturus.i2p" 2005-08-16 09:35:58 +00:00
0502fec432 added terror.i2p 2005-08-15 18:44:04 +00:00
a6714fc2de Adding irc.arcturus.i2p, a new server for the soon-to-be Irc2P network 2005-08-14 15:52:12 +00:00
1219dadbd5 2005-08-12 jrandom
* Keep detailed stats on the peer testing, publishing the results in the
      netDb.
    * Don't overwrite the status with 'unknown' unless we haven't had a valid
      status in a while.
    * Make sure to avoid shitlisted peers for peer testing.
    * When we get an unknown result to a peer test, try again soon afterwards.
    * When a peer tells us that our address is different from what we expect,
      if we've done a recent peer test with a result of OK, fire off a peer
      test to make sure our IP/port is still valid.  If our test is old or the
      result was not OK, accept their suggestion, but queue up a peer test for
      later.
    * Don't try to do a netDb store to a shitlisted peer, and adjust the way
      we monitor netDb store progress (to clear up the high netDb.storePeers
      stat)
2005-08-12 23:54:46 +00:00
77b995f5ed 2005-08-10 jrandom
* Deployed the peer testing implementation to be run every few minutes on
      each router, as well as any time the user requests a test manually.  The
      tests do not reconfigure the ports at the moment, merely determine under
      what conditions the local router is reachable.  The status shown in the
      top left will be "ERR-SymmetricNAT" if the user's IP and port show up
      differently for different peers, "ERR-Reject" if the router cannot
      receive unsolicited packets or the peer helping test could not find a
      collaborator, "Unknown" if the test has not been run or the test
      participants were unreachable, or "OK" if the router can receive
      unsolicited connections and those connections use the same IP and port.
2005-08-10 23:55:40 +00:00
2f53b9ff68 0.6.0.2 2005-08-09 18:55:31 +00:00
d84d045849 deal with full windows without *cough* NPEs
(how many times can I cvs rtag -F before going crazy?)
2005-08-08 21:20:08 +00:00
d8e72dfe48 foo 2005-08-08 20:49:17 +00:00
88b9f7a74c "ERROR [eive on 8887] uter.transport.udp.UDPReceiver: Dropping inbound packet with 1 queued for 1912 packet handlers: Handlers: 3 handler 0 state: 2 handler 1 state: 2 handler 2 state: 2"
state = 2 means all three handlers are blocking on udpReceiver.receive())
this can legitimately happen if the bandwidth limiter or router throttle chokes the receive for >= 1s.
2005-08-08 20:42:13 +00:00
6a19501214 2005-08-08 jrandom
* Add a configurable throttle to the number of concurrent outbound SSU
      connection negotiations (via i2np.udp.maxConcurrentEstablish=4).  This
      may help those with slow connections to get integrated at the start.
    * Further fixlets to the streaming lib
2005-08-08 20:35:50 +00:00
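
For illustration only (not part of the commit): assuming the new option is read from router.config like other i2np.* router properties, a node on a slow link could tighten the throttle below the value of 4 quoted above, e.g.

    # router.config -- hypothetical excerpt
    i2np.udp.maxConcurrentEstablish=2
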
ba30b56c5f 2005-08-07 Complication
* Display the average clock skew for both SSU and TCP connections
2005-08-07  jrandom
    * Fixed the long-standing streaming lib bug where we could lose the first
      packet on retransmission.
    * Avoid an NPE when a message expires on the SSU queue.
    * Adjust the streaming lib's window growth factor with an additional
      Vegas-esque congestion detection algorithm.
    * Removed an unnecessary SSU session drop
    * Reduced the MTU (until we get a working PMTU lib)
    * Defer tunnel acceptance until we know how to reach the next hop,
      rejecting it if we can't find them in time.
    * If our netDb store of our leaseSet fails, give it a few seconds before
      republishing.
2005-08-07 19:31:58 +00:00
a375e4b2ce added more postman services (w3wt) 2005-08-07 19:27:22 +00:00
44fd71e17f added i2p-bt.postman.i2p 2005-08-05 21:20:30 +00:00
b41c378de9 Removed reference and link to Invisiblechat/IIP from the router console greeting page (because IIP's dead, Jim... how many times does it need to be said?) and added irc.postman.i2p. 2005-08-05 19:20:52 +00:00
157 changed files with 10865 additions and 739 deletions

View File

@ -111,12 +111,12 @@ public class Bogobot extends PircBot {
_botShutdownPassword = config.getProperty("botShutdownPassword", "take off eh");
_ircChannel = config.getProperty("ircChannel", "#i2p-chat");
_ircServer = config.getProperty("ircServer", "irc.duck.i2p");
_ircServer = config.getProperty("ircServer", "irc.postman.i2p");
_ircServerPort = Integer.parseInt(config.getProperty("ircServerPort", "6668"));
_isLoggerEnabled = Boolean.valueOf(config.getProperty("isLoggerEnabled", "true")).booleanValue();
_loggedHostnamePattern = config.getProperty("loggedHostnamePattern", "");
_logFilePrefix = config.getProperty("logFilePrefix", "irc.duck.i2p.i2p-chat");
_logFilePrefix = config.getProperty("logFilePrefix", "irc.postman.i2p.i2p-chat");
_logFileRotationInterval = config.getProperty("logFileRotationInterval", INTERVAL_DAILY);
_isRoundTripDelayEnabled = Boolean.valueOf(config.getProperty("isRoundTripDelayEnabled", "false")).booleanValue();

View File

@ -109,8 +109,9 @@ public class I2PTunnel implements Logging, EventDispatcher {
_tunnelId = ++__tunnelId;
_log = _context.logManager().getLog(I2PTunnel.class);
_event = new EventDispatcherImpl();
_clientOptions = new Properties();
_clientOptions.putAll(System.getProperties());
Properties p = new Properties();
p.putAll(System.getProperties());
_clientOptions = p;
_sessions = new ArrayList(1);
addConnectionEventListener(lsnr);

View File

@ -101,6 +101,10 @@ public abstract class I2PTunnelClientBase extends I2PTunnelTask implements Runna
this.l = l;
this.handlerName = handlerName + _clientId;
// no need to load the netDb with leaseSets for destinations that will never
// be looked up
tunnel.getClientOptions().setProperty("i2cp.dontPublishLeaseSet", "true");
while (sockMgr == null) {
synchronized (sockLock) {
if (ownDest) {
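
A minimal, self-contained sketch (hypothetical class, not part of the patch) of how any other I2CP client with an outbound-only destination could request the same behaviour; only the property name comes from the change above:

import java.util.Properties;

public class OutboundOnlyOptions {
    public static void main(String[] args) {
        // options that would eventually be handed to the I2CP session / socket manager
        // for a destination that only ever opens outbound streams
        Properties opts = new Properties();
        opts.putAll(System.getProperties());
        opts.setProperty("i2cp.dontPublishLeaseSet", "true"); // skip netDb publication
        System.out.println("client options: " + opts);
    }
}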

View File

@ -60,11 +60,13 @@ public class I2PTunnelHTTPServer extends I2PTunnelServer {
//local is fast, so synchronously. Does not need that many
//threads.
try {
socket.setReadTimeout(readTimeout);
// give them 5 seconds to send in the HTTP request
socket.setReadTimeout(5*1000);
String modifiedHeader = getModifiedHeader(socket);
if (_log.shouldLog(Log.DEBUG))
_log.debug("Modified header: [" + modifiedHeader + "]");
socket.setReadTimeout(readTimeout);
Socket s = new Socket(remoteHost, remotePort);
afterSocket = getTunnel().getContext().clock().now();
new I2PTunnelRunner(s, socket, slock, null, modifiedHeader.getBytes(), null);

View File

@ -11,6 +11,7 @@ import java.io.InputStream;
import java.net.InetAddress;
import java.net.Socket;
import java.net.SocketException;
import java.net.ConnectException;
import java.util.Iterator;
import java.util.Properties;
@ -39,6 +40,7 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
protected InetAddress remoteHost;
protected int remotePort;
private boolean _usePool;
private Logging l;
@ -46,15 +48,27 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
/** default timeout to 3 minutes - override if desired */
protected long readTimeout = DEFAULT_READ_TIMEOUT;
private static final boolean DEFAULT_USE_POOL = false;
public I2PTunnelServer(InetAddress host, int port, String privData, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host + ":" + port + " <- " + privData, notifyThis, tunnel);
ByteArrayInputStream bais = new ByteArrayInputStream(Base64.decode(privData));
String usePool = tunnel.getClientOptions().getProperty("i2ptunnel.usePool");
if (usePool != null)
_usePool = "true".equalsIgnoreCase(usePool);
else
_usePool = DEFAULT_USE_POOL;
init(host, port, bais, privData, l);
}
public I2PTunnelServer(InetAddress host, int port, File privkey, String privkeyname, Logging l,
EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host + ":" + port + " <- " + privkeyname, notifyThis, tunnel);
String usePool = tunnel.getClientOptions().getProperty("i2ptunnel.usePool");
if (usePool != null)
_usePool = "true".equalsIgnoreCase(usePool);
else
_usePool = DEFAULT_USE_POOL;
try {
init(host, port, new FileInputStream(privkey), privkeyname, l);
} catch (IOException ioe) {
@ -65,6 +79,11 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
public I2PTunnelServer(InetAddress host, int port, InputStream privData, String privkeyname, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host + ":" + port + " <- " + privkeyname, notifyThis, tunnel);
String usePool = tunnel.getClientOptions().getProperty("i2ptunnel.usePool");
if (usePool != null)
_usePool = "true".equalsIgnoreCase(usePool);
else
_usePool = DEFAULT_USE_POOL;
init(host, port, privData, privkeyname, l);
}
@ -178,22 +197,34 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
}
public void run() {
if (shouldUsePool()) {
I2PServerSocket i2pss = sockMgr.getServerSocket();
int handlers = getHandlerCount();
for (int i = 0; i < handlers; i++) {
I2PThread handler = new I2PThread(new Handler(i2pss), "Handle Server " + i);
handler.start();
}
/*
} else {
I2PServerSocket i2pss = sockMgr.getServerSocket();
while (true) {
I2PSocket i2ps = i2pss.accept();
if (i2ps == null) throw new I2PException("I2PServerSocket closed");
I2PThread t = new I2PThread(new Handler(i2ps));
t.start();
try {
final I2PSocket i2ps = i2pss.accept();
if (i2ps == null) throw new I2PException("I2PServerSocket closed");
new I2PThread(new Runnable() { public void run() { blockingHandle(i2ps); } }).start();
} catch (I2PException ipe) {
if (_log.shouldLog(Log.ERROR))
_log.error("Error accepting - KILLING THE TUNNEL SERVER", ipe);
return;
} catch (ConnectException ce) {
if (_log.shouldLog(Log.ERROR))
_log.error("Error accepting", ce);
// not killing the server..
}
}
*/
}
}
public boolean shouldUsePool() { return _usePool; }
/**
* minor thread pool to pull off the accept() concurrently. there are still lots
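
As an aside, a self-contained sketch (hypothetical class, not in the patch) of the option parsing that the three constructors above each repeat; the property name and default come from the diff:

import java.util.Properties;

class UsePoolOption {
    // default taken from the patch above
    private static final boolean DEFAULT_USE_POOL = false;

    /** Same rule the three constructors apply: an explicit value wins, otherwise the default. */
    static boolean shouldUsePool(Properties clientOptions) {
        String usePool = clientOptions.getProperty("i2ptunnel.usePool");
        return (usePool != null) ? "true".equalsIgnoreCase(usePool) : DEFAULT_USE_POOL;
    }

    public static void main(String[] args) {
        Properties opts = new Properties();
        opts.setProperty("i2ptunnel.usePool", "true");
        System.out.println(shouldUsePool(opts)); // prints "true"
    }
}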

View File

@ -1,4 +1,4 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE web-app
PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.2//EN"
"http://java.sun.com/j2ee/dtds/web-app_2.2.dtd">
@ -14,4 +14,4 @@
<welcome-file>index.html</welcome-file>
<welcome-file>index.jsp</welcome-file>
</welcome-file-list>
</web-app>
</web-app>

View File

@ -27,6 +27,7 @@ public class ConfigNetHandler extends FormHandler {
private boolean _guessRequested;
private boolean _reseedRequested;
private boolean _saveRequested;
private boolean _recheckReachabilityRequested;
private boolean _timeSyncEnabled;
private String _tcpPort;
private String _udpPort;
@ -44,6 +45,8 @@ public class ConfigNetHandler extends FormHandler {
reseed();
} else if (_saveRequested) {
saveChanges();
} else if (_recheckReachabilityRequested) {
recheckReachability();
} else {
// noop
}
@ -53,6 +56,7 @@ public class ConfigNetHandler extends FormHandler {
public void setReseed(String moo) { _reseedRequested = true; }
public void setSave(String moo) { _saveRequested = true; }
public void setEnabletimesync(String moo) { _timeSyncEnabled = true; }
public void setRecheckReachability(String moo) { _recheckReachabilityRequested = true; }
public void setHostname(String hostname) {
_hostname = (hostname != null ? hostname.trim() : null);
@ -195,6 +199,11 @@ public class ConfigNetHandler extends FormHandler {
fos.close();
}
private void recheckReachability() {
_context.commSystem().recheckReachability();
addFormNotice("Rechecking router reachability...");
}
/**
* The user made changes to the network config and wants to save them, so
* lets go ahead and do so.

View File

@ -20,6 +20,7 @@ public class FormHandler {
protected Log _log;
private String _nonce;
protected String _action;
protected String _passphrase;
private List _errors;
private List _notices;
private boolean _processed;
@ -32,6 +33,7 @@ public class FormHandler {
_processed = false;
_valid = true;
_nonce = null;
_passphrase = null;
}
/**
@ -51,6 +53,7 @@ public class FormHandler {
public void setNonce(String val) { _nonce = val; }
public void setAction(String val) { _action = val; }
public void setPassphrase(String val) { _passphrase = val; }
/**
* Override this to perform the final processing (in turn, adding formNotice
@ -119,8 +122,14 @@ public class FormHandler {
String noncePrev = System.getProperty(getClass().getName() + ".noncePrev");
if ( ( (nonce == null) || (!_nonce.equals(nonce)) ) &&
( (noncePrev == null) || (!_nonce.equals(noncePrev)) ) ) {
addFormError("Invalid nonce, are you being spoofed?");
_valid = false;
String expected = _context.getProperty("consolePassword");
if ( (expected != null) && (expected.trim().length() > 0) && (expected.equals(_passphrase)) ) {
// ok
} else {
addFormError("Invalid nonce, are you being spoofed?");
_valid = false;
}
}
}
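
Tying this back to the changelog entry above: a minimal illustration, not from the patch, assuming consolePassword is set in router.config like other router properties (the value below is made up). A CLI restart script can then authenticate its form submission by sending a matching passphrase parameter (per the setPassphrase setter above) instead of a valid nonce:

    # router.config -- hypothetical excerpt
    consolePassword=s3cret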

View File

@ -12,6 +12,7 @@ import net.i2p.data.Destination;
import net.i2p.data.LeaseSet;
import net.i2p.stat.Rate;
import net.i2p.stat.RateStat;
import net.i2p.router.CommSystemFacade;
import net.i2p.router.Router;
import net.i2p.router.RouterContext;
import net.i2p.router.RouterVersion;
@ -97,6 +98,23 @@ public class SummaryHelper {
return (_context.netDb().getKnownRouters() < 10);
}
public int getAllPeers() { return _context.netDb().getKnownRouters(); }
public String getReachability() {
int status = _context.commSystem().getReachabilityStatus();
switch (status) {
case CommSystemFacade.STATUS_OK:
return "OK";
case CommSystemFacade.STATUS_DIFFERENT:
return "ERR-SymmetricNAT";
case CommSystemFacade.STATUS_REJECT_UNSOLICITED:
return "ERR-Reject";
case CommSystemFacade.STATUS_UNKNOWN: // fallthrough
default:
return "Unknown";
}
}
/**
* Retrieve amount of used memory.
*
@ -189,6 +207,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.receiveMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
@ -206,6 +225,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.sendMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
@ -224,6 +244,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.receiveMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(5*60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
@ -242,6 +263,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.sendMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(5*60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);

View File

@ -35,6 +35,8 @@ this port from arbitrary peers (this requirement will be removed in i2p 0.6.1, b
TCP port: <input name="tcpPort" type="text" size="5" value="<jsp:getProperty name="nethelper" property="tcpPort" />" /> <br />
<b>You must poke a hole in your firewall or NAT (if applicable) so that you can receive inbound TCP
connections on it (this requirement will be removed in i2p 0.6.1, but is necessary now)</b>
<br />
<input type="submit" name="recheckReachability" value="Check network reachability..." />
<hr />
<b>Bandwidth limiter</b><br />

View File

@ -14,7 +14,8 @@
<b>Version:</b> <jsp:getProperty name="helper" property="version" /><br />
<b>Uptime:</b> <jsp:getProperty name="helper" property="uptime" /><br />
<b>Now:</b> <jsp:getProperty name="helper" property="time" /><br />
<b>Memory:</b> <jsp:getProperty name="helper" property="memory" /><br /><%
<b>Memory:</b> <jsp:getProperty name="helper" property="memory" /><br />
<b>Status:</b> <a href="config.jsp"><jsp:getProperty name="helper" property="reachability" /></a><br /><%
if (helper.updateAvailable()) {
if ("true".equals(System.getProperty("net.i2p.router.web.UpdateHandler.updateInProgress", "false"))) {
out.print(update.getStatus());
@ -39,7 +40,8 @@
<b>High capacity:</b> <jsp:getProperty name="helper" property="highCapacityPeers" /><br />
<b>Well integrated:</b> <jsp:getProperty name="helper" property="wellIntegratedPeers" /><br />
<b>Failing:</b> <jsp:getProperty name="helper" property="failingPeers" /><br />
<b>Shitlisted:</b> <jsp:getProperty name="helper" property="shitlistedPeers" /><br /><%
<b>Shitlisted:</b> <jsp:getProperty name="helper" property="shitlistedPeers" /><br />
<b>Known:</b> <jsp:getProperty name="helper" property="allPeers" /><br /><%
if (helper.getActivePeers() <= 0) {
%><b><a href="config.jsp">check your NAT/firewall</a></b><br /><%
}

View File

@ -1,4 +1,4 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE web-app
PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.2//EN"
"http://java.sun.com/j2ee/dtds/web-app_2.2.dtd">
@ -14,4 +14,4 @@
<welcome-file>index.html</welcome-file>
<welcome-file>index.jsp</welcome-file>
</welcome-file-list>
</web-app>
</web-app>

View File

@ -72,7 +72,7 @@ public class Connection {
private long _lifetimeDupMessageSent;
private long _lifetimeDupMessageReceived;
public static final long MAX_RESEND_DELAY = 10*1000;
public static final long MAX_RESEND_DELAY = 5*1000;
public static final long MIN_RESEND_DELAY = 3*1000;
/** wait up to 5 minutes after disconnection so we can ack/close packets */
@ -272,10 +272,13 @@ public class Connection {
SimpleTimer.getInstance().addEvent(new ResendPacketEvent(packet), timeout);
}
_context.statManager().getStatLog().addData(Packet.toId(_sendStreamId), "stream.rtt", _options.getRTT(), _options.getWindowSize());
_lastSendTime = _context.clock().now();
_outboundQueue.enqueue(packet);
resetActivityTimer();
/*
if (ackOnly) {
// ACK only, don't schedule this packet for retries
// however, if we are running low on sessionTags we want to send
@ -286,6 +289,7 @@ public class Connection {
_connectionManager.ping(_remotePeer, _options.getRTT()*2, false, packet.getKeyUsed(), packet.getTagsSent(), new PingNotifier());
}
}
*/
}
private class PingNotifier implements ConnectionManager.PingNotifier {

View File

@ -13,6 +13,7 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
private int _receiveWindow;
private int _profile;
private int _rtt;
private int _trend[];
private int _resendDelay;
private int _sendAckDelay;
private int _maxMessageSize;
@ -50,6 +51,8 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
public static final String PROP_CONGESTION_AVOIDANCE_GROWTH_RATE_FACTOR = "i2p.streaming.congestionAvoidanceGrowthRateFactor";
public static final String PROP_SLOW_START_GROWTH_RATE_FACTOR = "i2p.streaming.slowStartGrowthRateFactor";
private static final int TREND_COUNT = 3;
public ConnectionOptions() {
super();
}
@ -85,6 +88,8 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
protected void init(Properties opts) {
super.init(opts);
_trend = new int[TREND_COUNT];
setConnectDelay(getInt(opts, PROP_CONNECT_DELAY, -1));
setProfile(getInt(opts, PROP_PROFILE, PROFILE_BULK));
setMaxMessageSize(getInt(opts, PROP_MAX_MESSAGE_SIZE, 4*1024));
@ -93,7 +98,7 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
setResendDelay(getInt(opts, PROP_INITIAL_RESEND_DELAY, 1000));
setSendAckDelay(getInt(opts, PROP_INITIAL_ACK_DELAY, 500));
setWindowSize(getInt(opts, PROP_INITIAL_WINDOW_SIZE, 1));
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 5));
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 10));
setWriteTimeout(getInt(opts, PROP_WRITE_TIMEOUT, -1));
setInactivityTimeout(getInt(opts, PROP_INACTIVITY_TIMEOUT, 5*60*1000));
setInactivityAction(getInt(opts, PROP_INACTIVITY_ACTION, INACTIVITY_ACTION_DISCONNECT));
@ -125,7 +130,7 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
if (opts.containsKey(PROP_INITIAL_WINDOW_SIZE))
setWindowSize(getInt(opts, PROP_INITIAL_WINDOW_SIZE, 1));
if (opts.containsKey(PROP_MAX_RESENDS))
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 5));
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 10));
if (opts.containsKey(PROP_WRITE_TIMEOUT))
setWriteTimeout(getInt(opts, PROP_WRITE_TIMEOUT, -1));
if (opts.containsKey(PROP_INACTIVITY_TIMEOUT))
@ -186,11 +191,36 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
*/
public int getRTT() { return _rtt; }
public void setRTT(int ms) {
synchronized (_trend) {
_trend[0] = _trend[1];
_trend[1] = _trend[2];
if (ms > _rtt)
_trend[2] = 1;
else if (ms < _rtt)
_trend[2] = -1;
else
_trend[2] = 0;
}
_rtt = ms;
if (_rtt > 60*1000)
_rtt = 60*1000;
}
/**
* If we have 3 consecutive rtt increases, we are trending upwards (1), or if we have
* 3 consecutive rtt decreases, we are trending downwards (-1), else we're stable.
*
*/
public int getRTTTrend() {
synchronized (_trend) {
for (int i = 0; i < TREND_COUNT - 1; i++) {
if (_trend[i] != _trend[i+1])
return 0;
}
return _trend[0];
}
}
/** rtt = rtt*RTT_DAMPENING + (1-RTT_DAMPENING)*currentPacketRTT */
private static final double RTT_DAMPENING = 0.9;
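
As a worked example of the dampening formula in the comment above (numbers invented for illustration): with RTT_DAMPENING = 0.9, a smoothed RTT of 1000 ms and a new per-packet RTT sample of 500 ms update to 0.9*1000 + 0.1*500 = 950 ms, so a single fast packet only nudges the smoothed estimate.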

View File

@ -26,6 +26,7 @@ public class ConnectionPacketHandler {
_context.statManager().createRateStat("stream.con.packetsAckedPerMessageReceived", "Size of a duplicate message received on a connection", "Stream", new long[] { 60*1000, 10*60*1000, 60*60*1000 });
_context.statManager().createRateStat("stream.sendsBeforeAck", "How many times a message was sent before it was ACKed?", "Stream", new long[] { 10*60*1000, 60*60*1000 });
_context.statManager().createRateStat("stream.resetReceived", "How many messages had we sent successfully before receiving a RESET?", "Stream", new long[] { 60*60*1000, 24*60*60*1000 });
_context.statManager().createRateStat("stream.trend", "What direction the RTT is trending in (with period = windowsize)", "Stream", new long[] { 60*1000, 60*60*1000 });
}
/** distribute a packet to the connection specified */
@ -61,7 +62,7 @@ public class ConnectionPacketHandler {
con.getOutputStream().setBufferSize(packet.getOptionalMaxSize());
}
}
con.packetReceived();
boolean choke = false;
@ -91,7 +92,20 @@ public class ConnectionPacketHandler {
_context.statManager().addRateData("stream.con.receiveMessageSize", packet.getPayloadSize(), 0);
boolean isNew = con.getInputStream().messageReceived(packet.getSequenceNum(), packet.getPayload());
boolean isNew = false;
boolean allowAck = true;
if ( (!packet.isFlagSet(Packet.FLAG_SYNCHRONIZE)) &&
( (packet.getSendStreamId() == null) ||
(packet.getReceiveStreamId() == null) ||
(DataHelper.eq(packet.getSendStreamId(), Packet.STREAM_ID_UNKNOWN)) ||
(DataHelper.eq(packet.getReceiveStreamId(), Packet.STREAM_ID_UNKNOWN)) ) )
allowAck = false;
if (allowAck)
isNew = con.getInputStream().messageReceived(packet.getSequenceNum(), packet.getPayload());
else
isNew = con.getInputStream().messageReceived(con.getInputStream().getHighestReadyBockId(), null);
if ( (packet.getSequenceNum() == 0) && (packet.getPayloadSize() > 0) ) {
if (_log.shouldLog(Log.DEBUG))
@ -177,7 +191,21 @@ public class ConnectionPacketHandler {
// con.getOptions().setRTT(con.getOptions().getRTT() + nacks.length*1000);
int numResends = 0;
List acked = con.ackPackets(ackThrough, nacks);
List acked = null;
// if we don't know the streamIds for both sides of the connection, there's no way we
// could actually be acking data (this fixes the buggered up ack of packet 0 problem).
// this is called after packet verification, which places the stream IDs as necessary if
// the SYN verifies (so if we're acking w/out stream IDs, no SYN has been received yet)
if ( (packet != null) && (packet.getSendStreamId() != null) && (packet.getReceiveStreamId() != null) &&
(con != null) && (con.getSendStreamId() != null) && (con.getReceiveStreamId() != null) &&
(!DataHelper.eq(packet.getSendStreamId(), Packet.STREAM_ID_UNKNOWN)) &&
(!DataHelper.eq(packet.getReceiveStreamId(), Packet.STREAM_ID_UNKNOWN)) &&
(!DataHelper.eq(con.getSendStreamId(), Packet.STREAM_ID_UNKNOWN)) &&
(!DataHelper.eq(con.getReceiveStreamId(), Packet.STREAM_ID_UNKNOWN)) )
acked = con.ackPackets(ackThrough, nacks);
else
return false;
if ( (acked != null) && (acked.size() > 0) ) {
if (_log.shouldLog(Log.DEBUG))
_log.debug(acked.size() + " of our packets acked with " + packet);
@ -247,8 +275,13 @@ public class ConnectionPacketHandler {
int oldWindow = con.getOptions().getWindowSize();
int newWindowSize = oldWindow;
int trend = con.getOptions().getRTTTrend();
_context.statManager().addRateData("stream.trend", trend, newWindowSize);
if ( (!congested) && (acked > 0) && (numResends <= 0) ) {
if (newWindowSize > con.getLastCongestionSeenAt() / 2) {
if ( (newWindowSize > con.getLastCongestionSeenAt() / 2) ||
(trend > 0) ) { // tcp vegas: avoidance if rtt is increasing, even if we arent at ssthresh/2 yet
// congestion avoidance
// we can't use newWindowSize += 1/newWindowSize, since we're
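
Spelling out the added condition with invented numbers: if the last congestion was seen at a window of 40 packets and the current window is only 12, growth would normally still be exponential (slow start); but if the last three RTT samples each increased, getRTTTrend() returns 1, and the new "(trend > 0)" clause switches the connection to the slower additive congestion-avoidance growth early, in the spirit of TCP Vegas.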

View File

@ -202,7 +202,7 @@ public class MessageInputStream extends InputStream {
public boolean messageReceived(long messageId, ByteArray payload) {
synchronized (_dataLock) {
if (_log.shouldLog(Log.DEBUG))
_log.debug("received " + messageId + " with " + payload.getValid());
_log.debug("received " + messageId + " with " + (payload != null ? payload.getValid()+"" : "no payload"));
if (messageId <= _highestReadyBlockId) {
if (_log.shouldLog(Log.DEBUG))
_log.debug("ignoring dup message " + messageId);

View File

@ -578,7 +578,7 @@ public class Packet {
return buf;
}
private static final String toId(byte id[]) {
static final String toId(byte id[]) {
if (id == null)
return Base64.encode(STREAM_ID_UNKNOWN);
else

31
apps/syndie/doc/intro.sml Normal file
View File

@ -0,0 +1,31 @@
Syndie is a new effort to build a user friendly secure blogging tool, exploiting the capabilities offered by anonymity and security systems such as [link schema="web" location="http://www.i2p.net/"]I2P[/link], [link schema="web" location="http://tor.eff.org/"]TOR[/link], [link schema="web" location="http://www.freenetproject.org/"]Freenet[/link], [link schema="web" location="http://www.mnetproject.org/"]MNet[/link], and others. Abstracting away the content distribution side, Syndie allows people to [b]build content and communities[/b] that span technologies rather than tying oneself down to the ups and downs of any particular network.
[cut][/cut]Syndie is working to take the technologies of the security, anonymity, and cryptography worlds and merge them with the simplicity and user focus of the blogging world. From the user's standpoint, you could perhaps view Syndie as a distributed [link schema="web" location="http://www.livejournal.com"]LiveJournal[/link], while technically Syndie is much, much simpler.
[b]How Syndie works[/b][hr][/hr]The [i]magic[/i] behind Syndie's abstraction is to ignore any content distribution issues and merely assume data moves around as necessary. Each Syndie instance runs against the filesystem, verifying and indexing blogs and offering up what it knows to the user through a web interface. The core idea in Syndie, therefore, is the [b]archive[/b] - a collection of blogs categorized and ready for consumption.
Whenever someone reads or posts to a Syndie instance, it is working with the [b]local archive[/b]. However, as Syndie's development progresses, people will be able to read [b]remote archives[/b] - pulling the archive summary from an I2P [i]eepsite[/i], TOR [i]hosted service[/i], Freenet [i]Freesite[/i], MNet [i]key[/i], or (with a little less glamor) usenet, filesharing apps, or the web. The first thing Syndie needs to use a remote archive is the archive's index - a plain text file summarizing what the archive contains ([attachment id="0"]an example[/attachment]). From that, Syndie will let the user browse through the blogs, pulling the individual blog posts into the local archive when necessary.
[b]Posting[/b][hr][/hr]Creating and posting to blogs with Syndie is trivial - simply log in to Syndie, click on the [i]Post[/i] button, and fill out the form offered. Syndie handles all of the encryption and formatting details - packaging up the post with any attached files into a single signed, compressed, and potentially encrypted bundle, storing it in the local archive so it can be shared with other Syndie users. Every blog is identified by its public key behind the scenes, so there is no need for a central authority to require that your blogs are all named uniquely or any other such thing.
While each blog is run by a single author, they can in turn allow other authors to post to the blog while still letting readers know that the post is authorized (though created by a different author). Of course, if multiple people wanted to run a single blog and make it look like only one person wrote it, they could share the blog's private keys.
[b]Tags[/b][hr][/hr]Following the lessons from the last few years, every Syndie entry has any number of tags associated with it by the author, allowing trivial categorization and filtering.
[b]Hosting[/b][hr][/hr]While in many scenarios it is best for people to run Syndie locally on their machine, Syndie is a fully multiuser system so anyone can be a Syndie hosting provider by simply exposing the web interface to the public. The Syndie host's operator can password-protect the blog registration interface so only authorized people can create a blog, and the operator can technically go through and delete blog posts or even entire blogs from their local archive. A public Syndie host can be a general-purpose blog repository, letting anyone sign up (following the blogger and geocities path), be a more community-oriented blog repository, requiring people to introduce you to the host to sign up (following the livejournal/orkut path), be a more focused blog repository, requiring posts to stay within certain guidelines (following the indymedia path), or even to fit specialized needs by picking and choosing among the best blogs and posts out there, bringing the operator's editorial flair to a comprehensive collection.
[b]Syndication[/b][hr][/hr]By itself, Syndie is a nice blogging community system, but its real strength as a tool for individual and community empowerment comes when blogs are shared. While Syndie does not aim to be a content distribution network, it does want to exploit them to allow those who require their message to get out to do so. By design, syndicating Syndie can be done with some of the most basic tools - simply pass around the self authenticating files written to the archive and you're done. The archive itself is organized so that you can expose it as an indexed directory in some webserver and let people wget against it, picking to pull individual posts, all posts within a blog, all posts since a given date, or all posts in all blogs. With a very small shell script, you could parse the plain text archive summary to pull posts by size and tag as well. People could offer up their archives as rsync repositories or package up tarballs/zipfiles of blogs or entries - simply grabbing them and extracting them into your local Syndie archive would instantly give you access to all of the content therein.
Of course, manual syndication as described above has... limits. When appropriate, Syndie will tie in to content syndication systems such as [link schema="eep" location="http://feedspace.i2p/"]Feedspace[/link] (or even good ol' Usenet) to automatically import (and export) posts. Integration with content distribution networks like Freenet and MNet will allow the user to periodically grab a published archive index and pull down blogs as necessary. Posting archives and blogs to those networks will be done trivially as well, though they do still depend upon a polling paradigm.
[b]SML[/b][hr][/hr]Syndie is meant to work securely with any browser regardless of the browser's security. Blog entries are written in [b]SML[/b] [i](Syndie or Secure Markup Language)[/i] with a bbcode-like syntax, extended to exploit some of Syndie's capabilities and context. In addition to the SML content in a blog entry, there can be any number of attachments, references to other blogs/posts/tags, nym<->public key mappings (useful for I2P host distribution), references to archives of blogs (on eepsites, freesites, etc), links to various resources, and more.
[b]Future[/b][hr][/hr]Down the road, there are lots of things to improve with Syndie. The interface, of course, is critical, as are tools for SML authoring and improvements to SML itself to offer a more engaging user experience. Integration with a search engine like Lucene would allow full text search through entire archives, and Atom/RSS interfaces would allow trivial import and export to existing clients. Even further, blogs could be transparently encrypted, allowing only authorized users (those with the key) to read entries posted to them (or even know what attachments are included). Integration with existing blogging services (such as [link schema="web" location="http://www.anonyblog.com"]anonyblog[/link], [link schema="web" location="http://blo.gs"]blo.gs[/link], and [link schema="web" location="http://livejournal.com"]livejournal[/link]) may also be explored. Of course, bundling with I2P and other anonymity, security, and community systems will be pursued.
[b]Who/where/when/why[/b][hr][/hr]The base Syndie system was written in a few days by [blog name="jrandom" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" archive0="eep://dev.i2p/~jrandom" archive1="http://dev.i2p.net/~jrandom" archive2="mailto://jrandom@i2p.net"][/blog], though comes out of discussions with [link schema="eep" location="http://frosk.i2p"]Frosk[/link] and many others in the I2P community. Yes, this is an incarnation of [b]MyI2P[/b] (or for those who remember jrand0m's flog, [b]Flogger[/b]).
All of the Syndie code is of course open source and released into the public domain (the [i]real[/i] "free as in freedom"), though it does use some BSD licensed cryptographic routines and an Apache licensed file upload component. Contributions of code are very much welcome - the source is located within the [link schema="web" location="http://www.i2p.net/cvs"]I2P codebase[/link]. Of course, those who cannot or choose not to contribute code are encouraged to [b]use[/b] Syndie - create a blog, create some content, read some content! For those who really want to though, financial contributions to the Syndie development effort can be channeled through the [link schema="web" location="http://www.i2p.net/donate"]I2P fund[/link] (donations for Syndie are distributed to Syndie developers from time to time).
The "why" of Syndie is a much bigger question, though is hopefully self-evident. We need kickass anonymity-aware client applications so that we can get better anonymity (since without kickass clients, we don't have many users). We also need kickass tools for safe blogging, since there are limits to the strength offered by low latency anonymity systems like I2P and TOR - Syndie goes beyond them to offer an interface to mid and high latency anonymous systems while exploiting their capabilities for fast and efficient syndication.
Oh, and jrandom also lost his blog's private key, so needed something to blog with again.

View File

@ -0,0 +1,27 @@
To install this base instance:
mkdir lib
cp ../lib/i2p.jar lib/
cp ../lib/commons-el.jar lib/
cp ../lib/commons-logging.jar lib/
cp ../lib/jasper-compiler.jar lib/
cp ../lib/jasper-runtime.jar lib/
cp ../lib/javax.servlet.jar lib/
cp ../lib/jbigi.jar lib/
cp ../lib/org.mortbay.jetty.jar lib/
cp ../lib/xercesImpl.jar lib/
To run it:
sh run.sh
firefox http://localhost:7653/syndie/
You can share your archive at http://localhost:7653/ so
that people can syndicate off you via
cd archive ; wget -m -nH http://yourmachine:7653/
You may want to add a password on the registration form
so that you have control over who can create blogs via /syndie/.
To do so, set the password in the run.sh script.
Windows users:
write your own instructions. We're alpha, here ;)

41
apps/syndie/doc/sml.sml Normal file
View File

@ -0,0 +1,41 @@
[cut]A brief glance at SML[/cut]
[b]General rules[/b]
Newlines are newlines are newlines. If you include a newline in your SML, you'll get a newline in the rendered HTML.
All < and > characters are replaced by their HTML entity counterparts.
All SML tags are enclosed with [[ and ]] (e.g. [[b]]bold stuff[[/b]]). ([[ and ]] characters are quoted by [[[[ and ]]]], respectively)
Nesting SML tags is [b]not[/b] currently supported (though will be at a later date).
All SML tags must have a beginning and end tag (even for ones without any 'body', such as [[hr]][[/hr]]). This restriction may be removed later.
Simple formatting tags behave as expected: [[b]], [[i]], [[u]], [[h1]] through [[h5]], [[hr]], [[pre]].
[hr][/hr]
[b]Tag details[/b]
* To cut an entry so that the summary is before while the details are afterwards:
[[cut]]more inside...[[/cut]]
* To load an attachment as an image with "syndie's logo" as the alternate text:
[[img attachment="0"]]syndie's logo[[/img]]
* To add a download link to an attachment:
[[attachment id="0"]]anchor text[[/img]]
* To quote someone:
[[quote author="who you are quoting" location="blog://ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=/1234567890"]]stuff they said[[/quote]]
* To sample some code:
[[code location="eep://dev.i2p/cgi-bin/cvsweb.cgi/i2p/index.html"]]<html>[[/code]]
* To link to a [blog name="jrandom" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" blogentry="1124402137773" archive0="eep://dev.i2p/~jrandom/archive" archive1="irc2p://jrandom@irc.postman.i2p/#i2p"]bitchin' blog[/blog]:
[[blog name="the blogs name" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" blogtag="tag" blogentry="123456789" archive0="eep://dev.i2p/~jrandom/archive/" archive1="freenet://SSK@blah/archive//"]]description of the blog[[/blog]]. blogentry and blogtag are optional and there can be any number of archiveN locations specified.
* To link to an [link schema="eep" location="http://dev.i2p/"]external resource[/link]:
[[link schema="eep" location="http://dev.i2p/"]]link to it[[/link]].
[i]The schema should be a network selection tool, such as "eep" for an eepsite, "tor" for a tor hidden service, "web" for a normal website, "freenet" for a freenet key, etc. The local user's Syndie configuration should include information necessary for the user to access the content referenced through the given schemas.[/i]
* To pass an [address name="dev.i2p" schema="eep" location="NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA"]addressbook entry[/address]:
[[address name="dev.i2p" schema="eep" location="NF2...AAAA"]]add it[[/address]].

101
apps/syndie/java/build.xml Normal file
View File

@ -0,0 +1,101 @@
<?xml version="1.0" encoding="UTF-8"?>
<project basedir="." default="all" name="syndie">
<target name="all" depends="clean, build" />
<target name="build" depends="builddep, jar" />
<target name="builddep">
<ant dir="../../jetty/" target="build" />
<ant dir="../../../core/java/" target="build" />
<!-- ministreaming will build core -->
</target>
<target name="compile">
<mkdir dir="./build" />
<mkdir dir="./build/obj" />
<javac
srcdir="./src"
debug="true" deprecation="on" source="1.3" target="1.3"
destdir="./build/obj"
classpath="../../../core/java/build/i2p.jar:../../jetty/jettylib/org.mortbay.jetty.jar:../../jetty/jettylib/javax.servlet.jar" />
</target>
<target name="jar" depends="builddep, compile">
<jar destfile="./build/syndie.jar" basedir="./build/obj" includes="**/*.class">
<manifest>
<attribute name="Main-Class" value="net.i2p.syndie.CLI" />
<attribute name="Class-Path" value="i2p.jar" />
</manifest>
</jar>
<ant target="war" />
</target>
<target name="war" depends="builddep, compile, precompilejsp">
<war destfile="../syndie.war" webxml="../jsp/web-out.xml">
<fileset dir="../jsp/" includes="**/*" excludes=".nbintdb, web.xml, web-out.xml, web-fragment.xml, **/*.java, **/*.jsp" />
<classes dir="./build/obj" />
</war>
</target>
<target name="precompilejsp">
<delete dir="../jsp/WEB-INF/" />
<delete file="../jsp/web-fragment.xml" />
<delete file="../jsp/web-out.xml" />
<mkdir dir="../jsp/WEB-INF/" />
<mkdir dir="../jsp/WEB-INF/classes" />
<!-- there are various jspc ant tasks, but they all seem a bit flakey -->
<java classname="org.apache.jasper.JspC" fork="true" >
<classpath>
<pathelement location="../../jetty/jettylib/jasper-compiler.jar" />
<pathelement location="../../jetty/jettylib/jasper-runtime.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/commons-logging.jar" />
<pathelement location="../../jetty/jettylib/commons-el.jar" />
<pathelement location="../../jetty/jettylib/org.mortbay.jetty.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/ant.jar" />
<pathelement location="build/obj" />
<pathelement location="../../../core/java/build/i2p.jar" />
</classpath>
<arg value="-d" />
<arg value="../jsp/WEB-INF/classes" />
<arg value="-p" />
<arg value="net.i2p.syndie.jsp" />
<arg value="-webinc" />
<arg value="../jsp/web-fragment.xml" />
<arg value="-webapp" />
<arg value="../jsp/" />
</java>
<javac debug="true" deprecation="on" source="1.3" target="1.3"
destdir="../jsp/WEB-INF/classes/" srcdir="../jsp/WEB-INF/classes" includes="**/*.java" >
<classpath>
<pathelement location="../../jetty/jettylib/jasper-runtime.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/commons-logging.jar" />
<pathelement location="../../jetty/jettylib/commons-el.jar" />
<pathelement location="../../jetty/jettylib/org.mortbay.jetty.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="build/obj" />
<pathelement location="../../../core/java/build/i2p.jar" />
</classpath>
</javac>
<copy file="../jsp/web.xml" tofile="../jsp/web-out.xml" />
<loadfile property="jspc.web.fragment" srcfile="../jsp/web-fragment.xml" />
<replace file="../jsp/web-out.xml">
<replacefilter token="&lt;!-- precompiled servlets --&gt;" value="${jspc.web.fragment}" />
</replace>
</target>
<target name="javadoc">
<mkdir dir="./build" />
<mkdir dir="./build/javadoc" />
<javadoc
sourcepath="./src:../../../core/java/src" destdir="./build/javadoc"
packagenames="*"
use="true"
splitindex="true"
windowtitle="syndie" />
</target>
<target name="clean">
<delete dir="./build" />
</target>
<target name="cleandep" depends="clean">
<ant dir="../../../core/java/" target="distclean" />
</target>
<target name="distclean" depends="clean">
<ant dir="../../../core/java/" target="distclean" />
</target>
</project>

View File

@ -0,0 +1,418 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import java.text.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Store blog info in the local filesystem.
*
* Entries are stored under:
* $rootDir/$h(blogKey)/$entryId.snd (the index lists them as YYYYMMDD_n_jKB)
* Blog info is stored under:
* $rootDir/$h(blogKey)/meta.snm
* Archive summary is stored under
* $rootDir/archive.txt
* Any key=value pairs in
* $rootDir/archiveHeaders.txt
* are injected into the archive.txt on regeneration.
*
* When entries are loaded for extraction/verification/etc, their contents are written to
* $cacheDir/$h(blogKey)/$entryId/ (e.g. $cacheDir/$h(blogKey)/$entryId/entry.sml)
*/
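// Illustrative on-disk layout implied by the javadoc above, using the example blog-key
// hash and entry id that appear in the sml.sml doc earlier in this changeset:
//   $rootDir/archive.txt                                        (archive summary)
//   $rootDir/archiveHeaders.txt                                 (extra key=value headers)
//   $rootDir/ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=/meta.snm
//   $rootDir/ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=/1124402137773.snd
//   $cacheDir/ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=/1124402137773/entry.sml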
public class Archive {
private I2PAppContext _context;
private File _rootDir;
private File _cacheDir;
private Map _blogInfo;
private ArchiveIndex _index;
private EntryExtractor _extractor;
private String _defaultSelector;
public static final String METADATA_FILE = "meta.snm";
public static final String INDEX_FILE = "archive.txt";
public static final String HEADER_FILE = "archiveHeaders.txt";
private static final FilenameFilter _entryFilenameFilter = new FilenameFilter() {
public boolean accept(File dir, String name) { return name.endsWith(".snd"); }
};
public Archive(I2PAppContext ctx, String rootDir, String cacheDir) {
_context = ctx;
_rootDir = new File(rootDir);
if (!_rootDir.exists())
_rootDir.mkdirs();
_cacheDir = new File(cacheDir);
if (!_cacheDir.exists())
_cacheDir.mkdirs();
_blogInfo = new HashMap();
_index = null;
_extractor = new EntryExtractor(ctx);
_defaultSelector = ctx.getProperty("syndie.defaultSelector");
if (_defaultSelector == null) _defaultSelector = "";
reloadInfo();
}
public void reloadInfo() {
File f[] = _rootDir.listFiles();
List info = new ArrayList();
for (int i = 0; i < f.length; i++) {
if (f[i].isDirectory()) {
File meta = new File(f[i], METADATA_FILE);
if (meta.exists()) {
BlogInfo bi = new BlogInfo();
try {
bi.load(new FileInputStream(meta));
if (bi.verify(_context)) {
info.add(bi);
} else {
System.err.println("Invalid blog (but we're storing it anyway): " + bi);
new Exception("foo").printStackTrace();
info.add(bi);
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
}
synchronized (_blogInfo) {
_blogInfo.clear();
for (int i = 0; i < info.size(); i++) {
BlogInfo bi = (BlogInfo)info.get(i);
_blogInfo.put(bi.getKey().calculateHash(), bi);
}
}
}
public String getDefaultSelector() { return _defaultSelector; }
public BlogInfo getBlogInfo(BlogURI uri) {
if (uri == null) return null;
synchronized (_blogInfo) {
return (BlogInfo)_blogInfo.get(uri.getKeyHash());
}
}
public BlogInfo getBlogInfo(Hash key) {
synchronized (_blogInfo) {
return (BlogInfo)_blogInfo.get(key);
}
}
public boolean storeBlogInfo(BlogInfo info) {
if (!info.verify(_context)) {
System.err.println("Not storing the invalid blog " + info);
new Exception("foo!").printStackTrace();
return false;
}
boolean isNew = true;
synchronized (_blogInfo) {
BlogInfo old = (BlogInfo)_blogInfo.get(info.getKey().calculateHash());
if ( (old == null) || (old.getEdition() < info.getEdition()) )
_blogInfo.put(info.getKey().calculateHash(), info);
else
isNew = false;
}
if (!isNew) return true; // valid entry, but not stored, since its old
try {
File blogDir = new File(_rootDir, info.getKey().calculateHash().toBase64());
blogDir.mkdirs();
File blogFile = new File(blogDir, "meta.snm");
FileOutputStream out = new FileOutputStream(blogFile);
info.write(out);
out.close();
System.out.println("Blog info written to " + blogFile.getPath());
return true;
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
public List listBlogs() {
synchronized (_blogInfo) {
return new ArrayList(_blogInfo.values());
}
}
private File getEntryDir(File entryFile) {
String name = entryFile.getName();
if (!name.endsWith(".snd")) throw new RuntimeException("hmm, why are we trying to get an entry dir for " + entryFile.getAbsolutePath());
String blog = entryFile.getParentFile().getName();
File blogDir = new File(_cacheDir, blog);
return new File(blogDir, name.substring(0, name.length()-4));
//return new File(entryFile.getParentFile(), "." + name.substring(0, name.length()-4));
}
/**
* Expensive operation, reading all entries within the blog and parsing out the tags.
* Whenever possible, query the index instead of the archive
*
*/
public List listTags(Hash blogKeyHash) {
List rv = new ArrayList();
BlogInfo info = getBlogInfo(blogKeyHash);
if (info == null)
return rv;
File blogDir = new File(_rootDir, Base64.encode(blogKeyHash.getData()));
File entries[] = blogDir.listFiles(_entryFilenameFilter);
for (int j = 0; j < entries.length; j++) {
try {
File entryDir = getEntryDir(entries[j]);
EntryContainer entry = null;
if (entryDir.exists())
entry = getCachedEntry(entryDir);
if ( (entry == null) || (!entryDir.exists()) ) {
if (!extractEntry(entries[j], entryDir, info)) {
System.err.println("Entry " + entries[j].getPath() + " is not valid");
new Exception("foo!!").printStackTrace();
continue;
}
entry = getCachedEntry(entryDir);
}
String tags[] = entry.getTags();
for (int t = 0; t < tags.length; t++) {
if (!rv.contains(tags[t])) {
System.out.println("Found a new tag in cached " + entry.getURI() + ": " + tags[t]);
rv.add(tags[t]);
}
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
} // end iterating over the entries
return rv;
}
/**
* Extract the entry to the given dir, returning true if it was verified properly
*
*/
private boolean extractEntry(File entryFile, File entryDir, BlogInfo info) throws IOException {
if (!entryDir.exists())
entryDir.mkdirs();
boolean ok = _extractor.extract(entryFile, entryDir, null, info);
if (!ok) {
File files[] = entryDir.listFiles();
for (int i = 0; i < files.length; i++)
files[i].delete();
entryDir.delete();
}
return ok;
}
private EntryContainer getCachedEntry(File entryDir) {
try {
return new CachedEntry(entryDir);
} catch (IOException ioe) {
ioe.printStackTrace();
File files[] = entryDir.listFiles();
for (int i = 0; i < files.length; i++)
files[i].delete();
entryDir.delete();
return null;
}
}
public EntryContainer getEntry(BlogURI uri) { return getEntry(uri, null); }
public EntryContainer getEntry(BlogURI uri, SessionKey blogKey) {
List entries = listEntries(uri, null, blogKey);
if (entries.size() > 0)
return (EntryContainer)entries.get(0);
else
return null;
}
public List listEntries(BlogURI uri, String tag, SessionKey blogKey) {
return listEntries(uri.getKeyHash(), uri.getEntryId(), tag, blogKey);
}
public List listEntries(Hash blog, long entryId, String tag, SessionKey blogKey) {
List rv = new ArrayList();
BlogInfo info = getBlogInfo(blog);
if (info == null)
return rv;
File blogDir = new File(_rootDir, blog.toBase64());
File entries[] = blogDir.listFiles(_entryFilenameFilter);
if (entries == null)
return rv;
for (int i = 0; i < entries.length; i++) {
try {
EntryContainer entry = null;
if (blogKey == null) {
// no key, cache.
File entryDir = getEntryDir(entries[i]);
if (entryDir.exists())
entry = getCachedEntry(entryDir);
if ((entry == null) || !entryDir.exists()) {
if (!extractEntry(entries[i], entryDir, info)) {
System.err.println("Entry " + entries[i].getPath() + " is not valid");
new Exception("foo!!!!").printStackTrace();
continue;
}
entry = getCachedEntry(entryDir);
}
} else {
// we have an explicit key - no caching
entry = new EntryContainer();
entry.load(new FileInputStream(entries[i]));
boolean ok = entry.verifySignature(_context, info);
if (!ok) {
System.err.println("Keyed entry " + entries[i].getPath() + " is not valid");
new Exception("foo!!!!!!").printStackTrace();
continue;
}
entry.parseRawData(_context, blogKey);
entry.setCompleteSize((int)entries[i].length());
}
if (entryId >= 0) {
if (entry.getURI().getEntryId() == entryId) {
rv.add(entry);
return rv;
}
} else if (tag != null) {
String tags[] = entry.getTags();
for (int j = 0; j < tags.length; j++) {
if (tags[j].equals(tag)) {
rv.add(entry);
System.out.println("cached entry matched requested tag [" + tag + "]: " + entry.getURI());
break;
}
}
} else {
System.out.println("cached entry is ok and no id or tag was requested: " + entry.getURI());
rv.add(entry);
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
return rv;
}
public boolean storeEntry(EntryContainer container) {
if (container == null) return false;
BlogURI uri = container.getURI();
if (uri == null) return false;
File blogDir = new File(_rootDir, uri.getKeyHash().toBase64());
blogDir.mkdirs();
File entryFile = new File(blogDir, getEntryFilename(uri.getEntryId()));
if (entryFile.exists()) return true;
BlogInfo info = getBlogInfo(uri);
if (info == null) {
System.out.println("no blog metadata for the uri " + uri);
return false;
}
if (!container.verifySignature(_context, info)) {
System.out.println("Not storing the invalid blog entry at " + uri);
return false;
} else {
//System.out.println("Signature is valid: " + container.getSignature() + " for info " + info);
}
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
container.write(baos, true);
byte data[] = baos.toByteArray();
FileOutputStream out = new FileOutputStream(entryFile);
out.write(data);
out.close();
container.setCompleteSize(data.length);
return true;
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
public static String getEntryFilename(long entryId) { return entryId + ".snd"; }
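// Note (illustrative, not from the original source): index names encode the entry's day,
// its millisecond offset within that day, and its size in KB.  For example, an entry whose
// entryId falls 42ms after the start of 2005/08/31 and that carries ~7KB of data would be
// named "20050831_42_7KB"; getEntryIdFromIndexName() recovers the entryId from the first
// two fields, and getSizeFromIndexName() recovers the KB count from the last.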
private static SimpleDateFormat _dateFmt = new SimpleDateFormat("yyyyMMdd", Locale.UK);
public static String getIndexName(long entryId, int numBytes) {
try {
synchronized (_dateFmt) {
String yy = _dateFmt.format(new Date(entryId));
long begin = _dateFmt.parse(yy).getTime();
long n = entryId - begin;
int kb = numBytes / 1024;
return yy + '_' + n + '_' + kb + "KB";
}
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return "UNKNOWN";
} catch (ParseException pe) {
pe.printStackTrace();
return "UNKNOWN";
}
}
public static long getEntryIdFromIndexName(String entryIndexName) {
if (entryIndexName == null) return -1;
if (entryIndexName.endsWith(".snd"))
entryIndexName = entryIndexName.substring(0, entryIndexName.length() - 4);
int endYY = entryIndexName.indexOf('_');
if (endYY <= 0) return -1;
int endN = entryIndexName.indexOf('_', endYY+1);
if (endN <= 0) return -1;
String yy = entryIndexName.substring(0, endYY);
String n = entryIndexName.substring(endYY+1, endN);
try {
synchronized (_dateFmt) {
long dayBegin = _dateFmt.parse(yy).getTime();
long dayEntry = Long.parseLong(n);
return dayBegin + dayEntry;
}
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
} catch (ParseException pe) {
pe.printStackTrace();
}
return -1;
}
public static int getSizeFromIndexName(String entryIndexName) {
if (entryIndexName == null) return -1;
if (entryIndexName.endsWith(".snd"))
entryIndexName = entryIndexName.substring(0, entryIndexName.length() - 4);
int beginSize = entryIndexName.lastIndexOf('_');
if ( (beginSize <= 0) || (beginSize >= entryIndexName.length()-3) )
return -1;
try {
String sz = entryIndexName.substring(beginSize+1, entryIndexName.length()-2);
return Integer.parseInt(sz);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
}
return -1;
}
public ArchiveIndex getIndex() {
if (_index == null)
regenerateIndex();
return _index;
}
public File getArchiveDir() { return _rootDir; }
public File getIndexFile() { return new File(_rootDir, INDEX_FILE); }
public void regenerateIndex() {
reloadInfo();
_index = ArchiveIndexer.index(_context, this);
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(_rootDir, INDEX_FILE));
out.write(DataHelper.getUTF8(_index.toString()));
out.flush();
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
}
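The class above is the on-disk archive API the rest of Syndie builds on. As a rough usage sketch (not part of this commit; the directory names match the defaults BlogManager sets up below, and the pre-generics casts follow the style of this changeset), walking every blog and printing its entries might look like:

Archive archive = new Archive(I2PAppContext.getGlobalContext(), "./syndie/archive", "./syndie/cache");
List blogs = archive.listBlogs();
for (int i = 0; i < blogs.size(); i++) {
    BlogInfo info = (BlogInfo)blogs.get(i);
    Hash blog = info.getKey().calculateHash();
    // all entries for this blog: no entryId filter, no tag filter, no decryption key
    List entries = archive.listEntries(blog, -1, null, null);
    for (int j = 0; j < entries.size(); j++) {
        EntryContainer entry = (EntryContainer)entries.get(j);
        System.out.println(entry.getURI() + " (" + entry.getTags().length + " tags)");
    }
}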


@@ -0,0 +1,190 @@
package net.i2p.syndie;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
* Dig through the archive to build an index
*/
class ArchiveIndexer {
private static final int RECENT_BLOG_COUNT = 10;
private static final int RECENT_ENTRY_COUNT = 10;
public static ArchiveIndex index(I2PAppContext ctx, Archive source) {
LocalArchiveIndex rv = new LocalArchiveIndex();
rv.setGeneratedOn(ctx.clock().now());
File rootDir = source.getArchiveDir();
File headerFile = new File(rootDir, Archive.HEADER_FILE);
if (headerFile.exists()) {
try {
BufferedReader in = new BufferedReader(new InputStreamReader(new FileInputStream(headerFile), "UTF-8"));
String line = null;
while ( (line = in.readLine()) != null) {
StringTokenizer tok = new StringTokenizer(line, ":");
if (tok.countTokens() == 2)
rv.setHeader(tok.nextToken(), tok.nextToken());
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
// things are new if we just received them in the last day
long newSince = ctx.clock().now() - 24*60*60*1000;
rv.setVersion(Version.INDEX_VERSION);
/** 0-lowestEntryId --> blog Hash */
Map blogsByAge = new TreeMap();
/** 0-entryId --> BlogURI */
Map entriesByAge = new TreeMap();
List blogs = source.listBlogs();
rv.setAllBlogs(blogs.size());
int newEntries = 0;
int allEntries = 0;
long newSize = 0;
long totalSize = 0;
int newBlogs = 0;
SMLParser parser = new SMLParser();
for (int i = 0; i < blogs.size(); i++) {
BlogInfo cur = (BlogInfo)blogs.get(i);
Hash key = cur.getKey().calculateHash();
String keyStr = Base64.encode(key.getData());
File blogDir = new File(rootDir, Base64.encode(key.getData()));
File metaFile = new File(blogDir, Archive.METADATA_FILE);
long metadate = metaFile.lastModified();
List entries = source.listEntries(key, -1, null, null);
System.out.println("Entries under " + key + ": " + entries);
/** tag name --> ordered map of entryId to EntryContainer */
Map tags = new TreeMap();
for (int j = 0; j < entries.size(); j++) {
EntryContainer entry = (EntryContainer)entries.get(j);
entriesByAge.put(new Long(0-entry.getURI().getEntryId()), entry.getURI());
allEntries++;
totalSize += entry.getCompleteSize();
String entryTags[] = entry.getTags();
for (int t = 0; t < entryTags.length; t++) {
if (!tags.containsKey(entryTags[t])) {
tags.put(entryTags[t], new TreeMap());
//System.err.println("New tag [" + entryTags[t] + "]");
}
Map entriesByTag = (Map)tags.get(entryTags[t]);
entriesByTag.put(new Long(0-entry.getURI().getEntryId()), entry);
System.out.println("Entries under tag " + entryTags[t] + ":" + entriesByTag.values());
}
if (entry.getURI().getEntryId() >= newSince) {
newEntries++;
newSize += entry.getCompleteSize();
}
HeaderReceiver rec = new HeaderReceiver();
parser.parse(entry.getEntry().getText(), rec);
String reply = rec.getHeader(HTMLRenderer.HEADER_IN_REPLY_TO);
if (reply != null) {
BlogURI parent = new BlogURI(reply.trim());
if ( (parent.getKeyHash() != null) && (parent.getEntryId() >= 0) )
rv.addReply(parent, entry.getURI());
else
System.err.println("Parent of " + entry.getURI() + " is not valid: [" + reply.trim() + "]");
}
}
long lowestEntryId = -1;
for (Iterator iter = tags.keySet().iterator(); iter.hasNext(); ) {
String tagName = (String)iter.next();
Map tagEntries = (Map)tags.get(tagName);
long highestId = -1;
if (tagEntries.size() <= 0) break;
Long id = (Long)tagEntries.keySet().iterator().next();
highestId = 0 - id.longValue();
rv.addBlog(key, tagName, highestId);
for (Iterator entryIter = tagEntries.values().iterator(); entryIter.hasNext(); ) {
EntryContainer entry = (EntryContainer)entryIter.next();
String indexName = Archive.getIndexName(entry.getURI().getEntryId(), entry.getCompleteSize());
rv.addBlogEntry(key, tagName, indexName);
if (!entryIter.hasNext())
lowestEntryId = entry.getURI().getEntryId();
}
}
if (lowestEntryId > newSince)
newBlogs++;
blogsByAge.put(new Long(0-lowestEntryId), key);
}
rv.setAllEntries(allEntries);
rv.setNewBlogs(newBlogs);
rv.setNewEntries(newEntries);
rv.setTotalSize(totalSize);
rv.setNewSize(newSize);
int i = 0;
for (Iterator iter = blogsByAge.keySet().iterator(); iter.hasNext() && i < RECENT_BLOG_COUNT; i++) {
Long when = (Long)iter.next();
Hash key = (Hash)blogsByAge.get(when);
rv.addNewestBlog(key);
}
i = 0;
for (Iterator iter = entriesByAge.keySet().iterator(); iter.hasNext() && i < RECENT_ENTRY_COUNT; i++) {
Long when = (Long)iter.next();
BlogURI uri = (BlogURI)entriesByAge.get(when);
rv.addNewestEntry(uri);
}
return rv;
}
private static class HeaderReceiver implements SMLParser.EventReceiver {
private Properties _headers;
public HeaderReceiver() { _headers = null; }
public String getHeader(String name) { return (_headers != null ? _headers.getProperty(name) : null); }
public void receiveHeader(String header, String value) {
if (_headers == null) _headers = new Properties();
_headers.setProperty(header, value);
}
public void receiveAddress(String name, String schema, String location, String anchorText) {}
public void receiveArchive(String name, String description, String locationSchema, String location, String postingKey, String anchorText) {}
public void receiveAttachment(int id, String anchorText) {}
public void receiveBegin() {}
public void receiveBlog(String name, String blogKeyHash, String blogPath, long blogEntryId, List blogArchiveLocations, String anchorText) {}
public void receiveBold(String text) {}
public void receiveCode(String text, String codeLocationSchema, String codeLocation) {}
public void receiveCut(String summaryText) {}
public void receiveEnd() {}
public void receiveGT() {}
public void receiveH1(String text) {}
public void receiveH2(String text) {}
public void receiveH3(String text) {}
public void receiveH4(String text) {}
public void receiveH5(String text) {}
public void receiveHR() {}
public void receiveHeaderEnd() {}
public void receiveImage(String alternateText, int attachmentId) {}
public void receiveItalic(String text) {}
public void receiveLT() {}
public void receiveLeftBracket() {}
public void receiveLink(String schema, String location, String text) {}
public void receiveNewline() {}
public void receivePlain(String text) {}
public void receivePre(String text) {}
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation) {}
public void receiveRightBracket() {}
public void receiveUnderline(String text) {}
}
}
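The indexer is normally driven through Archive.regenerateIndex(), but a direct sketch (hypothetical, and only compilable from within the net.i2p.syndie package since the class is package-private) would be:

I2PAppContext ctx = I2PAppContext.getGlobalContext();
Archive archive = new Archive(ctx, "./syndie/archive", "./syndie/cache");
ArchiveIndex index = ArchiveIndexer.index(ctx, archive);
System.out.println("blogs: " + index.getAllBlogs() + " (" + index.getNewBlogs() + " new), "
                   + "entries: " + index.getAllEntries() + " (" + index.getNewEntries() + " new), "
                   + "size: " + index.getTotalSizeStr());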


@@ -0,0 +1,485 @@
package net.i2p.syndie;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
*
*/
public class BlogManager {
private I2PAppContext _context;
private static BlogManager _instance;
private File _blogKeyDir;
private File _privKeyDir;
private File _archiveDir;
private File _userDir;
private File _cacheDir;
private File _tempDir;
private File _rootDir;
private Archive _archive;
static {
TimeZone.setDefault(TimeZone.getTimeZone("GMT"));
String rootDir = I2PAppContext.getGlobalContext().getProperty("syndie.rootDir");
if (false) {
if (rootDir == null)
rootDir = System.getProperty("user.home");
rootDir = rootDir + File.separatorChar + ".syndie";
} else {
if (rootDir == null)
rootDir = "./syndie";
}
_instance = new BlogManager(I2PAppContext.getGlobalContext(), rootDir);
}
public static BlogManager instance() { return _instance; }
public BlogManager(I2PAppContext ctx, String rootDir) {
_context = ctx;
_rootDir = new File(rootDir);
_rootDir.mkdirs();
readConfig();
_blogKeyDir = new File(_rootDir, "blogkeys");
_privKeyDir = new File(_rootDir, "privkeys");
String archiveDir = _context.getProperty("syndie.archiveDir");
if (archiveDir != null)
_archiveDir = new File(archiveDir);
else
_archiveDir = new File(_rootDir, "archive");
_userDir = new File(_rootDir, "users");
_cacheDir = new File(_rootDir, "cache");
_tempDir = new File(_rootDir, "temp");
_blogKeyDir.mkdirs();
_privKeyDir.mkdirs();
_archiveDir.mkdirs();
_cacheDir.mkdirs();
_userDir.mkdirs();
_tempDir.mkdirs();
_archive = new Archive(ctx, _archiveDir.getAbsolutePath(), _cacheDir.getAbsolutePath());
_archive.regenerateIndex();
}
private void readConfig() {
File config = new File(_rootDir, "syndie.config");
if (config.exists()) {
try {
Properties p = new Properties();
DataHelper.loadProps(p, config);
for (Iterator iter = p.keySet().iterator(); iter.hasNext(); ) {
String key = (String)iter.next();
System.setProperty(key, p.getProperty(key));
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
public void writeConfig() {
File config = new File(_rootDir, "syndie.config");
FileOutputStream out = null;
try {
out = new FileOutputStream(config);
for (Iterator iter = _context.getPropertyNames().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
if (name.startsWith("syndie."))
out.write(DataHelper.getUTF8(name + '=' + _context.getProperty(name) + '\n'));
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public BlogInfo createBlog(String name, String description, String contactURL, String archives[]) {
return createBlog(name, null, description, contactURL, archives);
}
public BlogInfo createBlog(String name, SigningPublicKey posters[], String description, String contactURL, String archives[]) {
Object keys[] = _context.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
try {
FileOutputStream out = new FileOutputStream(new File(_privKeyDir, Base64.encode(pub.calculateHash().getData()) + ".priv"));
pub.writeBytes(out);
priv.writeBytes(out);
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return null;
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
}
return createInfo(pub, priv, name, posters, description, contactURL, archives, 0);
}
public BlogInfo createInfo(SigningPublicKey pub, SigningPrivateKey priv, String name, SigningPublicKey posters[],
String description, String contactURL, String archives[], int edition) {
Properties opts = new Properties();
opts.setProperty("Name", name);
opts.setProperty("Description", description);
opts.setProperty("Edition", Integer.toString(edition));
opts.setProperty("ContactURL", contactURL);
for (int i = 0; archives != null && i < archives.length; i++)
opts.setProperty("Archive." + i, archives[i]);
BlogInfo info = new BlogInfo(pub, posters, opts);
info.sign(_context, priv);
_archive.storeBlogInfo(info);
return info;
}
public Archive getArchive() { return _archive; }
public File getTempDir() { return _tempDir; }
public List listMyBlogs() {
File files[] = _privKeyDir.listFiles();
List rv = new ArrayList();
for (int i = 0; i < files.length; i++) {
if (files[i].isFile() && !files[i].isHidden()) {
try {
SigningPublicKey pub = new SigningPublicKey();
pub.readBytes(new FileInputStream(files[i]));
BlogInfo info = _archive.getBlogInfo(pub.calculateHash());
if (info != null)
rv.add(info);
} catch (IOException ioe) {
ioe.printStackTrace();
} catch (DataFormatException dfe) {
dfe.printStackTrace();
}
}
}
return rv;
}
public SigningPrivateKey getMyPrivateKey(BlogInfo blog) {
if (blog == null) return null;
File keyFile = new File(_privKeyDir, Base64.encode(blog.getKey().calculateHash().getData()) + ".priv");
try {
FileInputStream in = new FileInputStream(keyFile);
SigningPublicKey pub = new SigningPublicKey();
pub.readBytes(in);
SigningPrivateKey priv = new SigningPrivateKey();
priv.readBytes(in);
return priv;
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return null;
}
}
public String login(User user, String login, String pass) {
Hash userHash = _context.sha().calculateHash(DataHelper.getUTF8(login));
Hash passHash = _context.sha().calculateHash(DataHelper.getUTF8(pass));
File userFile = new File(_userDir, Base64.encode(userHash.getData()));
System.out.println("Attempting to login to " + login + " w/ pass = " + pass
+ ": file = " + userFile.getAbsolutePath() + " passHash = "
+ Base64.encode(passHash.getData()));
if (userFile.exists()) {
try {
Properties props = new Properties();
FileInputStream fin = new FileInputStream(userFile);
BufferedReader in = new BufferedReader(new InputStreamReader(fin, "UTF-8"));
String line = null;
while ( (line = in.readLine()) != null) {
int split = line.indexOf('=');
if (split <= 0) continue;
String key = line.substring(0, split);
String val = line.substring(split+1);
props.setProperty(key.trim(), val.trim());
}
return user.login(login, pass, props);
} catch (IOException ioe) {
ioe.printStackTrace();
return "Error logging in - corrupt userfile";
}
} else {
return "User does not exist";
}
}
/** hash of the password required to register and create a new blog (null means no password required) */
public String getRegistrationPassword() {
String pass = _context.getProperty("syndie.registrationPassword");
if ( (pass == null) || (pass.trim().length() <= 0) ) return null;
return pass;
}
public void saveUser(User user) {
if (!user.getAuthenticated()) return;
String userHash = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(user.getUsername())).getData());
File userFile = new File(_userDir, userHash);
FileOutputStream out = null;
try {
out = new FileOutputStream(userFile);
out.write(DataHelper.getUTF8(user.export()));
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe){}
}
}
public String register(User user, String login, String password, String registrationPassword, String blogName, String blogDescription, String contactURL) {
System.err.println("Register [" + login + "] pass [" + password + "] name [" + blogName + "] descr [" + blogDescription + "] contact [" + contactURL + "]");
System.err.println("reference bad string: [" + EncodingTestGenerator.TEST_STRING + "]");
String hashedRegistrationPassword = getRegistrationPassword();
if (hashedRegistrationPassword != null) {
try {
if (!hashedRegistrationPassword.equals(Base64.encode(_context.sha().calculateHash(registrationPassword.getBytes("UTF-8")).getData())))
return "Invalid registration password";
} catch (UnsupportedEncodingException uee) {
return "Error registering";
}
}
String userHash = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(login)).getData());
File userFile = new File(_userDir, userHash);
if (userFile.exists()) {
return "Cannot register the login " + login + ": it already exists";
} else {
BlogInfo info = createBlog(blogName, blogDescription, contactURL, null);
String hashedPassword = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(password)).getData());
FileOutputStream out = null;
try {
out = new FileOutputStream(userFile);
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(out, "UTF-8"));
bw.write("password=" + hashedPassword + "\n");
bw.write("blog=" + Base64.encode(info.getKey().calculateHash().getData()) + "\n");
bw.write("lastid=-1\n");
bw.write("lastmetaedition=0\n");
bw.write("addressbook=userhosts-"+userHash + ".txt\n");
bw.write("showimages=false\n");
bw.write("showexpanded=false\n");
bw.flush();
} catch (IOException ioe) {
ioe.printStackTrace();
return "Internal error registering - " + ioe.getMessage();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
String loginResult = login(user, login, password);
_archive.regenerateIndex();
return loginResult;
}
}
public BlogURI createBlogEntry(User user, String subject, String tags, String entryHeaders, String sml) {
return createBlogEntry(user, subject, tags, entryHeaders, sml, null, null, null);
}
public BlogURI createBlogEntry(User user, String subject, String tags, String entryHeaders, String sml, List fileNames, List fileStreams, List fileTypes) {
if (!user.getAuthenticated()) return null;
BlogInfo info = getArchive().getBlogInfo(user.getBlog());
if (info == null) return null;
SigningPrivateKey privkey = getMyPrivateKey(info);
if (privkey == null) return null;
long entryId = -1;
long now = _context.clock().now();
long dayBegin = getDayBegin(now);
if (user.getMostRecentEntry() >= dayBegin)
entryId = user.getMostRecentEntry() + 1;
else
entryId = dayBegin;
StringTokenizer tok = new StringTokenizer(tags, " ,\n\t");
String tagList[] = new String[tok.countTokens()];
for (int i = 0; i < tagList.length; i++)
tagList[i] = tok.nextToken().trim();
BlogURI uri = new BlogURI(user.getBlog(), entryId);
try {
StringBuffer raw = new StringBuffer(sml.length() + 128);
raw.append("Subject: ").append(subject).append('\n');
raw.append("Tags: ");
for (int i = 0; i < tagList.length; i++)
raw.append(tagList[i]).append('\t');
raw.append('\n');
if ( (entryHeaders != null) && (entryHeaders.trim().length() > 0) ) {
System.out.println("Entry headers: " + entryHeaders);
BufferedReader userHeaders = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(DataHelper.getUTF8(entryHeaders)), "UTF-8"));
String line = null;
while ( (line = userHeaders.readLine()) != null) {
line = line.trim();
System.out.println("Line: " + line);
if (line.length() <= 0) continue;
int split = line.indexOf('=');
int split2 = line.indexOf(':');
if ( (split < 0) || ( (split2 > 0) && (split2 < split) ) ) split = split2;
String key = line.substring(0,split).trim();
String val = line.substring(split+1).trim();
raw.append(key).append(": ").append(val).append('\n');
}
}
raw.append('\n');
raw.append(sml);
EntryContainer c = new EntryContainer(uri, tagList, DataHelper.getUTF8(raw));
if ((fileNames != null) && (fileStreams != null) && (fileNames.size() == fileStreams.size()) ) {
for (int i = 0; i < fileNames.size(); i++) {
String name = (String)fileNames.get(i);
InputStream in = (InputStream)fileStreams.get(i);
String fileType = (fileTypes != null ? (String)fileTypes.get(i) : "application/octet-stream");
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
while (true) {
int read = in.read(buf);
if (read == -1) break;
baos.write(buf, 0, read);
}
byte att[] = baos.toByteArray();
if ( (att != null) && (att.length > 0) )
c.addAttachment(att, new File(name).getName(), null, fileType);
}
}
//for (int i = 7; i < args.length; i++) {
// c.addAttachment(read(args[i]), new File(args[i]).getName(),
// "Attached file", "application/octet-stream");
//}
SessionKey entryKey = null;
//if (!"NONE".equals(args[5]))
// entryKey = new SessionKey(Base64.decode(args[5]));
c.seal(_context, privkey, null);
boolean ok = getArchive().storeEntry(c);
if (ok) {
getArchive().regenerateIndex();
user.setMostRecentEntry(entryId);
saveUser(user);
return uri;
} else {
return null;
}
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
}
}
/**
* read in the syndie blog metadata file from the stream, verifying it and adding it to
* the archive if necessary
*
*/
public boolean importBlogMetadata(InputStream metadataStream) throws IOException {
try {
BlogInfo info = new BlogInfo();
info.load(metadataStream);
return _archive.storeBlogInfo(info);
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
/**
* read in the syndie entry file from the stream, verifying it and adding it to
* the archive if necessary
*
*/
public boolean importBlogEntry(InputStream entryStream) throws IOException {
try {
EntryContainer c = new EntryContainer();
c.load(entryStream);
return _archive.storeEntry(c);
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
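// Illustrative (not part of this commit): a syncing client that has fetched another
// archive's files to local disk could import them like this - the paths are hypothetical,
// and the entry filename follows Archive.getEntryFilename():
//   BlogManager mgr = BlogManager.instance();
//   boolean metaOk = mgr.importBlogMetadata(new FileInputStream("fetched/" + Archive.METADATA_FILE));
//   boolean entryOk = mgr.importBlogEntry(new FileInputStream("fetched/20050811001.snd"));
//   if (metaOk || entryOk)
//       mgr.getArchive().regenerateIndex();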
public String addAddress(User user, String name, String location, String schema) {
if (!user.getAuthenticated()) return "Not logged in";
boolean ok = validateAddressName(name);
if (!ok) return "Invalid name: " + HTMLRenderer.sanitizeString(name);
ok = validateAddressLocation(location);
if (!ok) return "Invalid location: " + HTMLRenderer.sanitizeString(location);
if (!validateAddressSchema(schema)) return "Unsupported schema: " + HTMLRenderer.sanitizeString(schema);
// no need to quote user/location further, as they've been sanitized
FileOutputStream out = null;
try {
File userHostsFile = new File(user.getAddressbookLocation());
Properties knownHosts = getKnownHosts(user, true);
if (knownHosts.containsKey(name)) return "Name is already in use";
out = new FileOutputStream(userHostsFile, true);
out.write(DataHelper.getUTF8(name + "=" + location + '\n'));
return "Address " + name + " written to your hosts file (" + userHostsFile.getName() + ")";
} catch (IOException ioe) {
return "Error writing out host entry: " + ioe.getMessage();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public Properties getKnownHosts(User user, boolean includePublic) throws IOException {
Properties rv = new Properties();
if ( (user != null) && (user.getAuthenticated()) ) {
File userHostsFile = new File(user.getAddressbookLocation());
rv.putAll(getKnownHosts(userHostsFile));
}
if (includePublic) {
rv.putAll(getKnownHosts(new File("hosts.txt")));
}
return rv;
}
private Properties getKnownHosts(File filename) throws IOException {
Properties rv = new Properties();
if (filename.exists()) {
rv.load(new FileInputStream(filename));
}
return rv;
}
private boolean validateAddressName(String name) {
if ( (name == null) || (name.trim().length() <= 0) || (!name.endsWith(".i2p")) ) return false;
for (int i = 0; i < name.length(); i++) {
char c = name.charAt(i);
if (!Character.isLetterOrDigit(c) && ('.' != c) && ('-' != c) && ('_' != c) )
return false;
}
return true;
}
private boolean validateAddressLocation(String location) {
if ( (location == null) || (location.trim().length() <= 0) ) return false;
try {
Destination d = new Destination(location);
return (d.getPublicKey() != null);
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return false;
}
}
private boolean validateAddressSchema(String schema) {
if ( (schema == null) || (schema.trim().length() <= 0) ) return false;
return "eep".equals(schema) || "i2p".equals(schema);
}
private final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
private final long getDayBegin(long now) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(now));
return _dateFormat.parse(str).getTime();
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return -1;
}
}
}
}
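Putting the pieces above together, a minimal hypothetical flow for creating an account and posting to its blog could look like the following (the login, password, and SML body are made up, and no registration password is assumed to be configured):

BlogManager mgr = BlogManager.instance();
User user = new User();
String result = mgr.register(user, "alice", "secret", null, "alice's blog", "a test blog", "alice@mail.i2p");
if (User.LOGIN_OK.equals(result)) {
    BlogURI posted = mgr.createBlogEntry(user, "hello world", "test, intro", null,
                                         "Just making sure posting works.");
    System.out.println("posted: " + posted);
} else {
    System.err.println("registration failed: " + result);
}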


@@ -0,0 +1,188 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
*/
public class CLI {
public static final String USAGE = "Usage: \n" +
"rootDir regenerateIndex\n" +
"rootDir createBlog name description contactURL[ archiveURL]*\n" +
"rootDir createEntry blogPublicKeyHash tag[,tag]* (NOW|entryId) (NONE|entryKeyBase64) smlFile[ attachmentFile]*\n" +
"rootDir listMyBlogs\n" +
"rootDir listTags blogPublicKeyHash\n" +
"rootDir listEntries blogPublicKeyHash blogTag\n" +
"rootDir renderEntry blogPublicKeyHash entryId (NONE|entryKeyBase64) summaryOnly includeImages\n";
public static void main(String args[]) {
//args = new String[] { "~/.syndie/", "listEntries", "9qXCJUyUBCCaiIShURo02ckxjrMvrtiDYENv2ATL3-Y=", "/" };
//args = new String[] { "~/.syndie/", "renderEntry", "Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=", "/", "20050811001", "NONE", "true", "false" };
if (args.length < 2) {
System.err.print(USAGE);
return;
}
String command = args[1];
if ("createBlog".equals(command))
createBlog(args);
else if ("listMyBlogs".equals(command))
listMyBlogs(args);
else if ("createEntry".equals(command))
createEntry(args);
else if ("listTags".equals(command))
listPaths(args);
else if ("listEntries".equals(command))
listEntries(args);
else if ("regenerateIndex".equals(command))
regenerateIndex(args);
else if ("renderEntry".equals(command))
renderEntry(args);
else
System.out.print(USAGE);
}
private static void createBlog(String args[]) {
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
String archives[] = new String[args.length - 5];
System.arraycopy(args, 5, archives, 0, archives.length);
BlogInfo info = mgr.createBlog(args[2], args[3], args[4], archives);
System.out.println("Blog created: " + info);
mgr.getArchive().regenerateIndex();
}
private static void listMyBlogs(String args[]) {
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List info = mgr.listMyBlogs();
for (int i = 0; i < info.size(); i++)
System.out.println(info.get(i).toString());
}
private static void listPaths(String args[]) {
// "rootDir listTags blogPublicKeyHash\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List tags = mgr.getArchive().listTags(new Hash(Base64.decode(args[2])));
System.out.println("tag count: " + tags.size());
for (int i = 0; i < tags.size(); i++)
System.out.println("Tag " + i + ": " + tags.get(i).toString());
}
private static void regenerateIndex(String args[]) {
// "rootDir regenerateIndex\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
mgr.getArchive().regenerateIndex();
System.out.println("Index regenerated");
}
private static void listEntries(String args[]) {
// "rootDir listEntries blogPublicKeyHash tag\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List entries = mgr.getArchive().listEntries(new Hash(Base64.decode(args[2])), -1, args[3], null);
System.out.println("Entry count: " + entries.size());
for (int i = 0; i < entries.size(); i++) {
EntryContainer entry = (EntryContainer)entries.get(i);
System.out.println("***************************************************");
System.out.println("Entry " + i + ": " + entry.getURI().toString());
System.out.println("===================================================");
System.out.println(entry.getEntry().getText());
System.out.println("===================================================");
Attachment attachments[] = entry.getAttachments();
for (int j = 0; j < attachments.length; j++) {
System.out.println("Attachment " + j + ": " + attachments[j]);
}
System.out.println("===================================================");
}
}
private static void renderEntry(String args[]) {
//"rootDir renderEntry blogPublicKeyHash entryId (NONE|entryKeyBase64) summaryOnly includeImages\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
long id = -1;
try {
id = Long.parseLong(args[3]);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return;
}
SessionKey entryKey = null;
if (!("NONE".equals(args[4])))
entryKey = new SessionKey(Base64.decode(args[4]));
EntryContainer entry = mgr.getArchive().getEntry(new BlogURI(new Hash(Base64.decode(args[2])), id), entryKey);
if (entry != null) {
HTMLRenderer renderer = new HTMLRenderer();
boolean summaryOnly = "true".equalsIgnoreCase(args[5]);
boolean showImages = "true".equalsIgnoreCase(args[6]);
try {
File f = File.createTempFile("syndie", ".html");
Writer out = new OutputStreamWriter(new FileOutputStream(f), "UTF-8");
renderer.render(null, mgr.getArchive(), entry, out, summaryOnly, showImages);
out.flush();
out.close();
System.out.println("Rendered to " + f.getAbsolutePath() + ": " + f.length());
} catch (IOException ioe) {
ioe.printStackTrace();
}
} else {
System.err.println("Entry does not exist");
}
}
private static void createEntry(String args[]) {
// "rootDir createEntry blogPublicKey tag[,tag]* (NOW|entryId) (NONE|entryKeyBase64) smlFile[ attachmentFile]*\n" +
I2PAppContext ctx = I2PAppContext.getGlobalContext();
BlogManager mgr = new BlogManager(ctx, args[0]);
long entryId = -1;
if ("NOW".equals(args[4])) {
entryId = ctx.clock().now();
} else {
try {
entryId = Long.parseLong(args[4]);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return;
}
}
StringTokenizer tok = new StringTokenizer(args[3], ",");
String tags[] = new String[tok.countTokens()];
for (int i = 0; i < tags.length; i++)
tags[i] = tok.nextToken();
BlogURI uri = new BlogURI(new Hash(Base64.decode(args[2])), entryId);
BlogInfo blog = mgr.getArchive().getBlogInfo(uri);
if (blog == null) {
System.err.println("Blog does not exist: " + uri);
return;
}
SigningPrivateKey key = mgr.getMyPrivateKey(blog);
try {
byte smlData[] = read(args[6]);
EntryContainer c = new EntryContainer(uri, tags, smlData);
for (int i = 7; i < args.length; i++) {
c.addAttachment(read(args[i]), new File(args[i]).getName(),
"Attached file", "application/octet-stream");
}
SessionKey entryKey = null;
if (!"NONE".equals(args[5]))
entryKey = new SessionKey(Base64.decode(args[5]));
c.seal(ctx, key, entryKey);
boolean ok = mgr.getArchive().storeEntry(c);
System.out.println("Blog entry created: " + c+ "? " + ok);
if (ok)
mgr.getArchive().regenerateIndex();
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
private static final byte[] read(String file) throws IOException {
File f = new File(file);
FileInputStream in = new FileInputStream(f);
byte rv[] = new byte[(int)f.length()];
if (rv.length != DataHelper.read(in, rv))
throw new IOException("File not read completely");
return rv;
}
}
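For reference, a few hypothetical invocations matching the USAGE string above (the blog key hash is the one from the commented-out example, and entry.sml is a placeholder file):

// regenerate the index of the archive rooted at ./syndie
CLI.main(new String[] { "./syndie", "regenerateIndex" });
// list the tags used under a particular blog
CLI.main(new String[] { "./syndie", "listTags", "9qXCJUyUBCCaiIShURo02ckxjrMvrtiDYENv2ATL3-Y=" });
// post a new unencrypted entry from an SML file, using the current time as the entryId
CLI.main(new String[] { "./syndie", "createEntry", "9qXCJUyUBCCaiIShURo02ckxjrMvrtiDYENv2ATL3-Y=",
                        "test,intro", "NOW", "NONE", "entry.sml" });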


@@ -0,0 +1,237 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Lazy loading wrapper for an entry, pulling data out of a cached & extracted dir,
* rather than dealing with the crypto, zip, etc.
*
*/
class CachedEntry extends EntryContainer {
private File _entryDir;
private int _format;
private int _size;
private BlogURI _blog;
private Properties _headers;
private Entry _entry;
private Attachment _attachments[];
public CachedEntry(File entryDir) throws IOException {
_entryDir = entryDir;
importMeta();
_entry = new CachedEntryDetails();
_attachments = null;
}
// always available, loaded from meta
public int getFormat() { return _format; }
public BlogURI getURI() { return _blog; }
public int getCompleteSize() { return _size; }
// don't need to override it, as it works off getHeader
//public String[] getTags() { return super.getTags(); }
public Entry getEntry() { return _entry; }
public Attachment[] getAttachments() {
importAttachments();
return _attachments;
}
public String getHeader(String key) {
importHeaders();
return _headers.getProperty(key);
}
public String toString() { return getURI().toString(); }
public boolean verifySignature(I2PAppContext ctx, BlogInfo info) { return true; }
// not supported...
public void parseRawData(I2PAppContext ctx) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public void parseRawData(I2PAppContext ctx, SessionKey zipKey) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public void setHeader(String name, String val) {
throw new IllegalStateException("Not supported on cached entries");
}
public void addAttachment(byte data[], String name, String description, String mimeType) {
throw new IllegalStateException("Not supported on cached entries");
}
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public Signature getSignature() {
throw new IllegalStateException("Not supported on cached entries");
}
// now the actual lazy loading code
private void importMeta() {
Properties meta = readProps(new File(_entryDir, EntryExtractor.META));
_format = getInt(meta, "format");
_size = getInt(meta, "size");
_blog = new BlogURI(new Hash(Base64.decode(meta.getProperty("blog"))), getLong(meta, "entry"));
}
private Properties importHeaders() {
if (_headers == null)
_headers = readProps(new File(_entryDir, EntryExtractor.HEADERS));
return _headers;
}
private void importAttachments() {
if (_attachments == null) {
List attachments = new ArrayList();
int i = 0;
while (true) {
File meta = new File(_entryDir, EntryExtractor.ATTACHMENT_PREFIX + i + EntryExtractor.ATTACHMENT_META_SUFFIX);
if (meta.exists())
attachments.add(new CachedAttachment(i, meta));
else
break;
i++;
}
Attachment a[] = new Attachment[attachments.size()];
for (i = 0; i < a.length; i++)
a[i] = (Attachment)attachments.get(i);
_attachments = a;
}
return;
}
private static Properties readProps(File propsFile) {
Properties rv = new Properties();
BufferedReader in = null;
try {
in = new BufferedReader(new InputStreamReader(new FileInputStream(propsFile), "UTF-8"));
String line = null;
while ( (line = in.readLine()) != null) {
int split = line.indexOf('=');
if ( (split <= 0) || (split >= line.length()) ) continue;
rv.setProperty(line.substring(0, split).trim(), line.substring(split+1).trim());
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
}
return rv;
}
private static final int getInt(Properties props, String key) {
String val = props.getProperty(key);
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) {}
return -1;
}
private static final long getLong(Properties props, String key) {
String val = props.getProperty(key);
try { return Long.parseLong(val); } catch (NumberFormatException nfe) {}
return -1l;
}
private class CachedEntryDetails extends Entry {
private String _text;
public CachedEntryDetails() {
super(null);
}
public String getText() {
importText();
return _text;
}
private void importText() {
if (_text == null) {
InputStream in = null;
try {
File f = new File(_entryDir, EntryExtractor.ENTRY);
byte buf[] = new byte[(int)f.length()]; // hmm
in = new FileInputStream(f);
int read = DataHelper.read(in, buf);
if (read != buf.length) throw new IOException("read: " + read + " file size: " + buf.length + " for " + f.getPath());
_text = DataHelper.getUTF8(buf);
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
}
}
}
}
private class CachedAttachment extends Attachment {
private int _attachmentId;
private File _metaFile;
private Properties _attachmentHeaders;
private int _dataSize;
public CachedAttachment(int id, File meta) {
super(null, null);
_attachmentId = id;
_metaFile = meta;
_attachmentHeaders = null;
}
public int getDataLength() {
importAttachmentHeaders();
return _dataSize;
}
public byte[] getData() {
throw new IllegalStateException("Not supported on cached entries");
}
public InputStream getDataStream() throws IOException {
String name = EntryExtractor.ATTACHMENT_PREFIX + _attachmentId + EntryExtractor.ATTACHMENT_DATA_SUFFIX;
File f = new File(_entryDir, name);
return new FileInputStream(f);
}
public byte[] getRawMetadata() {
throw new IllegalStateException("Not supported on cached entries");
}
public String getMeta(String key) {
importAttachmentHeaders();
return _attachmentHeaders.getProperty(key);
}
//public String getName() { return getMeta(NAME); }
//public String getDescription() { return getMeta(DESCRIPTION); }
//public String getMimeType() { return getMeta(MIMETYPE); }
public void setMeta(String key, String val) {
throw new IllegalStateException("Not supported on cached entries");
}
public Map getMeta() {
importAttachmentHeaders();
return _attachmentHeaders;
}
public String toString() {
importAttachmentHeaders();
int len = _dataSize;
return getName()
+ (getDescription() != null ? ": " + getDescription() : "")
+ (getMimeType() != null ? ", type: " + getMimeType() : "")
+ ", size: " + len;
}
private void importAttachmentHeaders() {
if (_attachmentHeaders == null) {
Properties props = readProps(_metaFile);
String sz = (String)props.remove(EntryExtractor.ATTACHMENT_DATA_SIZE);
if (sz != null) {
try {
_dataSize = Integer.parseInt(sz);
} catch (NumberFormatException nfe) {}
}
_attachmentHeaders = props;
}
}
}
}
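Callers never construct this class directly; they read entries through the Archive and get the lazy loading for free. A small hypothetical sketch (the blog key and entryId are placeholders taken from the commented-out CLI example):

Archive archive = BlogManager.instance().getArchive();
BlogURI uri = new BlogURI(new Hash(Base64.decode("Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=")), 20050811001L);
EntryContainer entry = archive.getEntry(uri); // may be a CachedEntry backed by the extracted dir
if (entry != null) {
    String tags[] = entry.getTags();                   // headers.txt is read on first header access
    String text = entry.getEntry().getText();          // entry.sml is read on first access
    Attachment attachments[] = entry.getAttachments(); // attachmentN_meta.txt files are scanned lazily
    System.out.println(uri + ": " + tags.length + " tags, " + attachments.length + " attachments, "
                       + text.length() + " chars");
}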


@@ -0,0 +1,132 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.I2PAppContext;
/**
* To cut down on unnecessary IO/CPU load, extract entries onto the disk for
* faster access later. An individual entry stored as
* $archiveDir/$blogDir/$entryId.snd is extracted into several files under
* $cacheDir/$blogDir/$entryId/:
* headers.txt: name=value pairs for attributes of the entry container itself
* meta.txt: name=value pairs for implicit attributes of the container (blog, id, format, size)
* entry.sml: raw SML file
* attachmentN_data.dat: raw binary data for attachment N
* attachmentN_meta.txt: name=value pairs for attributes of attachment N
*
*/
public class EntryExtractor {
private I2PAppContext _context;
static final String HEADERS = "headers.txt";
static final String META = "meta.txt";
static final String ENTRY = "entry.sml";
static final String ATTACHMENT_PREFIX = "attachment";
static final String ATTACHMENT_DATA_SUFFIX = "_data.dat";
static final String ATTACHMENT_META_SUFFIX = "_meta.txt";
static final String ATTACHMENT_DATA_SIZE = "EntryExtractor__dataSize";
public EntryExtractor(I2PAppContext context) {
_context = context;
}
public boolean extract(File entryFile, File entryDir, SessionKey entryKey, BlogInfo info) throws IOException {
EntryContainer entry = new EntryContainer();
entry.load(new FileInputStream(entryFile));
boolean ok = entry.verifySignature(_context, info);
if (!ok) {
return false;
} else {
entry.setCompleteSize((int)entryFile.length());
if (entryKey != null)
entry.parseRawData(_context, entryKey);
else
entry.parseRawData(_context);
extract(entry, entryDir);
return true;
}
}
public void extract(EntryContainer entry, File entryDir) throws IOException {
extractHeaders(entry, entryDir);
extractMeta(entry, entryDir);
extractEntry(entry, entryDir);
Attachment attachments[] = entry.getAttachments();
if (attachments != null) {
for (int i = 0; i < attachments.length; i++) {
extractAttachmentData(i, attachments[i], entryDir);
extractAttachmentMetadata(i, attachments[i], entryDir);
}
}
}
private void extractHeaders(EntryContainer entry, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, HEADERS));
Map headers = entry.getHeaders();
for (Iterator iter = headers.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = (String)headers.get(k);
out.write(DataHelper.getUTF8(k.trim() + '=' + v.trim() + '\n'));
}
} finally {
out.close();
}
}
private void extractMeta(EntryContainer entry, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, META));
out.write(DataHelper.getUTF8("format=" + entry.getFormat() + '\n'));
out.write(DataHelper.getUTF8("size=" + entry.getCompleteSize() + '\n'));
out.write(DataHelper.getUTF8("blog=" + entry.getURI().getKeyHash().toBase64() + '\n'));
out.write(DataHelper.getUTF8("entry=" + entry.getURI().getEntryId() + '\n'));
} finally {
out.close();
}
}
private void extractEntry(EntryContainer entry, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ENTRY));
out.write(DataHelper.getUTF8(entry.getEntry().getText()));
} finally {
out.close();
}
}
private void extractAttachmentData(int num, Attachment attachment, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ATTACHMENT_PREFIX + num + ATTACHMENT_DATA_SUFFIX));
//out.write(attachment.getData());
InputStream data = attachment.getDataStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = data.read(buf)) != -1)
out.write(buf, 0, read);
data.close();
} finally {
out.close();
}
}
private void extractAttachmentMetadata(int num, Attachment attachment, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ATTACHMENT_PREFIX + num + ATTACHMENT_META_SUFFIX));
Map meta = attachment.getMeta();
for (Iterator iter = meta.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = (String)meta.get(k);
out.write(DataHelper.getUTF8(k + '=' + v + '\n'));
}
out.write(DataHelper.getUTF8(ATTACHMENT_DATA_SIZE + '=' + attachment.getDataLength()));
} finally {
out.close();
}
}
}
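A hedged sketch of driving the extractor by hand (the Archive normally does this itself when it needs an uncached entry; the blog key and entryId are placeholders, and the surrounding method is assumed to declare throws IOException):

I2PAppContext ctx = I2PAppContext.getGlobalContext();
Archive archive = BlogManager.instance().getArchive();
Hash blog = new Hash(Base64.decode("Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8="));
BlogInfo info = archive.getBlogInfo(blog);
File entryFile = new File("./syndie/archive/" + blog.toBase64() + "/20050811001.snd");
File entryDir = new File("./syndie/cache/" + blog.toBase64() + "/20050811001");
entryDir.mkdirs();
EntryExtractor extractor = new EntryExtractor(ctx);
boolean ok = extractor.extract(entryFile, entryDir, null, info); // null entryKey: entry body is not encrypted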


@@ -0,0 +1,231 @@
package net.i2p.syndie;
import java.io.UnsupportedEncodingException;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
/**
* User session state and preferences.
*
*/
public class User {
private I2PAppContext _context;
private String _username;
private String _hashedPassword;
private Hash _blog;
private long _mostRecentEntry;
/** Group name to List of blog selectors, where the selectors are of the form
* blog://$key, entry://$key/$entryId, blogtag://$key/$tag, tag://$tag
*/
private Map _blogGroups;
/** list of blogs (Hash) we never want to see entries from */
private List _shitlistedBlogs;
/** where our userhosts.txt is */
private String _addressbookLocation;
private boolean _showImagesByDefault;
private boolean _showExpandedByDefault;
private String _defaultSelector;
private long _lastLogin;
private long _lastMetaEntry;
private boolean _allowAccessRemote;
private boolean _authenticated;
private String _eepProxyHost;
private int _eepProxyPort;
private String _webProxyHost;
private int _webProxyPort;
private String _torProxyHost;
private int _torProxyPort;
public User() {
_context = I2PAppContext.getGlobalContext();
init();
}
private void init() {
_authenticated = false;
_username = null;
_hashedPassword = null;
_blog = null;
_mostRecentEntry = -1;
_blogGroups = new HashMap();
_shitlistedBlogs = new ArrayList();
_defaultSelector = null;
_addressbookLocation = "userhosts.txt";
_showImagesByDefault = false;
_showExpandedByDefault = false;
_allowAccessRemote = false;
_eepProxyHost = null;
_webProxyHost = null;
_torProxyHost = null;
_eepProxyPort = -1;
_webProxyPort = -1;
_torProxyPort = -1;
_lastLogin = -1;
_lastMetaEntry = 0;
}
public boolean getAuthenticated() { return _authenticated; }
public String getUsername() { return _username; }
public Hash getBlog() { return _blog; }
public String getBlogStr() { return Base64.encode(_blog.getData()); }
public long getMostRecentEntry() { return _mostRecentEntry; }
public Map getBlogGroups() { return _blogGroups; }
public List getShitlistedBlogs() { return _shitlistedBlogs; }
public String getAddressbookLocation() { return _addressbookLocation; }
public boolean getShowImages() { return _showImagesByDefault; }
public boolean getShowExpanded() { return _showExpandedByDefault; }
public long getLastLogin() { return _lastLogin; }
public String getHashedPassword() { return _hashedPassword; }
public long getLastMetaEntry() { return _lastMetaEntry; }
public String getDefaultSelector() { return _defaultSelector; }
public void setDefaultSelector(String sel) { _defaultSelector = sel; }
public boolean getAllowAccessRemote() { return _allowAccessRemote; }
public void setAllowAccessRemote(boolean allow) { _allowAccessRemote = allow; }
public void setMostRecentEntry(long id) { _mostRecentEntry = id; }
public void setLastMetaEntry(long id) { _lastMetaEntry = id; }
public String getEepProxyHost() { return _eepProxyHost; }
public int getEepProxyPort() { return _eepProxyPort; }
public String getWebProxyHost() { return _webProxyHost; }
public int getWebProxyPort() { return _webProxyPort; }
public String getTorProxyHost() { return _torProxyHost; }
public int getTorProxyPort() { return _torProxyPort; }
public void invalidate() {
BlogManager.instance().saveUser(this);
init();
}
public String login(String login, String pass, Properties props) {
String expectedPass = props.getProperty("password");
String hpass = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(pass)).getData());
if (!hpass.equals(expectedPass)) {
_authenticated = false;
return "Incorrect password";
}
_username = login;
_hashedPassword = expectedPass;
// blog=luS9d3uaf....HwAE=
String b = props.getProperty("blog");
if (b != null) _blog = new Hash(Base64.decode(b));
// lastid=12345
String id = props.getProperty("lastid");
if (id != null) try { _mostRecentEntry = Long.parseLong(id); } catch (NumberFormatException nfe) {}
// lastmetaedition=12345
id = props.getProperty("lastmetaedition");
if (id != null) try { _lastMetaEntry = Long.parseLong(id); } catch (NumberFormatException nfe) {}
// groups=abc:selector,selector,selector,selector def:selector,selector,selector
StringTokenizer tok = new StringTokenizer(props.getProperty("groups", ""), " ");
while (tok.hasMoreTokens()) {
String group = tok.nextToken();
int endName = group.indexOf(':');
if (endName <= 0)
continue;
String groupName = group.substring(0, endName);
String sel = group.substring(endName+1);
List selectors = new ArrayList();
while ( (sel != null) && (sel.length() > 0) ) {
int end = sel.indexOf(',');
if (end < 0) {
selectors.add(sel);
sel = null;
} else {
if (end + 1 >= sel.length()) {
selectors.add(sel.substring(0,end));
sel = null;
} else if (end == 0) {
sel = sel.substring(1);
} else {
selectors.add(sel.substring(0, end));
sel = sel.substring(end+1);
}
}
}
_blogGroups.put(groupName.trim(), selectors);
}
// shitlist=hash,hash,hash
tok = new StringTokenizer(props.getProperty("shitlistedblogs", ""), ",");
while (tok.hasMoreTokens()) {
String blog = tok.nextToken();
byte bl[] = Base64.decode(blog);
if ( (bl != null) && (bl.length == Hash.HASH_LENGTH) )
_shitlistedBlogs.add(new Hash(bl));
}
String addr = props.getProperty("addressbook", "userhosts.txt");
if (addr != null)
_addressbookLocation = addr;
String show = props.getProperty("showimages", "false");
_showImagesByDefault = (show != null) && (show.equals("true"));
show = props.getProperty("showexpanded", "false");
_showExpandedByDefault = (show != null) && (show.equals("true"));
_defaultSelector = props.getProperty("defaultselector");
String allow = props.getProperty("allowaccessremote", "false");
_allowAccessRemote = (allow != null) && (allow.equals("true"));
_eepProxyPort = getInt(props.getProperty("eepproxyport"));
_webProxyPort = getInt(props.getProperty("webproxyport"));
_torProxyPort = getInt(props.getProperty("torproxyport"));
_eepProxyHost = props.getProperty("eepproxyhost");
_webProxyHost = props.getProperty("webproxyhost");
_torProxyHost = props.getProperty("torproxyhost");
_lastLogin = _context.clock().now();
_authenticated = true;
return LOGIN_OK;
}
private int getInt(String val) {
if (val == null) return -1;
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) { return -1; }
}
public static final String LOGIN_OK = "Logged in";
public String export() {
StringBuffer buf = new StringBuffer(512);
buf.append("password=" + getHashedPassword() + "\n");
buf.append("blog=" + getBlog().toBase64() + "\n");
buf.append("lastid=" + getMostRecentEntry() + "\n");
buf.append("lastmetaedition=" + getLastMetaEntry() + "\n");
buf.append("lastlogin=" + getLastLogin() + "\n");
buf.append("addressbook=" + getAddressbookLocation() + "\n");
buf.append("showimages=" + getShowImages() + "\n");
buf.append("showexpanded=" + getShowExpanded() + "\n");
buf.append("defaultselector=" + getDefaultSelector() + "\n");
buf.append("allowaccessremote=" + _allowAccessRemote + "\n");
buf.append("groups=");
Map groups = getBlogGroups();
for (Iterator iter = groups.keySet().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
List selectors = (List)groups.get(name);
buf.append(name).append(':');
for (int i = 0; i < selectors.size(); i++) {
buf.append(selectors.get(i));
if (i + 1 < selectors.size())
buf.append(",");
}
if (iter.hasNext())
buf.append(' ');
}
buf.append('\n');
// shitlist=hash,hash,hash
List shitlistedBlogs = getShitlistedBlogs();
if (shitlistedBlogs.size() > 0) {
buf.append("shitlistedblogs=");
for (int i = 0; i < shitlistedBlogs.size(); i++) {
Hash blog = (Hash)shitlistedBlogs.get(i);
buf.append(blog.toBase64());
if (i + 1 < shitlistedBlogs.size())
buf.append(',');
}
buf.append('\n');
}
return buf.toString();
}
}
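For reference, a hypothetical user file in the format register() creates and login()/export() consume (all values are placeholders; password and blog are base64-encoded hashes):

password=<base64 hash of the password>
blog=Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=
lastid=20050811001
lastmetaedition=0
addressbook=userhosts-<base64 hash of the login>.txt
showimages=false
showexpanded=false
defaultselector=blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=
allowaccessremote=false
groups=friends:blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=,tag://i2p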


@@ -0,0 +1,11 @@
package net.i2p.syndie;
/**
*
*/
public class Version {
public static final String VERSION = "0-alpha";
public static final String BUILD = "0";
public static final String INDEX_VERSION = "1.0";
public static final String ID = "$Id$";
}


@@ -0,0 +1,438 @@
package net.i2p.syndie.data;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
import net.i2p.syndie.BlogManager;
/**
* Simple read-only summary of an archive
*/
public class ArchiveIndex {
protected String _version;
protected long _generatedOn;
protected int _allBlogs;
protected int _newBlogs;
protected int _allEntries;
protected int _newEntries;
protected long _totalSize;
protected long _newSize;
/** list of BlogSummary objects */
protected List _blogs;
/** list of Hash objects */
protected List _newestBlogs;
/** list of BlogURI objects */
protected List _newestEntries;
/** parent message to a set of replies, ordered with the oldest first */
protected Map _replies;
protected Properties _headers;
public ArchiveIndex() {
this(false); //true);
}
public ArchiveIndex(boolean shouldLoad) {
_blogs = new ArrayList();
_newestBlogs = new ArrayList();
_newestEntries = new ArrayList();
_headers = new Properties();
_replies = Collections.synchronizedMap(new HashMap());
_generatedOn = -1;
if (shouldLoad)
setIsLocal("true");
}
public String getVersion() { return _version; }
public Properties getHeaders() { return _headers; }
public int getAllBlogs() { return _allBlogs; }
public int getNewBlogs() { return _newBlogs; }
public int getAllEntries() { return _allEntries; }
public int getNewEntries() { return _newEntries; }
public long getTotalSize() { return _totalSize; }
public long getNewSize() { return _newSize; }
public long getGeneratedOn() { return _generatedOn; }
public String getNewSizeStr() {
if (_newSize < 1024) return _newSize + "";
if (_newSize < 1024*1024) return _newSize/1024 + "KB";
else return _newSize/(1024*1024) + "MB";
}
public String getTotalSizeStr() {
if (_totalSize < 1024) return _totalSize + "";
if (_totalSize < 1024*1024) return _totalSize/1024 + "KB";
else return _totalSize/(1024*1024) + "MB";
}
/** how many blogs/tags are indexed */
public int getIndexBlogs() { return _blogs.size(); }
/** get the blog used for the given blog/tag pair */
public Hash getBlog(int index) { return ((BlogSummary)_blogs.get(index)).blog; }
/** get the tag used for the given blog/tag pair */
public String getBlogTag(int index) { return ((BlogSummary)_blogs.get(index)).tag; }
/** get the highest entry ID for the given blog/tag pair */
public long getBlogLastUpdated(int index) { return ((BlogSummary)_blogs.get(index)).lastUpdated; }
/** get the entry count for the given blog/tag pair */
public int getBlogEntryCount(int index) { return ((BlogSummary)_blogs.get(index)).entries.size(); }
/** get the entry from the given blog/tag pair */
public BlogURI getBlogEntry(int index, int entryIndex) { return ((EntrySummary)((BlogSummary)_blogs.get(index)).entries.get(entryIndex)).entry; }
/** get the raw entry size (including attachments) from the given blog/tag pair */
public long getBlogEntrySizeKB(int index, int entryIndex) { return ((EntrySummary)((BlogSummary)_blogs.get(index)).entries.get(entryIndex)).size; }
public boolean getEntryIsKnown(BlogURI uri) { return getEntry(uri) != null; }
public long getBlogEntrySizeKB(BlogURI uri) {
EntrySummary entry = getEntry(uri);
if (entry == null) return -1;
return entry.size;
}
private EntrySummary getEntry(BlogURI uri) {
if ( (uri == null) || (uri.getKeyHash() == null) || (uri.getEntryId() < 0) ) return null;
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(uri.getKeyHash())) {
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
if (entry.entry.equals(uri))
return entry;
}
}
}
return null;
}
public Set getBlogEntryTags(BlogURI uri) {
Set tags = new HashSet();
if ( (uri == null) || (uri.getKeyHash() == null) || (uri.getEntryId() < 0) ) return tags;
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(uri.getKeyHash())) {
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
if (entry.entry.equals(uri)) {
tags.add(summary.tag);
break;
}
}
}
}
return tags;
}
/** how many 'new' blogs are listed */
public int getNewestBlogCount() { return _newestBlogs.size(); }
public Hash getNewestBlog(int index) { return (Hash)_newestBlogs.get(index); }
/** how many 'new' entries are listed */
public int getNewestBlogEntryCount() { return _newestEntries.size(); }
public BlogURI getNewestBlogEntry(int index) { return (BlogURI)_newestEntries.get(index); }
/** list of locally known tags (String) under the given blog */
public List getBlogTags(Hash blog) {
List rv = new ArrayList();
for (int i = 0; i < _blogs.size(); i++) {
if (getBlog(i).equals(blog))
rv.add(getBlogTag(i));
}
return rv;
}
/** list of unique blogs locally known (set of Hash) */
public Set getUniqueBlogs() {
Set rv = new HashSet();
for (int i = 0; i < _blogs.size(); i++)
rv.add(getBlog(i));
return rv;
}
public List getReplies(BlogURI uri) {
Set replies = (Set)_replies.get(uri);
if (replies == null) return Collections.EMPTY_LIST;
synchronized (replies) {
return new ArrayList(replies);
}
}
public void setLocation(String location) {
try {
File l = new File(location);
if (l.exists())
load(l);
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public void setIsLocal(String val) {
if ("true".equals(val)) {
try {
File dir = BlogManager.instance().getArchive().getArchiveDir();
load(new File(dir, Archive.INDEX_FILE));
} catch (IOException ioe) {}
}
}
public void load(File location) throws IOException {
FileInputStream in = null;
try {
in = new FileInputStream(location);
load(in);
} finally {
if (in != null)
try { in.close(); } catch (IOException ioe) {}
}
}
/** load up the index from an archive.txt */
public void load(InputStream index) throws IOException {
_allBlogs = 0;
_allEntries = 0;
_newBlogs = 0;
_newEntries = 0;
_newSize = 0;
_totalSize = 0;
_version = null;
_blogs = new ArrayList();
_newestBlogs = new ArrayList();
_newestEntries = new ArrayList();
_headers = new Properties();
BufferedReader in = new BufferedReader(new InputStreamReader(index, "UTF-8"));
String line = null;
line = in.readLine();
if (line == null)
return;
if (!line.startsWith("SyndieVersion:"))
throw new IOException("Index is invalid - it starts with " + line);
_version = line.substring("SyndieVersion:".length()).trim();
if (!_version.startsWith("1."))
throw new IOException("Index is not supported, we only handle versions 1.*, but it is " + _version);
while ( (line = in.readLine()) != null) {
if (line.length() <= 0)
break;
if (line.startsWith("Blog:")) break;
int split = line.indexOf(':');
if (split <= 0) continue;
if (split >= line.length()-1) continue;
_headers.setProperty(line.substring(0, split), line.substring(split+1));
}
if (line != null) {
do {
if (!line.startsWith("Blog:"))
break;
loadBlog(line);
} while ( (line = in.readLine()) != null);
}
// the Blog: section ended at a line that doesn't start with "Blog:" (normally blank); the remaining lines hold the summary stats
while ( (line = in.readLine()) != null) {
int split = line.indexOf(':');
if (split <= 0) continue;
if (split >= line.length()-1) continue;
String key = line.substring(0, split);
String val = line.substring(split+1);
if (key.equals("AllBlogs"))
_allBlogs = getInt(val);
else if (key.equals("NewBlogs"))
_newBlogs = getInt(val);
else if (key.equals("AllEntries"))
_allEntries = getInt(val);
else if (key.equals("NewEntries"))
_newEntries = getInt(val);
else if (key.equals("TotalSize"))
_totalSize = getInt(val);
else if (key.equals("NewSize"))
_newSize = getInt(val);
else if (key.equals("NewestBlogs"))
_newestBlogs = parseNewestBlogs(val);
else if (key.equals("NewestEntries"))
_newestEntries = parseNewestEntries(val);
//else
// System.err.println("Key: " + key + " val: " + val);
}
}
/**
* Dig through the index for BlogURIs matching the given criteria, ordering the results by
* their own entryIds.
*
* @param out where to store the matches
* @param blog if set, what blog key must the entries be under
* @param tag if set, what tag must the entry be in
*
*/
public void selectMatchesOrderByEntryId(List out, Hash blog, String tag) {
TreeMap ordered = new TreeMap();
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (blog != null) {
if (!blog.equals(summary.blog))
continue;
}
if (tag != null) {
if (!tag.equals(summary.tag)) {
System.out.println("Tag [" + summary.tag + "] does not match the requested [" + tag + "] in " + summary.blog.toBase64());
if (false) {
StringBuffer b = new StringBuffer(tag.length()*2);
for (int j = 0; j < tag.length(); j++) {
b.append((int)tag.charAt(j));
b.append('.');
if (summary.tag.length() > j+1)
b.append((int)summary.tag.charAt(j));
else
b.append('_');
b.append(' ');
}
System.out.println("tag.summary: " + b.toString());
}
continue;
}
}
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
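// sort key: (Long.MAX_VALUE - entryId) is 19 digits for any millisecond-based entry id,
// so the TreeMap's lexicographic order puts the newest entries first; the base64 key hash
// just breaks ties between entries sharing the same id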
String k = (Long.MAX_VALUE-entry.entry.getEntryId()) + "-" + entry.entry.getKeyHash().toBase64();
ordered.put(k, entry.entry);
//System.err.println("Including match: " + k);
}
}
for (Iterator iter = ordered.values().iterator(); iter.hasNext(); ) {
BlogURI entry = (BlogURI)iter.next();
if (!out.contains(entry))
out.add(entry);
}
}
private static final int getInt(String val) {
try {
return Integer.parseInt(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return 0;
}
}
private List parseNewestBlogs(String vals) {
List rv = new ArrayList();
StringTokenizer tok = new StringTokenizer(vals, " \t\n");
while (tok.hasMoreTokens())
rv.add(new Hash(Base64.decode(tok.nextToken())));
return rv;
}
private List parseNewestEntries(String vals) {
List rv = new ArrayList();
StringTokenizer tok = new StringTokenizer(vals, " \t\n");
while (tok.hasMoreTokens())
rv.add(new BlogURI(tok.nextToken()));
return rv;
}
private void loadBlog(String line) throws IOException {
// Blog: hash YYYYMMDD tag\t[ yyyymmdd_n_sizeKB]*
StringTokenizer tok = new StringTokenizer(line.trim(), " \n\t");
if (tok.countTokens() < 4)
return;
tok.nextToken();
String keyStr = tok.nextToken();
Hash keyHash = new Hash(Base64.decode(keyStr));
String whenStr = tok.nextToken();
long when = getIndexDate(whenStr);
String tag = tok.nextToken();
BlogSummary summary = new BlogSummary();
summary.blog = keyHash;
summary.tag = tag.trim();
summary.lastUpdated = when;
summary.entries = new ArrayList();
while (tok.hasMoreTokens()) {
String entry = tok.nextToken();
long id = Archive.getEntryIdFromIndexName(entry);
int kb = Archive.getSizeFromIndexName(entry);
summary.entries.add(new EntrySummary(new BlogURI(keyHash, id), kb));
}
_blogs.add(summary);
}
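// SimpleDateFormat is not thread safe, so both getIndexDate() variants below synchronize on it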
private SimpleDateFormat _dateFmt = new SimpleDateFormat("yyyyMMdd", Locale.UK);
private long getIndexDate(String yyyymmdd) {
synchronized (_dateFmt) {
try {
return _dateFmt.parse(yyyymmdd).getTime();
} catch (ParseException pe) {
return -1;
}
}
}
private String getIndexDate(long when) {
synchronized (_dateFmt) {
return _dateFmt.format(new Date(when));
}
}
protected class BlogSummary {
Hash blog;
String tag;
long lastUpdated;
/** list of EntrySummary objects */
List entries;
public BlogSummary() {
entries = new ArrayList();
}
}
protected class EntrySummary {
BlogURI entry;
long size;
public EntrySummary(BlogURI uri, long kb) {
size = kb;
entry = uri;
}
}
/** export the index into an archive.txt */
public String toString() {
StringBuffer rv = new StringBuffer(1024);
rv.append("SyndieVersion: ").append(_version).append('\n');
for (Iterator iter = _headers.keySet().iterator(); iter.hasNext(); ) {
String key = (String)iter.next();
String val = _headers.getProperty(key);
rv.append(key).append(": ").append(val).append('\n');
}
for (int i = 0; i < _blogs.size(); i++) {
rv.append("Blog: ");
Hash blog = getBlog(i);
String tag = getBlogTag(i);
rv.append(Base64.encode(blog.getData())).append(' ');
rv.append(getIndexDate(getBlogLastUpdated(i))).append(' ');
rv.append(tag).append('\t');
int entries = getBlogEntryCount(i);
for (int j = 0; j < entries; j++) {
BlogURI entry = getBlogEntry(i, j);
long kb = getBlogEntrySizeKB(i, j);
rv.append(Archive.getIndexName(entry.getEntryId(), (int)kb*1024)).append(' ');
}
rv.append('\n');
}
rv.append('\n');
rv.append("AllBlogs: ").append(_allBlogs).append('\n');
rv.append("NewBlogs: ").append(_newBlogs).append('\n');
rv.append("AllEntries: ").append(_allEntries).append('\n');
rv.append("NewEntries: ").append(_newEntries).append('\n');
rv.append("TotalSize: ").append(_totalSize).append('\n');
rv.append("NewSize: ").append(_newSize).append('\n');
rv.append("NewestBlogs: ");
for (int i = 0; i < _newestBlogs.size(); i++)
rv.append(((Hash)(_newestBlogs.get(i))).toBase64()).append(' ');
rv.append('\n');
rv.append("NewestEntries: ");
for (int i = 0; i < _newestEntries.size(); i++)
rv.append(((BlogURI)_newestEntries.get(i)).toString()).append(' ');
rv.append('\n');
return rv.toString();
}
/** Usage: ArchiveIndex archive.txt */
public static void main(String args[]) {
try {
ArchiveIndex i = new ArchiveIndex();
i.load(new File(args[0]));
System.out.println(i.toString());
} catch (IOException ioe) { ioe.printStackTrace(); }
}
}
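
For reference, here is a hypothetical archive.txt of the kind load() above accepts. The blog hash is simply the example value reused from the BlogURI tests further down, the entry tokens follow the yyyymmdd_n_sizeKB pattern described in the loadBlog() comment, and every number is a made-up placeholder (the writer puts a tab between the tag and the entry tokens, but the parser accepts spaces as well):

SyndieVersion: 1.0
Blog: Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8= 20050901 tech 20050901_0_12 20050830_1_4
Blog: Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8= 20050830 misc 20050830_0_2

AllBlogs: 1
NewBlogs: 0
AllEntries: 3
NewEntries: 1
TotalSize: 18
NewSize: 4
NewestBlogs: Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=
NewestEntries: blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/1125600000000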

View File

@ -0,0 +1,122 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.DataHelper;
/**
*
*/
public class Attachment {
private byte _data[];
private byte _rawMetadata[];
private List _keys;
private List _values;
public Attachment(byte data[], byte metadata[]) {
_data = data;
_rawMetadata = metadata;
_keys = new ArrayList();
_values = new ArrayList();
parseMeta();
}
public static final String NAME = "Name";
public static final String DESCRIPTION = "Description";
public static final String MIMETYPE = "MimeType";
public Attachment(byte data[], String name, String description, String mimeType) {
_data = data;
_keys = new ArrayList();
_values = new ArrayList();
_keys.add(NAME);
_values.add(name);
if ( (description != null) && (description.trim().length() > 0) ) {
_keys.add(DESCRIPTION);
_values.add(description);
}
if ( (mimeType != null) && (mimeType.trim().length() > 0) ) {
_keys.add(MIMETYPE);
_values.add(mimeType);
}
createMeta();
}
public byte[] getData() { return _data; }
public int getDataLength() { return _data.length; }
public byte[] getRawMetadata() { return _rawMetadata; }
public InputStream getDataStream() throws IOException { return new ByteArrayInputStream(_data); }
public String getMeta(String key) {
for (int i = 0; i < _keys.size(); i++) {
if (key.equals(_keys.get(i)))
return (String)_values.get(i);
}
return null;
}
public String getName() { return getMeta(NAME); }
public String getDescription() { return getMeta(DESCRIPTION); }
public String getMimeType() { return getMeta(MIMETYPE); }
public void setMeta(String key, String val) {
for (int i = 0; i < _keys.size(); i++) {
if (key.equals(_keys.get(i))) {
_values.set(i, val);
return;
}
}
_keys.add(key);
_values.add(val);
}
public Map getMeta() {
Map rv = new HashMap(_keys.size());
for (int i = 0; i < _keys.size(); i++) {
String k = (String)_keys.get(i);
String v = (String)_values.get(i);
rv.put(k,v);
}
return rv;
}
private void createMeta() {
StringBuffer meta = new StringBuffer(64);
for (int i = 0; i < _keys.size(); i++) {
meta.append(_keys.get(i)).append(':').append(_values.get(i)).append('\n');
}
_rawMetadata = DataHelper.getUTF8(meta);
}
private void parseMeta() {
if (_rawMetadata == null) return;
String key = null;
String val = null;
int keyBegin = 0;
int valBegin = -1;
for (int i = 0; i < _rawMetadata.length; i++) {
if (_rawMetadata[i] == ':') {
key = DataHelper.getUTF8(_rawMetadata, keyBegin, i - keyBegin);
valBegin = i + 1;
} else if (_rawMetadata[i] == '\n') {
val = DataHelper.getUTF8(_rawMetadata, valBegin, i - valBegin);
_keys.add(key);
_values.add(val);
keyBegin = i + 1;
key = null;
val = null;
}
}
}
public String toString() {
int len = 0;
if (_data != null)
len = _data.length;
return getName()
+ (getDescription() != null ? ": " + getDescription() : "")
+ (getMimeType() != null ? ", type: " + getMimeType() : "")
+ ", size: " + len;
}
}
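
A minimal usage sketch for the class above (the AttachmentDemo class name is hypothetical; it only relies on the public constructors and accessors shown):

import net.i2p.syndie.data.Attachment;
public class AttachmentDemo {
public static void main(String args[]) {
byte data[] = new byte[] { 1, 2, 3 };
// createMeta() turns the three fields into "Name:...\nDescription:...\nMimeType:...\n" in UTF-8
Attachment a = new Attachment(data, "photo.png", "a test image", "image/png");
// parseMeta() reads those same key:value lines back out of the raw metadata bytes
Attachment b = new Attachment(a.getData(), a.getRawMetadata());
System.out.println(b); // prints: photo.png: a test image, type: image/png, size: 3
}
}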

View File

@ -0,0 +1,277 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Blog metadata. Formatted as: <pre>
* [key:val\n]*
* </pre>
*
* Required keys:
* Owner: base64 of their signing public key
* Signature: base64 of the DSA signature of the rest of the ordered metadata
* Edition: base10 unique identifier for this metadata (higher clobbers lower)
*
* Optional keys:
* Posters: comma delimited list of base64 signing public keys that
* can post to the blog
* Name: name of the blog
* Description: brief description of the blog
*
*/
public class BlogInfo {
private SigningPublicKey _key;
private SigningPublicKey _posters[];
private String _optionNames[];
private String _optionValues[];
private Signature _signature;
public BlogInfo() {}
public BlogInfo(SigningPublicKey key, SigningPublicKey posters[], Properties opts) {
_optionNames = new String[0];
_optionValues = new String[0];
setKey(key);
setPosters(posters);
for (Iterator iter = opts.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = opts.getProperty(k);
setProperty(k.trim(), v.trim());
}
}
public SigningPublicKey getKey() { return _key; }
public void setKey(SigningPublicKey key) {
_key = key;
setProperty(OWNER_KEY, Base64.encode(key.getData()));
}
public static final String OWNER_KEY = "Owner";
public static final String POSTERS = "Posters";
public static final String SIGNATURE = "Signature";
public static final String NAME = "Name";
public static final String DESCRIPTION = "Description";
public static final String EDITION = "Edition";
public void load(InputStream in) throws IOException {
BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
List names = new ArrayList();
List vals = new ArrayList();
String line = null;
while ( (line = reader.readLine()) != null) {
System.err.println("Read info line [" + line + "]");
line = line.trim();
int len = line.length();
int split = line.indexOf(':');
if ( (len <= 0) || (split <= 0) ) {
continue;
} else if (split >= len - 1) {
names.add(line.substring(0, split).trim());
vals.add("");
continue;
}
String key = line.substring(0, split).trim();
String val = line.substring(split+1).trim();
names.add(key);
vals.add(val);
}
_optionNames = new String[names.size()];
_optionValues = new String[names.size()];
for (int i = 0; i < _optionNames.length; i++) {
_optionNames[i] = (String)names.get(i);
_optionValues[i] = (String)vals.get(i);
System.out.println("Loaded info: [" + _optionNames[i] + "] = [" + _optionValues[i] + "]");
}
String keyStr = getProperty(OWNER_KEY);
if (keyStr == null) throw new IOException("Owner not found");
_key = new SigningPublicKey(Base64.decode(keyStr));
String postersStr = getProperty(POSTERS);
if (postersStr != null) {
StringTokenizer tok = new StringTokenizer(postersStr, ", \t");
_posters = new SigningPublicKey[tok.countTokens()];
for (int i = 0; tok.hasMoreTokens(); i++)
_posters[i] = new SigningPublicKey(Base64.decode(tok.nextToken()));
}
String sigStr = getProperty(SIGNATURE);
if (sigStr == null) throw new IOException("Signature not found");
_signature = new Signature(Base64.decode(sigStr));
}
public void write(OutputStream out) throws IOException { write(out, true); }
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
StringBuffer buf = new StringBuffer(512);
for (int i = 0; i < _optionNames.length; i++) {
if ( (includeRealSignature) || (!SIGNATURE.equals(_optionNames[i])) )
buf.append(_optionNames[i]).append(':').append(_optionValues[i]).append('\n');
}
String s = buf.toString();
out.write(s.getBytes("UTF-8"));
}
public String getProperty(String name) {
for (int i = 0; i < _optionNames.length; i++) {
if (_optionNames[i].equals(name)) {
String val = _optionValues[i];
System.out.println("getProperty[" + name + "] = [" + val + "] [sz=" + val.length() +"]");
for (int j = 0; j < val.length(); j++) {
char c = (char)val.charAt(j);
if (c != (c & 0x7F))
System.out.println("char " + j + ": " + (int)c);
}
return val;
}
}
return null;
}
private void setProperty(String name, String val) {
for (int i = 0; i < _optionNames.length; i++) {
if (_optionNames[i].equals(name)) {
_optionValues[i] = val;
return;
}
}
String names[] = new String[_optionNames.length + 1];
String values[] = new String[_optionValues.length + 1];
for (int i = 0; i < _optionNames.length; i++) {
names[i] = _optionNames[i];
values[i] = _optionValues[i];
}
names[names.length-1] = name;
values[values.length-1] = val;
_optionNames = names;
_optionValues = values;
}
public int getEdition() {
String e = getProperty(EDITION);
if (e != null) {
try {
return Integer.parseInt(e);
} catch (NumberFormatException nfe) {
return 0;
}
}
return 0;
}
public String[] getProperties() { return _optionNames; }
public SigningPublicKey[] getPosters() { return _posters; }
public void setPosters(SigningPublicKey posters[]) {
_posters = posters;
StringBuffer buf = new StringBuffer();
for (int i = 0; posters != null && i < posters.length; i++) {
buf.append(Base64.encode(posters[i].getData()));
if (i + 1 < posters.length)
buf.append(',');
}
setProperty(POSTERS, buf.toString());
}
public boolean verify(I2PAppContext ctx) {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream(512);
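// sign() covers the metadata serialized without the Signature line, so verify against that same form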
write(out, false);
out.close();
byte data[] = out.toByteArray();
return ctx.dsa().verifySignature(_signature, data, _key);
} catch (IOException ioe) {
return false;
}
}
public void sign(I2PAppContext ctx, SigningPrivateKey priv) {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream(512);
write(out, false);
byte data[] = out.toByteArray();
Signature sig = ctx.dsa().sign(data, priv);
if (sig == null)
throw new IOException("wtf, why is the signature null? data.len = " + data.length + " priv: " + priv);
setProperty(SIGNATURE, Base64.encode(sig.getData()));
_signature = sig;
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public String toString() {
StringBuffer buf = new StringBuffer();
buf.append("Blog ").append(getKey().calculateHash().toBase64());
for (int i = 0; i < _optionNames.length; i++) {
if ( (!SIGNATURE.equals(_optionNames[i])) &&
(!OWNER_KEY.equals(_optionNames[i])) )
buf.append(' ').append(_optionNames[i]).append(": ").append(_optionValues[i]);
}
if ( (_posters != null) && (_posters.length > 0) ) {
buf.append(" additional posts by");
for (int i = 0; i < _posters.length; i++) {
buf.append(' ').append(_posters[i].calculateHash().toBase64());
if (i + 1 < _posters.length)
buf.append(',');
}
}
return buf.toString();
}
private static final String TEST_STRING = "\u20AC\u00DF\u6771\u10400\u00F6";
public static void main(String args[]) {
I2PAppContext ctx = I2PAppContext.getGlobalContext();
if (true) {
try {
Object keys[] = ctx.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
Properties opts = new Properties();
opts.setProperty("Name", TEST_STRING);
opts.setProperty("Description", TEST_STRING);
opts.setProperty("Edition", "0");
opts.setProperty("ContactURL", TEST_STRING);
String nameOrig = opts.getProperty("Name");
BlogInfo info = new BlogInfo(pub, null, opts);
info.sign(ctx, priv);
boolean ok = info.verify(ctx);
System.err.println("sign&verify: " + ok);
FileOutputStream o = new FileOutputStream("bloginfo-test.dat");
info.write(o, true);
o.close();
FileInputStream i = new FileInputStream("bloginfo-test.dat");
byte buf[] = new byte[4096];
int sz = DataHelper.read(i, buf);
BlogInfo read = new BlogInfo();
read.load(new ByteArrayInputStream(buf, 0, sz));
ok = read.verify(ctx);
System.err.println("write to disk, verify read: " + ok);
System.err.println("Data: " + Base64.encode(buf, 0, sz));
System.err.println("Str : " + new String(buf, 0, sz));
System.err.println("Name ok? " + read.getProperty("Name").equals(TEST_STRING));
System.err.println("Desc ok? " + read.getProperty("Description").equals(TEST_STRING));
System.err.println("Name ok? " + read.getProperty("ContactURL").equals(TEST_STRING));
} catch (Exception e) { e.printStackTrace(); }
} else {
try {
FileInputStream in = new FileInputStream(args[0]);
BlogInfo info = new BlogInfo();
info.load(in);
boolean ok = info.verify(I2PAppContext.getGlobalContext());
System.out.println("OK? " + ok + " :" + info);
} catch (Exception e) { e.printStackTrace(); }
}
}
}
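
For reference, a hypothetical serialized BlogInfo as write() above emits it; every value is a placeholder, and the Signature value is what sign() computes over this same serialization with the signature left out:

Owner:<base64 of the owner's DSA signing public key>
Posters:<base64 signing key>,<base64 signing key>
Name:My Syndie Blog
Description:notes and tests
Edition:3
Signature:<base64 DSA signature>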

View File

@ -0,0 +1,96 @@
package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.*;
/**
*
*/
public class BlogURI {
private Hash _blogHash;
private long _entryId;
public BlogURI() {
this(null, -1);
}
public BlogURI(Hash blogHash, long entryId) {
_blogHash = blogHash;
_entryId = entryId;
}
public BlogURI(String uri) {
if (uri.startsWith("blog://")) {
int off = "blog://".length();
_blogHash = new Hash(Base64.decode(uri.substring(off, off+44))); // 44 chars == base64(32 bytes)
int entryStart = uri.indexOf('/', off+1);
if (entryStart < 0) {
_entryId = -1;
} else {
try {
_entryId = Long.parseLong(uri.substring(entryStart+1).trim());
} catch (NumberFormatException nfe) {
_entryId = -1;
}
}
} else if (uri.startsWith("entry://")) {
int off = "entry://".length();
_blogHash = new Hash(Base64.decode(uri.substring(off, off+44))); // 44 chars == base64(32 bytes)
int entryStart = uri.indexOf('/', off+1);
if (entryStart < 0) {
_entryId = -1;
} else {
try {
_entryId = Long.parseLong(uri.substring(entryStart+1).trim());
} catch (NumberFormatException nfe) {
_entryId = -1;
}
}
} else {
_blogHash = null;
_entryId = -1;
}
}
public Hash getKeyHash() { return _blogHash; }
public long getEntryId() { return _entryId; }
public void setKeyHash(Hash hash) { _blogHash = hash; }
public void setEntryId(long id) { _entryId = id; }
public String toString() {
if ( (_blogHash == null) || (_blogHash.getData() == null) )
return "";
StringBuffer rv = new StringBuffer(64);
rv.append("blog://").append(Base64.encode(_blogHash.getData()));
rv.append('/');
if (_entryId >= 0)
rv.append(_entryId);
return rv.toString();
}
public boolean equals(Object obj) {
if (obj == null) return false;
if (obj.getClass() != getClass()) return false;
return DataHelper.eq(_entryId, ((BlogURI)obj)._entryId) &&
DataHelper.eq(_blogHash, ((BlogURI)obj)._blogHash);
}
public int hashCode() {
int rv = (int)_entryId;
if (_blogHash != null)
rv += _blogHash.hashCode();
return rv;
}
public static void main(String args[]) {
test("http://asdf/");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/123456789");
test("entry://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/");
test("entry://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/123456789");
}
private static void test(String uri) {
BlogURI u = new BlogURI(uri);
if (!u.toString().equals(uri))
System.err.println("Not a match: [" + uri + "] != [" + u.toString() + "]");
}
}

View File

@ -0,0 +1,86 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Create a new blog metadata & set of entries using some crazy UTF8 encoded chars,
* then make sure they're always valid. These blogs & entries can then be fed into
* jetty/syndie/etc to see how and where they are getting b0rked.
*/
public class EncodingTestGenerator {
public EncodingTestGenerator() {}
public static final String TEST_STRING = "\u20AC\u00DF\u6771\u10400\u00F6";
public static void main(String args[]) {
I2PAppContext ctx = I2PAppContext.getGlobalContext();
try {
Object keys[] = ctx.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
Properties opts = new Properties();
opts.setProperty("Name", TEST_STRING);
opts.setProperty("Description", TEST_STRING);
opts.setProperty("Edition", "0");
opts.setProperty("ContactURL", TEST_STRING);
String nameOrig = opts.getProperty("Name");
BlogInfo info = new BlogInfo(pub, null, opts);
info.sign(ctx, priv);
boolean ok = info.verify(ctx);
System.err.println("sign&verify: " + ok);
FileOutputStream o = new FileOutputStream("encodedMeta.dat");
info.write(o, true);
o.close();
FileInputStream i = new FileInputStream("encodedMeta.dat");
byte buf[] = new byte[4096];
int sz = DataHelper.read(i, buf);
BlogInfo read = new BlogInfo();
read.load(new ByteArrayInputStream(buf, 0, sz));
ok = read.verify(ctx);
System.err.println("write to disk, verify read: " + ok);
System.err.println("Name ok? " + read.getProperty("Name").equals(TEST_STRING));
System.err.println("Desc ok? " + read.getProperty("Description").equals(TEST_STRING));
System.err.println("Name ok? " + read.getProperty("ContactURL").equals(TEST_STRING));
// ok now lets create some entries
BlogURI uri = new BlogURI(read.getKey().calculateHash(), 0);
String tags[] = new String[4];
for (int j = 0; j < tags.length; j++)
tags[j] = TEST_STRING + "_" + j;
StringBuffer smlOrig = new StringBuffer(512);
smlOrig.append("Subject: ").append(TEST_STRING).append("\n\n");
smlOrig.append("Hi with ").append(TEST_STRING);
EntryContainer container = new EntryContainer(uri, tags, DataHelper.getUTF8(smlOrig));
container.seal(ctx, priv, null);
ok = container.verifySignature(ctx, read);
System.err.println("Sealed and verified entry: " + ok);
FileOutputStream fos = new FileOutputStream("encodedEntry.dat");
container.write(fos, true);
fos.close();
System.out.println("Written to " + new File("encodedEntry.dat").getAbsolutePath());
FileInputStream fis = new FileInputStream("encodedEntry.dat");
EntryContainer read2 = new EntryContainer();
read2.load(fis);
ok = read2.verifySignature(ctx, read);
System.out.println("Read ok? " + ok);
read2.parseRawData(ctx);
String tagsRead[] = read2.getTags();
for (int j = 0; j < tagsRead.length; j++) {
if (!tags[j].equals(tagsRead[j]))
System.err.println("Tag error [" + j + "]: read = [" + tagsRead[j] + "] want [" + tags[j] + "]");
else
System.err.println("Tag ok [" + j + "]");
}
String readText = read2.getEntry().getText();
ok = readText.equals(smlOrig.toString());
System.err.println("SML text ok? " + ok);
} catch (Exception e) { e.printStackTrace(); }
}
}

View File

@ -0,0 +1,14 @@
package net.i2p.syndie.data;
/**
*
*/
public class Entry {
private String _text;
public Entry(String raw) {
_text = raw;
}
public String getText() { return _text; }
}

View File

@ -0,0 +1,426 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Securely wrap up an entry and any attachments. Container format:<pre>
* $format\n
* [$key: $val\n]*
* \n
* Signature: $base64(DSA signature)\n
* Size: sizeof(data)\n
* [data bytes]
* </pre>
*
* Required keys:
* BlogKey: base64 of the SHA256 of the blog's public key
* BlogTags: tab delimited list of tags under which this entry should be organized
* BlogEntryId: base10 unique identifier of this entry within the key/path. Typically starts
* as the current day (in unix time, milliseconds) plus further milliseconds for
* each entry within the day.
*
* The data bytes contain a zip file, either in the clear or encrypted. If the format
* is encrypted, the BlogPath key will (likely) be encrypted as well.
*
*/
public class EntryContainer {
private List _rawKeys;
private List _rawValues;
private Signature _signature;
private byte _rawData[];
private BlogURI _entryURI;
private int _format;
private Entry _entryData;
private Attachment _attachments[];
private int _completeSize;
public static final int FORMAT_ZIP_UNENCRYPTED = 0;
public static final int FORMAT_ZIP_ENCRYPTED = 1;
public static final String FORMAT_ZIP_UNENCRYPTED_STR = "syndie.entry.zip-unencrypted";
public static final String FORMAT_ZIP_ENCRYPTED_STR = "syndie.entry.zip-encrypted";
public static final String HEADER_BLOGKEY = "BlogKey";
public static final String HEADER_BLOGTAGS = "BlogTags";
public static final String HEADER_ENTRYID = "BlogEntryId";
public EntryContainer() {
_rawKeys = new ArrayList();
_rawValues = new ArrayList();
_completeSize = -1;
}
public EntryContainer(BlogURI uri, String tags[], byte smlData[]) {
this();
_entryURI = uri;
_entryData = new Entry(DataHelper.getUTF8(smlData));
setHeader(HEADER_BLOGKEY, Base64.encode(uri.getKeyHash().getData()));
StringBuffer buf = new StringBuffer();
for (int i = 0; tags != null && i < tags.length; i++)
buf.append(tags[i]).append('\t');
setHeader(HEADER_BLOGTAGS, buf.toString());
if (uri.getEntryId() < 0)
uri.setEntryId(System.currentTimeMillis());
setHeader(HEADER_ENTRYID, Long.toString(uri.getEntryId()));
}
public int getFormat() { return _format; }
private String readLine(InputStream in) throws IOException {
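// read a single line byte by byte - a buffered reader here could consume bytes beyond the
// newline that load() still needs to read from the same stream (the raw payload)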
ByteArrayOutputStream baos = new ByteArrayOutputStream(512);
int i = 0;
while (true) {
int c = in.read();
if ( (c == (int)'\n') || (c == (int)'\r') ) {
break;
} else if (c == -1) {
if (i == 0)
return null;
else
break;
} else {
baos.write(c);
}
i++;
}
return DataHelper.getUTF8(baos.toByteArray());
//BufferedReader r = new BufferedReader(new InputStreamReader(in, "UTF-8"), 1);
//String line = r.readLine();
//return line;
}
public void load(InputStream source) throws IOException {
String line = readLine(source);
if (line == null) throw new IOException("No format line in the entry");
//System.err.println("read container format line [" + line + "]");
String fmt = line.trim();
if (FORMAT_ZIP_UNENCRYPTED_STR.equals(fmt)) {
_format = FORMAT_ZIP_UNENCRYPTED;
} else if (FORMAT_ZIP_ENCRYPTED_STR.equals(fmt)) {
_format = FORMAT_ZIP_ENCRYPTED;
} else {
throw new IOException("Unsupported entry format: " + fmt);
}
while ( (line = readLine(source)) != null) {
//System.err.println("read container header line [" + line + "]");
line = line.trim();
int len = line.length();
if (len <= 0)
break;
int split = line.indexOf(':');
if ( (split <= 0) || (split >= len - 2) )
throw new IOException("Invalid format of the syndie entry: line=" + line);
String key = line.substring(0, split);
String val = line.substring(split+1);
_rawKeys.add(key);
_rawValues.add(val);
}
parseHeaders();
String sigStr = readLine(source);
//System.err.println("read container signature line [" + line + "]");
if ( (sigStr == null) || (sigStr.indexOf("Signature:") == -1) )
throw new IOException("No signature line");
sigStr = sigStr.substring("Signature:".length()+1).trim();
_signature = new Signature(Base64.decode(sigStr));
//System.out.println("Sig: " + _signature.toBase64());
line = readLine(source);
//System.err.println("read container size line [" + line + "]");
if (line == null)
throw new IOException("No size line");
line = line.trim();
int dataSize = -1;
try {
int index = line.indexOf("Size:");
if (index == 0)
dataSize = Integer.parseInt(line.substring("Size:".length()+1).trim());
else
throw new IOException("Invalid size line");
} catch (NumberFormatException nfe) {
throw new IOException("Invalid entry size: " + line);
}
byte data[] = new byte[dataSize];
int read = DataHelper.read(source, data);
if (read != dataSize)
throw new IOException("Incomplete entry: read " + read + " expected " + dataSize);
_rawData = data;
}
public void seal(I2PAppContext ctx, SigningPrivateKey signingKey, SessionKey entryKey) throws IOException {
System.out.println("Sealing " + _entryURI);
if (entryKey == null)
_format = FORMAT_ZIP_UNENCRYPTED;
else
_format = FORMAT_ZIP_ENCRYPTED;
setHeader(HEADER_BLOGKEY, Base64.encode(_entryURI.getKeyHash().getData()));
if (_entryURI.getEntryId() < 0)
_entryURI.setEntryId(ctx.clock().now());
setHeader(HEADER_ENTRYID, Long.toString(_entryURI.getEntryId()));
_rawData = createRawData(ctx, entryKey);
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
write(baos, false);
byte data[] = baos.toByteArray();
_signature = ctx.dsa().sign(data, signingKey);
}
private byte[] createRawData(I2PAppContext ctx, SessionKey entryKey) throws IOException {
byte raw[] = createRawData();
if (entryKey != null) {
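// encrypt the zipped data with a random IV and prepend that IV to the returned bytes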
byte iv[] = new byte[16];
ctx.random().nextBytes(iv);
byte rv[] = new byte[raw.length + iv.length];
ctx.aes().encrypt(raw, 0, rv, iv.length, entryKey, iv, raw.length);
System.arraycopy(iv, 0, rv, 0, iv.length);
return rv;
} else {
return raw;
}
}
private byte[] createRawData() throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ZipOutputStream out = new ZipOutputStream(baos);
ZipEntry ze = new ZipEntry(ZIP_ENTRY);
byte data[] = DataHelper.getUTF8(_entryData.getText());
ze.setTime(0);
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
for (int i = 0; (_attachments != null) && (i < _attachments.length); i++) {
ze = new ZipEntry(ZIP_ATTACHMENT_PREFIX + i + ZIP_ATTACHMENT_SUFFIX);
data = _attachments[i].getData();
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
ze = new ZipEntry(ZIP_ATTACHMENT_META_PREFIX + i + ZIP_ATTACHMENT_META_SUFFIX);
data = _attachments[i].getRawMetadata();
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
}
out.finish();
out.close();
return baos.toByteArray();
}
public static final String ZIP_ENTRY = "entry.sml";
public static final String ZIP_ATTACHMENT_PREFIX = "attachmentdata";
public static final String ZIP_ATTACHMENT_SUFFIX = ".szd";
public static final String ZIP_ATTACHMENT_META_PREFIX = "attachmentmeta";
public static final String ZIP_ATTACHMENT_META_SUFFIX = ".szm";
public void parseRawData(I2PAppContext ctx) throws IOException { parseRawData(ctx, null); }
public void parseRawData(I2PAppContext ctx, SessionKey zipKey) throws IOException {
int dataOffset = 0;
if (zipKey != null) {
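// the first 16 bytes are the IV prepended by createRawData; decrypt the rest in place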
byte iv[] = new byte[16];
System.arraycopy(_rawData, 0, iv, 0, iv.length);
ctx.aes().decrypt(_rawData, iv.length, _rawData, iv.length, zipKey, iv, _rawData.length - iv.length);
dataOffset = iv.length;
}
ByteArrayInputStream in = new ByteArrayInputStream(_rawData, dataOffset, _rawData.length - dataOffset);
ZipInputStream zi = new ZipInputStream(in);
Map attachments = new HashMap();
Map attachmentMeta = new HashMap();
while (true) {
ZipEntry entry = zi.getNextEntry();
if (entry == null)
break;
ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
int read = -1;
while ( (read = zi.read(buf)) != -1)
out.write(buf, 0, read);
byte entryData[] = out.toByteArray();
String name = entry.getName();
if (ZIP_ENTRY.equals(name)) {
_entryData = new Entry(DataHelper.getUTF8(entryData));
} else if (name.startsWith(ZIP_ATTACHMENT_PREFIX)) {
attachments.put(name, (Object)entryData);
} else if (name.startsWith(ZIP_ATTACHMENT_META_PREFIX)) {
attachmentMeta.put(name, (Object)entryData);
}
//System.out.println("Read entry [" + name + "] with size=" + entryData.length);
}
_attachments = new Attachment[attachments.size()];
for (int i = 0; i < attachments.size(); i++) {
byte data[] = (byte[])attachments.get(ZIP_ATTACHMENT_PREFIX + i + ZIP_ATTACHMENT_SUFFIX);
byte metadata[] = (byte[])attachmentMeta.get(ZIP_ATTACHMENT_META_PREFIX + i + ZIP_ATTACHMENT_META_SUFFIX);
if ( (data != null) && (metadata != null) )
_attachments[i] = new Attachment(data, metadata);
else
System.out.println("Unable to get " + i + ": " + data + "/" + metadata);
}
//System.out.println("Attachments: " + _attachments.length + "/" + attachments.size() + ": " + attachments);
}
public BlogURI getURI() { return _entryURI; }
private static final String NO_TAGS[] = new String[0];
public String[] getTags() {
String tags = getHeader(HEADER_BLOGTAGS);
if ( (tags == null) || (tags.trim().length() <= 0) ) {
return NO_TAGS;
} else {
StringTokenizer tok = new StringTokenizer(tags, "\t");
String rv[] = new String[tok.countTokens()];
for (int i = 0; i < rv.length; i++)
rv[i] = tok.nextToken().trim();
return rv;
}
}
public Signature getSignature() { return _signature; }
public Entry getEntry() { return _entryData; }
public Attachment[] getAttachments() { return _attachments; }
public void setCompleteSize(int bytes) { _completeSize = bytes; }
public int getCompleteSize() { return _completeSize; }
public String getHeader(String key) {
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
if (k.equals(key))
return (String)_rawValues.get(i);
}
return null;
}
public Map getHeaders() {
Map rv = new HashMap(_rawKeys.size());
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
String v = (String)_rawValues.get(i);
rv.put(k,v);
}
return rv;
}
public void setHeader(String name, String val) {
int index = _rawKeys.indexOf(name);
if (index < 0) {
_rawKeys.add(name);
_rawValues.add(val);
} else {
_rawValues.set(index, val);
}
}
public void addAttachment(byte data[], String name, String description, String mimeType) {
Attachment a = new Attachment(data, name, description, mimeType);
int old = (_attachments == null ? 0 : _attachments.length);
Attachment nv[] = new Attachment[old+1];
if (old > 0)
for (int i = 0; i < old; i++)
nv[i] = _attachments[i];
nv[old] = a;
_attachments = nv;
}
private void parseHeaders() throws IOException {
String keyHash = getHeader(HEADER_BLOGKEY);
String idVal = getHeader(HEADER_ENTRYID);
if (keyHash == null) {
System.err.println("Headers: " + _rawKeys);
System.err.println("Values : " + _rawValues);
throw new IOException("Missing " + HEADER_BLOGKEY + " header");
}
long entryId = -1;
if ( (idVal != null) && (idVal.length() > 0) ) {
try {
entryId = Long.parseLong(idVal.trim());
} catch (NumberFormatException nfe) {
System.err.println("Headers: " + _rawKeys);
System.err.println("Values : " + _rawValues);
throw new IOException("Invalid format of entryId (" + idVal + ")");
}
}
_entryURI = new BlogURI(new Hash(Base64.decode(keyHash)), entryId);
}
public boolean verifySignature(I2PAppContext ctx, BlogInfo info) {
if (_signature == null) throw new NullPointerException("sig is null");
if (info == null) throw new NullPointerException("info is null");
if (info.getKey() == null) throw new NullPointerException("info key is null");
if (info.getKey().getData() == null) throw new NullPointerException("info key data is null");
//System.out.println("Verifying " + _entryURI + " for " + info);
ByteArrayOutputStream out = new ByteArrayOutputStream(_rawData.length + 512);
try {
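// seal() signs the container as serialized without the signature value, so
// reproduce exactly that serialization here before checking the signature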
write(out, false);
byte dat[] = out.toByteArray();
//System.out.println("Raw data to verify: " + ctx.sha().calculateHash(dat).toBase64() + " sig: " + _signature.toBase64());
ByteArrayInputStream in = new ByteArrayInputStream(dat);
boolean ok = ctx.dsa().verifySignature(_signature, in, info.getKey());
if (!ok && info.getPosters() != null) {
for (int i = 0; !ok && i < info.getPosters().length; i++) {
in.reset();
ok = ctx.dsa().verifySignature(_signature, in, info.getPosters()[i]);
}
}
//System.out.println("Verified ok? " + ok + " key: " + info.getKey().calculateHash().toBase64());
//new Exception("verifying").printStackTrace();
return ok;
} catch (IOException ioe) {
//System.out.println("Verification failed! " + ioe.getMessage());
return false;
}
}
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
StringBuffer buf = new StringBuffer(512);
switch (_format) {
case FORMAT_ZIP_ENCRYPTED:
buf.append(FORMAT_ZIP_ENCRYPTED_STR).append('\n');
break;
case FORMAT_ZIP_UNENCRYPTED:
buf.append(FORMAT_ZIP_UNENCRYPTED_STR).append('\n');
break;
default:
throw new IOException("Invalid format " + _format);
}
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
buf.append(k.trim());
buf.append(": ");
buf.append(((String)_rawValues.get(i)).trim());
buf.append('\n');
}
buf.append('\n');
buf.append("Signature: ");
if (includeRealSignature)
buf.append(Base64.encode(_signature.getData()));
buf.append("\n");
buf.append("Size: ").append(_rawData.length).append('\n');
String str = buf.toString();
//System.out.println("Writing raw: \n[" + str + "] / " + I2PAppContext.getGlobalContext().sha().calculateHash(str.getBytes()) + ", raw data: " + I2PAppContext.getGlobalContext().sha().calculateHash(_rawData).toBase64() + "\n");
out.write(DataHelper.getUTF8(str));
out.write(_rawData);
}
public String toString() { return _entryURI.toString(); }
}
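
For reference, a hypothetical unencrypted container as write() above lays it out; all values are placeholders. The zip payload holds entry.sml plus attachmentdataN.szd / attachmentmetaN.szm pairs, tags are tab-separated, and in the encrypted format a random 16-byte IV is prepended to the AES-encrypted zip:

syndie.entry.zip-unencrypted
BlogKey: <base64 of the blog key hash>
BlogTags: tag1 tag2
BlogEntryId: 1125600000000

Signature: <base64 DSA signature computed by seal() over this container with an empty Signature value>
Size: <number of payload bytes that follow>
<zip bytes>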

View File

@ -0,0 +1,102 @@
package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
/**
* writable archive index (most are readonly)
*/
public class LocalArchiveIndex extends ArchiveIndex {
public LocalArchiveIndex() {
super(false);
}
public void setGeneratedOn(long when) { _generatedOn = when; }
public void setVersion(String v) { _version = v; }
public void setHeaders(Properties headers) { _headers = headers; }
public void setHeader(String key, String val) { _headers.setProperty(key, val); }
public void setAllBlogs(int count) { _allBlogs = count; }
public void setNewBlogs(int count) { _newBlogs = count; }
public void setAllEntries(int count) { _allEntries = count; }
public void setNewEntries(int count) { _newEntries = count; }
public void setTotalSize(long bytes) { _totalSize = bytes; }
public void setNewSize(long bytes) { _newSize = bytes; }
public void addBlog(Hash key, String tag, long lastUpdated) {
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary s = (BlogSummary)_blogs.get(i);
if ( (s.blog.equals(key)) && (s.tag.equals(tag)) ) {
s.lastUpdated = Math.max(s.lastUpdated, lastUpdated);
return;
}
}
BlogSummary summary = new ArchiveIndex.BlogSummary();
summary.blog = key;
summary.tag = tag;
summary.lastUpdated = lastUpdated;
_blogs.add(summary);
}
public void addBlogEntry(Hash key, String tag, String entry) {
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(key) && (summary.tag.equals(tag)) ) {
long entryId = Archive.getEntryIdFromIndexName(entry);
int kb = Archive.getSizeFromIndexName(entry);
System.out.println("Adding entry " + entryId + ", size=" + kb + "KB [" + entry + "]");
EntrySummary entrySummary = new EntrySummary(new BlogURI(key, entryId), kb);
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary cur = (EntrySummary)summary.entries.get(j);
if (cur.entry.equals(entrySummary.entry))
return;
}
summary.entries.add(entrySummary);
return;
}
}
}
public void addNewestBlog(Hash key) {
if (!_newestBlogs.contains(key))
_newestBlogs.add(key);
}
public void addNewestEntry(BlogURI entry) {
if (!_newestEntries.contains(entry))
_newestEntries.add(entry);
}
public void addReply(BlogURI parent, BlogURI reply) {
Set replies = (Set)_replies.get(parent);
if (replies == null) {
replies = Collections.synchronizedSet(new TreeSet(BlogURIComparator.HIGHEST_ID_FIRST));
_replies.put(parent, replies);
}
replies.add(reply);
//System.err.println("Adding reply to " + parent + " from child " + reply + " (# replies: " + replies.size() + ")");
}
private static class BlogURIComparator implements Comparator {
public static final BlogURIComparator HIGHEST_ID_FIRST = new BlogURIComparator(true);
public static final BlogURIComparator HIGHEST_ID_LAST = new BlogURIComparator(false);
private boolean _highestFirst;
public BlogURIComparator(boolean highestFirst) {
_highestFirst = highestFirst;
}
public int compare(Object lhs, Object rhs) {
if ( (lhs == null) || !(lhs instanceof BlogURI) ) return 1;
if ( (rhs == null) || !(rhs instanceof BlogURI) ) return -1;
BlogURI l = (BlogURI)lhs;
BlogURI r = (BlogURI)rhs;
if (l.getEntryId() > r.getEntryId())
return (_highestFirst ? 1 : -1);
else if (l.getEntryId() < r.getEntryId())
return (_highestFirst ? -1 : 1);
else
return DataHelper.compareTo(l.getKeyHash().getData(), r.getKeyHash().getData());
}
}
}

View File

@ -0,0 +1,32 @@
package net.i2p.syndie.data;
/**
*
*/
public class SafeURL {
private String _schema;
private String _location;
private String _name;
private String _description;
public SafeURL(String raw) {
parse(raw);
}
private void parse(String raw) {
if (raw != null) {
int index = raw.indexOf("://");
if ( (index <= 0) || (index + 1 >= raw.length()) )
return;
_schema = raw.substring(0, index);
_location = raw.substring(index+3);
_location = _location.replace('>', '_');
_location = _location.replace('<', '^');
}
}
public String getSchema() { return _schema; }
public String getLocation() { return _location; }
public String toString() { return _schema + "://" + _location; }
}

View File

@ -0,0 +1,81 @@
package net.i2p.syndie.data;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
import net.i2p.syndie.BlogManager;
/**
* Simple read-only summary of an archive, proxied to the BlogManager's instance
*/
public class TransparentArchiveIndex extends ArchiveIndex {
public TransparentArchiveIndex() { super(false); }
private static ArchiveIndex index() { return BlogManager.instance().getArchive().getIndex(); }
public String getVersion() { return index().getVersion(); }
public Properties getHeaders() { return index().getHeaders(); }
public int getAllBlogs() { return index().getAllBlogs(); }
public int getNewBlogs() { return index().getNewBlogs(); }
public int getAllEntries() { return index().getAllEntries(); }
public int getNewEntries() { return index().getNewEntries(); }
public long getTotalSize() { return index().getTotalSize(); }
public long getNewSize() { return index().getNewSize(); }
public long getGeneratedOn() { return index().getGeneratedOn(); }
public String getNewSizeStr() { return index().getNewSizeStr(); }
public String getTotalSizeStr() { return index().getTotalSizeStr(); }
/** how many blogs/tags are indexed */
public int getIndexBlogs() { return index().getIndexBlogs(); }
/** get the blog used for the given blog/tag pair */
public Hash getBlog(int index) { return index().getBlog(index); }
/** get the tag used for the given blog/tag pair */
public String getBlogTag(int index) { return index().getBlogTag(index); }
/** get the highest entry ID for the given blog/tag pair */
public long getBlogLastUpdated(int index) { return index().getBlogLastUpdated(index); }
/** get the entry count for the given blog/tag pair */
public int getBlogEntryCount(int index) { return index().getBlogEntryCount(index); }
/** get the entry from the given blog/tag pair */
public BlogURI getBlogEntry(int index, int entryIndex) { return index().getBlogEntry(index, entryIndex); }
/** get the raw entry size (including attachments) from the given blog/tag pair */
public long getBlogEntrySizeKB(int index, int entryIndex) { return index().getBlogEntrySizeKB(index, entryIndex); }
public boolean getEntryIsKnown(BlogURI uri) { return index().getEntryIsKnown(uri); }
public long getBlogEntrySizeKB(BlogURI uri) { return index().getBlogEntrySizeKB(uri); }
public Set getBlogEntryTags(BlogURI uri) { return index().getBlogEntryTags(uri); }
/** how many 'new' blogs are listed */
public int getNewestBlogCount() { return index().getNewestBlogCount(); }
public Hash getNewestBlog(int index) { return index().getNewestBlog(index); }
/** how many 'new' entries are listed */
public int getNewestBlogEntryCount() { return index().getNewestBlogEntryCount(); }
public BlogURI getNewestBlogEntry(int index) { return index().getNewestBlogEntry(index); }
/** list of locally known tags (String) under the given blog */
public List getBlogTags(Hash blog) { return index().getBlogTags(blog); }
/** list of unique blogs locally known (set of Hash) */
public Set getUniqueBlogs() { return index().getUniqueBlogs(); }
public void setLocation(String location) { return; }
public void setIsLocal(String val) { return; }
public void load(File location) throws IOException { return; }
/** load up the index from an archive.txt */
public void load(InputStream index) throws IOException { return; }
/**
* Dig through the index for BlogURIs matching the given criteria, ordering the results by
* their own entryIds.
*
* @param out where to store the matches
* @param blog if set, what blog key must the entries be under
* @param tag if set, what tag must the entry be in
*
*/
public void selectMatchesOrderByEntryId(List out, Hash blog, String tag) {
index().selectMatchesOrderByEntryId(out, blog, tag);
}
/** export the index into an archive.txt */
public String toString() { return index().toString(); }
}

View File

@ -0,0 +1,59 @@
package net.i2p.syndie.sml;
import java.util.List;
/**
*
*/
public class EventReceiverImpl implements SMLParser.EventReceiver {
public void receiveHeader(String header, String value) {
System.out.println("Receive header [" + header + "] = [" + value + "]");
}
public void receiveLink(String schema, String location, String text) {
System.out.println("Receive link [" + schema + "]/[" + location+ "]/[" + text + "]");
}
public void receiveBlog(String name, String blogKeyHash, String blogPath, long blogEntryId,
List blogArchiveLocations, String anchorText) {
System.out.println("Receive blog [" + name + "]/[" + blogKeyHash + "]/[" + blogPath
+ "]/[" + blogEntryId + "]/[" + blogArchiveLocations + "]/[" + anchorText + "]");
}
public void receiveArchive(String name, String description, String locationSchema, String location,
String postingKey, String anchorText) {
System.out.println("Receive archive [" + name + "]/[" + description + "]/[" + locationSchema
+ "]/[" + location + "]/[" + postingKey + "]/[" + anchorText + "]");
}
public void receiveImage(String alternateText, int attachmentId) {
System.out.println("Receive image [" + alternateText + "]/[" + attachmentId + "]");
}
public void receiveAddress(String name, String schema, String location, String anchorText) {
System.out.println("Receive address [" + name + "]/[" + schema + "]/[" + location + "]/[" + anchorText+ "]");
}
public void receiveBold(String text) { System.out.println("Receive bold [" + text+ "]"); }
public void receiveItalic(String text) { System.out.println("Receive italic [" + text+ "]"); }
public void receiveUnderline(String text) { System.out.println("Receive underline [" + text+ "]"); }
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation) {
System.out.println("Receive quote [" + text + "]/[" + whoQuoted + "]/[" + quoteLocationSchema + "]/[" + quoteLocation + "]");
}
public void receiveCode(String text, String codeLocationSchema, String codeLocation) {
System.out.println("Receive code [" + text+ "]/[" + codeLocationSchema + "]/[" + codeLocation + "]");
}
public void receiveCut(String summaryText) { System.out.println("Receive cut [" + summaryText + "]"); }
public void receivePlain(String text) { System.out.println("Receive plain [" + text + "]"); }
public void receiveNewline() { System.out.println("Receive NL"); }
public void receiveLT() { System.out.println("Receive LT"); }
public void receiveGT() { System.out.println("Receive GT"); }
public void receiveBegin() { System.out.println("Receive begin"); }
public void receiveEnd() { System.out.println("Receive end"); }
public void receiveHeaderEnd() { System.out.println("Receive header end"); }
public void receiveLeftBracket() { System.out.println("Receive ["); }
public void receiveRightBracket() { System.out.println("Receive ]"); }
public void receiveH1(String text) {}
public void receiveH2(String text) {}
public void receiveH3(String text) {}
public void receiveH4(String text) {}
public void receiveH5(String text) {}
public void receivePre(String text) {}
public void receiveHR() {}
public void receiveAttachment(int id, String anchorText) {}
}

View File

@ -0,0 +1,121 @@
package net.i2p.syndie.sml;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.web.*;
/**
*
*/
public class HTMLPreviewRenderer extends HTMLRenderer {
private List _filenames;
private List _fileTypes;
private List _files;
public HTMLPreviewRenderer(List filenames, List fileTypes, List files) {
super();
_filenames = filenames;
_fileTypes = fileTypes;
_files = files;
}
protected String getAttachmentURLBase() { return "viewtempattachment.jsp"; }
protected String getAttachmentURL(int id) {
return getAttachmentURLBase() + "?" +
ArchiveViewerBean.PARAM_ATTACHMENT + "=" + id;
}
public void receiveAttachment(int id, String anchorText) {
if (!continueBody()) { return; }
if ( (id < 0) || (_files == null) || (id >= _files.size()) ) {
_bodyBuffer.append(sanitizeString(anchorText));
} else {
File f = (File)_files.get(id);
String name = (String)_filenames.get(id);
String type = (String)_fileTypes.get(id);
_bodyBuffer.append("<a href=\"").append(getAttachmentURL(id)).append("\">");
_bodyBuffer.append(sanitizeString(anchorText)).append("</a>");
_bodyBuffer.append(" (").append(f.length()/1024).append("KB, ");
_bodyBuffer.append(" \"").append(sanitizeString(name)).append("\", ");
_bodyBuffer.append(sanitizeString(type)).append(")");
}
}
public void receiveEnd() {
_postBodyBuffer.append("</td></tr>\n");
_postBodyBuffer.append("<tr>\n");
_postBodyBuffer.append("<form action=\"").append(getAttachmentURLBase()).append("\">\n");
_postBodyBuffer.append("<td colspan=\"2\" valign=\"top\" align=\"left\" class=\"syndieEntryAttachmentsCell\"\n");
if (_files.size() > 0) {
_postBodyBuffer.append("<b>Attachments:</b> ");
_postBodyBuffer.append("<select name=\"").append(ArchiveViewerBean.PARAM_ATTACHMENT).append("\">\n");
for (int i = 0; i < _files.size(); i++) {
_postBodyBuffer.append("<option value=\"").append(i).append("\">");
File f = (File)_files.get(i);
String name = (String)_filenames.get(i);
String type = (String)_fileTypes.get(i);
_postBodyBuffer.append(sanitizeString(name));
_postBodyBuffer.append(" (").append(f.length()/1024).append("KB");
_postBodyBuffer.append(", type ").append(sanitizeString(type)).append(")</option>\n");
}
_postBodyBuffer.append("</select>\n");
_postBodyBuffer.append("<input type=\"submit\" value=\"Download\" name=\"Download\" /><br />\n");
}
if (_blogs.size() > 0) {
_postBodyBuffer.append("<b>Blog references:</b> ");
for (int i = 0; i < _blogs.size(); i++) {
Blog b = (Blog)_blogs.get(i);
_postBodyBuffer.append("<a href=\"").append(getPageURL(new Hash(Base64.decode(b.hash)), b.tag, b.entryId, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false)));
_postBodyBuffer.append("\">").append(sanitizeString(b.name)).append("</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_links.size() > 0) {
_postBodyBuffer.append("<b>External links:</b> ");
for (int i = 0; i < _links.size(); i++) {
Link l = (Link)_links.get(i);
_postBodyBuffer.append("<a href=\"externallink.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(l.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(l.location));
_postBodyBuffer.append("\">").append(sanitizeString(l.location));
_postBodyBuffer.append(" (").append(sanitizeString(l.schema)).append(")</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_addresses.size() > 0) {
_postBodyBuffer.append("<b>Addresses:</b> ");
for (int i = 0; i < _addresses.size(); i++) {
Address a = (Address)_addresses.get(i);
_postBodyBuffer.append("<a href=\"addaddress.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(a.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(a.location)).append("&name=");
_postBodyBuffer.append(sanitizeURL(a.name));
_postBodyBuffer.append("\">").append(sanitizeString(a.name));
}
_postBodyBuffer.append("<br />\n");
}
if (_archives.size() > 0) {
_postBodyBuffer.append("<b>Archives:</b>");
for (int i = 0; i < _archives.size(); i++) {
ArchiveRef a = (ArchiveRef)_archives.get(i);
_postBodyBuffer.append(" <a href=\"").append(getArchiveURL(null, new SafeURL(a.locationSchema + "://" + a.location)));
_postBodyBuffer.append("\">").append(sanitizeString(a.name)).append("</a>");
if (a.description != null)
_postBodyBuffer.append(": ").append(sanitizeString(a.description));
}
_postBodyBuffer.append("<br />\n");
}
_postBodyBuffer.append("</td>\n</form>\n</tr>\n");
_postBodyBuffer.append("</table>\n");
}
}

View File

@ -0,0 +1,823 @@
package net.i2p.syndie.sml;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.web.*;
/**
*
*/
public class HTMLRenderer extends EventReceiverImpl {
protected SMLParser _parser;
protected Writer _out;
protected User _user;
protected Archive _archive;
protected EntryContainer _entry;
protected boolean _showImages;
protected boolean _cutBody;
protected boolean _cutReached;
protected int _cutSize;
protected int _lastNewlineAt;
protected Map _headers;
protected List _addresses;
protected List _links;
protected List _blogs;
protected List _archives;
protected StringBuffer _preBodyBuffer;
protected StringBuffer _bodyBuffer;
protected StringBuffer _postBodyBuffer;
public HTMLRenderer() {
_parser = new SMLParser();
}
/**
* Usage: HTMLRenderer smlFile outputFile
*/
public static void main(String args[]) {
if (args.length != 2) {
System.err.println("Usage: HTMLRenderer smlFile outputFile");
return;
}
HTMLRenderer renderer = new HTMLRenderer();
Writer out = null;
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024*512);
FileInputStream in = new FileInputStream(args[0]);
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
baos.write(buf, 0, read);
out = new OutputStreamWriter(new FileOutputStream(args[1]), "UTF-8");
renderer.render(new User(), BlogManager.instance().getArchive(), null, DataHelper.getUTF8(baos.toByteArray()), out, false, true);
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public void renderUnknownEntry(User user, Archive archive, BlogURI uri, Writer out) throws IOException {
BlogInfo info = archive.getBlogInfo(uri);
if (info == null)
out.write("<br />The blog " + uri.getKeyHash().toBase64() + " is not known locally. "
+ "Please get it from an archive and <a href=\""
+ getPageURL(uri.getKeyHash(), null, uri.getEntryId(), -1, -1, user.getShowExpanded(), user.getShowImages())
+ "\">try again</a>");
else
out.write("<br />The blog <a href=\""
+ getPageURL(uri.getKeyHash(), null, -1, -1, -1, user.getShowExpanded(), user.getShowImages())
+ "\">" + info.getProperty(BlogInfo.NAME) + "</a> is known, but the entry " + uri.getEntryId() + " is not. "
+ "Please get it from an archive and <a href=\""
+ getPageURL(uri.getKeyHash(), null, uri.getEntryId(), -1, -1, user.getShowExpanded(), user.getShowImages())
+ "\">try again</a>");
}
public void render(User user, Archive archive, EntryContainer entry, Writer out, boolean cutBody, boolean showImages) throws IOException {
if (entry == null)
return;
render(user, archive, entry, entry.getEntry().getText(), out, cutBody, showImages);
}
public void render(User user, Archive archive, EntryContainer entry, String rawSML, Writer out, boolean cutBody, boolean showImages) throws IOException {
_user = user;
_archive = archive;
_entry = entry;
_out = out;
_headers = new HashMap();
_preBodyBuffer = new StringBuffer(1024);
_bodyBuffer = new StringBuffer(1024);
_postBodyBuffer = new StringBuffer(1024);
_addresses = new ArrayList();
_links = new ArrayList();
_blogs = new ArrayList();
_archives = new ArrayList();
_cutBody = cutBody;
_showImages = showImages;
_cutReached = false;
_cutSize = 1024;
_parser.parse(rawSML, this);
_out.write(_preBodyBuffer.toString());
_out.write(_bodyBuffer.toString());
_out.write(_postBodyBuffer.toString());
//int len = _preBodyBuffer.length() + _bodyBuffer.length() + _postBodyBuffer.length();
//System.out.println("Wrote " + len);
}
public void receivePlain(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append(sanitizeString(text));
}
public void receiveBold(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append("<b>").append(sanitizeString(text)).append("</b>");
}
public void receiveItalic(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append("<i>").append(sanitizeString(text)).append("</i>");
}
public void receiveUnderline(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append("<u>").append(sanitizeString(text)).append("</u>");
}
public void receiveHR() {
if (!continueBody()) { return; }
_bodyBuffer.append("<hr />");
}
public void receiveH1(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h1>").append(sanitizeString(body)).append("</h1>");
}
public void receiveH2(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h2>").append(sanitizeString(body)).append("</h2>");
}
public void receiveH3(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h3>").append(sanitizeString(body)).append("</h3>");
}
public void receiveH4(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h4>").append(sanitizeString(body)).append("</h4>");
}
public void receiveH5(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h5>").append(sanitizeString(body)).append("</h5>");
}
public void receivePre(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<pre>").append(sanitizeString(body)).append("</pre>");
}
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation) {
if (!continueBody()) { return; }
_bodyBuffer.append("<quote>").append(sanitizeString(text)).append("</quote>");
}
public void receiveCode(String text, String codeLocationSchema, String codeLocation) {
if (!continueBody()) { return; }
_bodyBuffer.append("<code>").append(sanitizeString(text)).append("</code>");
}
public void receiveImage(String alternateText, int attachmentId) {
if (!continueBody()) { return; }
if (_showImages) {
_bodyBuffer.append("<img src=\"").append(getAttachmentURL(attachmentId)).append("\"");
if (alternateText != null)
_bodyBuffer.append(" alt=\"").append(sanitizeTagParam(alternateText)).append("\"");
_bodyBuffer.append(" />");
} else {
_bodyBuffer.append("[image: attachment ").append(attachmentId);
_bodyBuffer.append(": ").append(sanitizeString(alternateText));
_bodyBuffer.append(" <a href=\"").append(getEntryURL(true)).append("\">view images</a>]");
}
}
public void receiveCut(String summaryText) {
if (!continueBody()) { return; }
_cutReached = true;
if (_cutBody) {
_bodyBuffer.append("<a href=\"").append(getEntryURL()).append("\">");
if ( (summaryText != null) && (summaryText.length() > 0) )
_bodyBuffer.append(sanitizeString(summaryText));
else
_bodyBuffer.append("more inside...");
_bodyBuffer.append("</a>\n");
} else {
if (summaryText != null)
_bodyBuffer.append(sanitizeString(summaryText));
}
}
/** are we either before the cut or rendering without cutting? */
protected boolean continueBody() {
boolean rv = ( (!_cutReached) && (_bodyBuffer.length() <= _cutSize) ) || (!_cutBody);
//if (!rv)
// System.out.println("rv: " + rv + " Cut reached: " + _cutReached + " bodyBufferSize: " + _bodyBuffer.length() + " cutBody? " + _cutBody);
if (!rv && !_cutReached) {
// exceeded the allowed size
_bodyBuffer.append("<a href=\"").append(getEntryURL()).append("\">more inside...</a>");
_cutReached = true;
}
return rv;
}
public void receiveNewline() {
if (!continueBody()) { return; }
if (true || (_lastNewlineAt >= _bodyBuffer.length())) // always true: the alternate branch below that would suppress extra line breaks is disabled
_bodyBuffer.append("<br />\n");
else
_lastNewlineAt = _bodyBuffer.length();
}
public void receiveLT() {
if (!continueBody()) { return; }
_bodyBuffer.append("&lt;");
}
public void receiveGT() {
if (!continueBody()) { return; }
_bodyBuffer.append("&gt;");
}
public void receiveBegin() {}
public void receiveLeftBracket() {
if (!continueBody()) { return; }
_bodyBuffer.append('[');
}
public void receiveRightBracket() {
if (!continueBody()) { return; }
_bodyBuffer.append(']');
}
protected static class Blog {
public String name;
public String hash;
public String tag;
public long entryId;
public List locations;
public int hashCode() { return -1; }
public boolean equals(Object o) {
Blog b = (Blog)o;
return DataHelper.eq(hash, b.hash) && DataHelper.eq(tag, b.tag) && DataHelper.eq(name, b.name)
&& DataHelper.eq(entryId, b.entryId) && DataHelper.eq(locations, b.locations);
}
}
/**
* when we see a link to a blog, we may want to:
* = view the blog entry
* = view all entries in that blog
* = view all entries in that blog with the given tag
* = view the blog's metadata
* = [fetch the blog from other locations]
* = [add the blog's locations to our list of known locations]
* = [shitlist the blog]
* = [add the blog to one of our groups]
*
* [blah] implies *later*.
*
* Currently renders to:
* <a href="$entryURL">$description</a>
* [blog: <a href="$blogURL">$name</a> (<a href="$metaURL">meta</a>)
* [tag: <a href="$blogTagURL">$tag</a>]
* archived at $location*]
*
*/
public void receiveBlog(String name, String hash, String tag, long entryId, List locations, String description) {
System.out.println("Receiving the blog: " + name + "/" + hash + "/" + tag + "/" + entryId +"/" + locations + ": "+ description);
byte blogData[] = Base64.decode(hash);
if ( (blogData == null) || (blogData.length != Hash.HASH_LENGTH) )
return;
Blog b = new Blog();
b.name = name;
b.hash = hash;
b.tag = tag;
b.entryId = entryId;
b.locations = locations;
if (!_blogs.contains(b))
_blogs.add(b);
if (!continueBody()) { return; }
if (hash == null) return;
Hash blog = new Hash(blogData);
if (entryId > 0) {
String pageURL = getPageURL(blog, tag, entryId, -1, -1, true, (_user != null ? _user.getShowImages() : false));
_bodyBuffer.append("<a href=\"").append(pageURL).append("\">");
if ( (description != null) && (description.trim().length() > 0) ) {
_bodyBuffer.append(sanitizeString(description));
} else if ( (name != null) && (name.trim().length() > 0) ) {
_bodyBuffer.append(sanitizeString(name));
} else {
_bodyBuffer.append("[view entry]");
}
_bodyBuffer.append("</a>");
}
String url = getPageURL(blog, null, -1, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false));
_bodyBuffer.append(" [<a href=\"").append(url);
_bodyBuffer.append("\">");
if ( (name != null) && (name.trim().length() > 0) )
_bodyBuffer.append(sanitizeString(name));
else
_bodyBuffer.append("view");
_bodyBuffer.append("</a> (<a href=\"").append(getMetadataURL(blog)).append("\">meta</a>)");
if ( (tag != null) && (tag.trim().length() > 0) ) {
url = getPageURL(blog, tag, -1, -1, -1, false, false);
_bodyBuffer.append(" <a href=\"").append(url);
_bodyBuffer.append("\">Tag: ").append(sanitizeString(tag)).append("</a>");
}
if ( (locations != null) && (locations.size() > 0) ) {
_bodyBuffer.append(" Archives: ");
for (int i = 0; i < locations.size(); i++) {
SafeURL surl = (SafeURL)locations.get(i);
if ( (_user != null) && _user.getAuthenticated() && _user.getAllowAccessRemote())
_bodyBuffer.append("<a href=\"").append(getArchiveURL(blog, surl)).append("\">").append(sanitizeString(surl.toString())).append("</a> ");
else
_bodyBuffer.append(sanitizeString(surl.toString())).append(' ');
}
}
_bodyBuffer.append("] ");
}
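// Illustrative example only (all concrete values invented; the attribute names are the ones SMLParser defines):
// an SML reference such as
//   [blog name="frosk" bloghash="<base64 key hash>" blogtag="tech" blogentry="1125532800123" archive0="eep://syndie.i2p/archive/"]nice post[/blog]
// is rendered by receiveBlog above roughly as
//   <a href="index.jsp?...">nice post</a> [<a href="index.jsp?...">frosk</a> (<a href="viewmetadata.jsp?...">meta</a>) <a href="index.jsp?...">Tag: tech</a> Archives: eep://syndie.i2p/archive/ ]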
protected static class ArchiveRef {
public String name;
public String description;
public String locationSchema;
public String location;
public int hashCode() { return -1; }
public boolean equals(Object o) {
ArchiveRef a = (ArchiveRef)o;
return DataHelper.eq(name, a.name) && DataHelper.eq(description, a.description)
&& DataHelper.eq(locationSchema, a.locationSchema)
&& DataHelper.eq(location, a.location);
}
}
public void receiveArchive(String name, String description, String locationSchema, String location,
String postingKey, String anchorText) {
ArchiveRef a = new ArchiveRef();
a.name = name;
a.description = description;
a.locationSchema = locationSchema;
a.location = location;
if (!_archives.contains(a))
_archives.add(a);
if (!continueBody()) { return; }
_bodyBuffer.append(sanitizeString(anchorText)).append(" [Archive ");
if (name != null)
_bodyBuffer.append(sanitizeString(name));
if (location != null) {
_bodyBuffer.append(" at ");
SafeURL surl = new SafeURL(locationSchema + "://" + location);
_bodyBuffer.append("<a href=\"").append(getArchiveURL(null, surl));
_bodyBuffer.append("\">").append(sanitizeString(surl.toString())).append("</a>");
}
if (description != null)
_bodyBuffer.append(": ").append(sanitizeString(description));
_bodyBuffer.append("]");
}
protected static class Link {
public String schema;
public String location;
public int hashCode() { return -1; }
public boolean equals(Object o) {
Link l = (Link)o;
return DataHelper.eq(schema, l.schema) && DataHelper.eq(location, l.location);
}
}
public void receiveLink(String schema, String location, String text) {
Link l = new Link();
l.schema = schema;
l.location = location;
if (!_links.contains(l))
_links.add(l);
if (!continueBody()) { return; }
if ( (schema == null) || (location == null) ) return;
_bodyBuffer.append("<a href=\"externallink.jsp?schema=");
_bodyBuffer.append(sanitizeURL(schema)).append("&location=");
_bodyBuffer.append(sanitizeURL(location)).append("&description=");
_bodyBuffer.append(sanitizeURL(text)).append("\">").append(sanitizeString(text)).append("</a>");
}
protected static class Address {
public String name;
public String schema;
public String location;
public int hashCode() { return -1; }
public boolean equals(Object o) {
Address a = (Address)o;
return DataHelper.eq(schema, a.schema) && DataHelper.eq(location, a.location) && DataHelper.eq(name, a.name);
}
}
public void receiveAddress(String name, String schema, String location, String anchorText) {
Address a = new Address();
a.name = name;
a.schema = schema;
a.location = location;
if (!_addresses.contains(a))
_addresses.add(a);
if (!continueBody()) { return; }
if ( (schema == null) || (location == null) ) return;
_bodyBuffer.append("<a href=\"addaddress.jsp?schema=");
_bodyBuffer.append(sanitizeURL(schema)).append("&name=");
_bodyBuffer.append(sanitizeURL(name)).append("&location=");
_bodyBuffer.append(sanitizeURL(location)).append("\">").append(sanitizeString(anchorText)).append("</a>");
}
public void receiveAttachment(int id, String anchorText) {
if (!continueBody()) { return; }
Attachment attachments[] = (_entry != null ? _entry.getAttachments() : null);
if ( (attachments == null) || (id < 0) || (id >= attachments.length)) {
_bodyBuffer.append(sanitizeString(anchorText));
} else {
_bodyBuffer.append("<a href=\"").append(getAttachmentURL(id)).append("\">");
_bodyBuffer.append(sanitizeString(anchorText)).append("</a>");
_bodyBuffer.append(" (").append(attachments[id].getDataLength()/1024).append("KB, ");
_bodyBuffer.append(" \"").append(sanitizeString(attachments[id].getName())).append("\", ");
_bodyBuffer.append(sanitizeString(attachments[id].getMimeType())).append(")");
}
}
public void receiveEnd() {
_postBodyBuffer.append("</td></tr>\n");
if (_cutBody) {
_postBodyBuffer.append("<tr class=\"syndieEntryAttachmentsCell\">\n");
_postBodyBuffer.append("<td colspan=\"2\" valign=\"top\" align=\"left\" class=\"syndieEntryAttachmentsCell\">");
_postBodyBuffer.append("<a href=\"").append(getEntryURL()).append("\">View details...</a> ");
if ( (_entry != null) && (_entry.getAttachments() != null) && (_entry.getAttachments().length > 0) ) {
int num = _entry.getAttachments().length;
if (num == 1)
_postBodyBuffer.append("1 attachment ");
else
_postBodyBuffer.append(num + " attachments ");
}
int blogs = _blogs.size();
if (blogs == 1)
_postBodyBuffer.append("1 blog reference ");
else if (blogs > 1)
_postBodyBuffer.append(blogs).append(" blog references ");
int links = _links.size();
if (links == 1)
_postBodyBuffer.append("1 external link ");
else if (links > 1)
_postBodyBuffer.append(links).append(" external links");
int addrs = _addresses.size();
if (addrs == 1)
_postBodyBuffer.append("1 address ");
else if (addrs > 1)
_postBodyBuffer.append(addrs).append(" addresses ");
int archives = _archives.size();
if (archives == 1)
_postBodyBuffer.append("1 archive ");
else if (archives > 1)
_postBodyBuffer.append(archives).append(" archives ");
if (_entry != null) {
List replies = _archive.getIndex().getReplies(_entry.getURI());
if ( (replies != null) && (replies.size() > 0) ) {
if (replies.size() == 1)
_postBodyBuffer.append("1 reply ");
else
_postBodyBuffer.append(replies.size()).append(" replies ");
}
}
String inReplyTo = (String)_headers.get(HEADER_IN_REPLY_TO);
if ( (inReplyTo != null) && (inReplyTo.trim().length() > 0) )
_postBodyBuffer.append(" <a href=\"").append(getPageURL(sanitizeTagParam(inReplyTo))).append("\">(view parent)</a>\n");
_postBodyBuffer.append("</td></tr>\n");
} else {
_postBodyBuffer.append("<tr class=\"syndieEntryAttachmentsCell\">\n");
_postBodyBuffer.append("<form action=\"").append(getAttachmentURLBase()).append("\">\n");
_postBodyBuffer.append("<input type=\"hidden\" name=\"").append(ArchiveViewerBean.PARAM_BLOG);
_postBodyBuffer.append("\" value=\"");
if (_entry != null)
_postBodyBuffer.append(Base64.encode(_entry.getURI().getKeyHash().getData()));
else
_postBodyBuffer.append("unknown");
_postBodyBuffer.append("\" />\n");
_postBodyBuffer.append("<input type=\"hidden\" name=\"").append(ArchiveViewerBean.PARAM_ENTRY);
_postBodyBuffer.append("\" value=\"");
if (_entry != null)
_postBodyBuffer.append(_entry.getURI().getEntryId());
else
_postBodyBuffer.append("unknown");
_postBodyBuffer.append("\" />\n");
_postBodyBuffer.append("<td colspan=\"2\" valign=\"top\" align=\"left\" class=\"syndieEntryAttachmentsCell\">\n");
if ( (_entry != null) && (_entry.getAttachments() != null) && (_entry.getAttachments().length > 0) ) {
_postBodyBuffer.append("<b>Attachments:</b> ");
_postBodyBuffer.append("<select name=\"").append(ArchiveViewerBean.PARAM_ATTACHMENT).append("\">\n");
for (int i = 0; i < _entry.getAttachments().length; i++) {
_postBodyBuffer.append("<option value=\"").append(i).append("\">");
Attachment a = _entry.getAttachments()[i];
_postBodyBuffer.append(sanitizeString(a.getName()));
if ( (a.getDescription() != null) && (a.getDescription().trim().length() > 0) ) {
_postBodyBuffer.append(": ");
_postBodyBuffer.append(sanitizeString(a.getDescription()));
}
_postBodyBuffer.append(" (").append(a.getDataLength()/1024).append("KB");
_postBodyBuffer.append(", type ").append(sanitizeString(a.getMimeType())).append(")</option>\n");
}
_postBodyBuffer.append("</select>\n");
_postBodyBuffer.append("<input type=\"submit\" value=\"Download\" name=\"Download\" /><br />\n");
}
if (_blogs.size() > 0) {
_postBodyBuffer.append("<b>Blog references:</b> ");
for (int i = 0; i < _blogs.size(); i++) {
Blog b = (Blog)_blogs.get(i);
_postBodyBuffer.append("<a href=\"").append(getPageURL(new Hash(Base64.decode(b.hash)), b.tag, b.entryId, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false)));
_postBodyBuffer.append("\">").append(sanitizeString(b.name)).append("</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_links.size() > 0) {
_postBodyBuffer.append("<b>External links:</b> ");
for (int i = 0; i < _links.size(); i++) {
Link l = (Link)_links.get(i);
_postBodyBuffer.append("<a href=\"externallink.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(l.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(l.location));
_postBodyBuffer.append("\">").append(sanitizeString(l.location));
_postBodyBuffer.append(" (").append(sanitizeString(l.schema)).append(")</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_addresses.size() > 0) {
_postBodyBuffer.append("<b>Addresses:</b>");
for (int i = 0; i < _addresses.size(); i++) {
Address a = (Address)_addresses.get(i);
_postBodyBuffer.append(" <a href=\"addaddress.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(a.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(a.location)).append("&name=");
_postBodyBuffer.append(sanitizeURL(a.name));
_postBodyBuffer.append("\">").append(sanitizeString(a.name));
}
_postBodyBuffer.append("<br />\n");
}
if (_archives.size() > 0) {
_postBodyBuffer.append("<b>Archives:</b>");
for (int i = 0; i < _archives.size(); i++) {
ArchiveRef a = (ArchiveRef)_archives.get(i);
_postBodyBuffer.append(" <a href=\"").append(getArchiveURL(null, new SafeURL(a.locationSchema + "://" + a.location)));
_postBodyBuffer.append("\">").append(sanitizeString(a.name)).append("</a>");
if (a.description != null)
_postBodyBuffer.append(": ").append(sanitizeString(a.description));
}
_postBodyBuffer.append("<br />\n");
}
if (_entry != null) {
List replies = _archive.getIndex().getReplies(_entry.getURI());
if ( (replies != null) && (replies.size() > 0) ) {
_postBodyBuffer.append("<b>Replies:</b> ");
for (int i = 0; i < replies.size(); i++) {
BlogURI reply = (BlogURI)replies.get(i);
_postBodyBuffer.append("<a href=\"");
_postBodyBuffer.append(getPageURL(reply.getKeyHash(), null, reply.getEntryId(), -1, -1, true, (_user != null ? _user.getShowImages() : false)));
_postBodyBuffer.append("\">");
BlogInfo replyAuthor = _archive.getBlogInfo(reply);
if (replyAuthor != null) {
_postBodyBuffer.append(sanitizeString(replyAuthor.getProperty(BlogInfo.NAME)));
} else {
_postBodyBuffer.append(reply.getKeyHash().toBase64().substring(0,16));
}
_postBodyBuffer.append(" on ");
_postBodyBuffer.append(getEntryDate(reply.getEntryId()));
_postBodyBuffer.append("</a> ");
}
_postBodyBuffer.append("<br />");
}
}
String inReplyTo = (String)_headers.get(HEADER_IN_REPLY_TO);
if ( (inReplyTo != null) && (inReplyTo.trim().length() > 0) ) {
_postBodyBuffer.append(" <a href=\"").append(getPageURL(sanitizeTagParam(inReplyTo))).append("\">(view parent)</a><br />\n");
}
_postBodyBuffer.append("</td>\n</form>\n</tr>\n");
}
_postBodyBuffer.append("</table>\n");
}
public void receiveHeader(String header, String value) {
//System.err.println("Receive header [" + header + "] = [" + value + "]");
_headers.put(header, value);
}
public void receiveHeaderEnd() {
_preBodyBuffer.append("<table width=\"100%\" border=\"0\">\n");
renderSubjectCell();
renderMetaCell();
renderPreBodyCell();
}
public static final String HEADER_SUBJECT = "Subject";
public static final String HEADER_BGCOLOR = "bgcolor";
public static final String HEADER_IN_REPLY_TO = "InReplyTo";
private void renderSubjectCell() {
_preBodyBuffer.append("<tr class=\"syndieEntrySubjectCell\"><td align=\"left\" valign=\"top\" class=\"syndieEntrySubjectCell\" width=\"400\"> ");
String subject = (String)_headers.get(HEADER_SUBJECT);
if (subject == null)
subject = "[no subject]";
_preBodyBuffer.append(sanitizeString(subject));
_preBodyBuffer.append("</td>\n");
}
private void renderPreBodyCell() {
String bgcolor = (String)_headers.get(HEADER_BGCOLOR);
if (_cutBody)
_preBodyBuffer.append("<tr class=\"syndieEntrySummaryCell\"><td colspan=\"2\" align=\"left\" valign=\"top\" class=\"syndieEntrySummaryCell\" " + (bgcolor != null ? "bgcolor=\"" + sanitizeTagParam(bgcolor) + "\"" : "") + "\">");
else
_preBodyBuffer.append("<tr class=\"syndieEntryBodyCell\"><td colspan=\"2\" align=\"left\" valign=\"top\" class=\"syndieEntryBodyCell\" " + (bgcolor != null ? "bgcolor=\"" + sanitizeTagParam(bgcolor) + "\"" : "") + "\">");
}
private void renderMetaCell() {
String tags[] = (_entry != null ? _entry.getTags() : null);
if ( (tags != null) && (tags.length > 0) )
_preBodyBuffer.append("<form action=\"index.jsp\">");
_preBodyBuffer.append("<td nowrap=\"true\" align=\"right\" valign=\"top\" class=\"syndieEntryMetaCell\">\n");
BlogInfo info = null;
if (_entry != null)
info = _archive.getBlogInfo(_entry.getURI());
if (info != null) {
_preBodyBuffer.append("<a href=\"").append(getMetadataURL()).append("\">");
String nameStr = info.getProperty("Name");
if (nameStr == null)
_preBodyBuffer.append("[no name]");
else
_preBodyBuffer.append(sanitizeString(nameStr));
_preBodyBuffer.append("</a>");
} else {
_preBodyBuffer.append("[unknown blog]");
}
if ( (tags != null) && (tags.length > 0) ) {
_preBodyBuffer.append(" Tags: ");
_preBodyBuffer.append("<select name=\"selector\">");
for (int i = 0; tags != null && i < tags.length; i++) {
_preBodyBuffer.append("<option value=\"blogtag://");
_preBodyBuffer.append(_entry.getURI().getKeyHash().toBase64());
_preBodyBuffer.append('/').append(Base64.encode(DataHelper.getUTF8(tags[i]))).append("\">");
_preBodyBuffer.append(sanitizeString(tags[i]));
_preBodyBuffer.append("</option>\n");
/*
_preBodyBuffer.append("<a href=\"");
_preBodyBuffer.append(getPageURL(_entry.getURI().getKeyHash(), tags[i], -1, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false)));
_preBodyBuffer.append("\">");
_preBodyBuffer.append(sanitizeString(tags[i]));
_preBodyBuffer.append("</a>");
if (i + 1 < tags.length)
_preBodyBuffer.append(", ");
*/
}
_preBodyBuffer.append("</select>");
_preBodyBuffer.append("<input type=\"submit\" value=\"View\" />\n");
//_preBodyBuffer.append("</i>");
}
_preBodyBuffer.append(" ");
/*
String inReplyTo = (String)_headers.get(HEADER_IN_REPLY_TO);
if ( (inReplyTo != null) && (inReplyTo.trim().length() > 0) )
_preBodyBuffer.append(" <a href=\"").append(getPageURL(sanitizeTagParam(inReplyTo))).append("\">In reply to</a>\n");
*/
if (_entry != null)
_preBodyBuffer.append(getEntryDate(_entry.getURI().getEntryId()));
else
_preBodyBuffer.append(getEntryDate(new Date().getTime()));
if ( (_user != null) && (_user.getAuthenticated()) )
_preBodyBuffer.append(" <a href=\"").append(getPostURL(_user.getBlog(), true)).append("\">Reply</a>\n");
_preBodyBuffer.append("\n</td>");
if ( (tags != null) && (tags.length > 0) )
_preBodyBuffer.append("</form>");
_preBodyBuffer.append("</tr>\n");
}
private final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
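// getEntryDate() below treats the entry id as a millisecond timestamp and renders it as the calendar
// day plus the millisecond offset into that day, e.g. "2005/09/01.123" for 123ms after midnight.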
private final String getEntryDate(long when) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(when));
long dayBegin = _dateFormat.parse(str).getTime();
return str + "." + (when - dayBegin);
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return "unknown";
}
}
}
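// Examples for the sanitize* helpers below: sanitizeString("<b>hi</b>") gives "_b-hi_/b-";
// sanitizeTagParam converts double quotes to single quotes and '&' to '_' before sanitizing;
// sanitizeURL() simply base64-encodes the UTF-8 bytes of its argument.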
public static final String sanitizeString(String str) { return sanitizeString(str, true); }
public static final String sanitizeString(String str, boolean allowNL) {
if (str == null) return null;
boolean unsafe = false;
unsafe = unsafe || str.indexOf('<') >= 0;
unsafe = unsafe || str.indexOf('>') >= 0;
if (!allowNL) {
unsafe = unsafe || str.indexOf('\n') >= 0;
unsafe = unsafe || str.indexOf('\r') >= 0;
unsafe = unsafe || str.indexOf('\f') >= 0;
}
if (!unsafe) return str;
str = str.replace('<', '_'); // this should be &lt;
str = str.replace('>', '-'); // this should be &gt;
if (!allowNL) {
str = str.replace('\n', ' ');
str = str.replace('\r', ' ');
str = str.replace('\f', ' ');
}
return str;
}
public static final String sanitizeURL(String str) { return Base64.encode(DataHelper.getUTF8(str)); }
public static final String sanitizeTagParam(String str) {
str = str.replace('&', '_'); // this should be &amp;
if (str.indexOf('\"') < 0)
return sanitizeString(str);
str = str.replace('\"', '\'');
return sanitizeString(str);
}
private String getEntryURL() { return getEntryURL(_user != null ? _user.getShowImages() : false); }
private String getEntryURL(boolean showImages) {
if (_entry == null) return "unknown";
return "index.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" +
Base64.encode(_entry.getURI().getKeyHash().getData()) +
"&" + ArchiveViewerBean.PARAM_ENTRY + "=" + _entry.getURI().getEntryId() +
"&" + ArchiveViewerBean.PARAM_SHOW_IMAGES + (showImages ? "=true" : "=false") +
"&" + ArchiveViewerBean.PARAM_EXPAND_ENTRIES + "=true";
}
protected String getAttachmentURLBase() { return "viewattachment.jsp"; }
protected String getAttachmentURL(int id) {
if (_entry == null) return "unknown";
return getAttachmentURLBase() + "?" +
ArchiveViewerBean.PARAM_BLOG + "=" +
Base64.encode(_entry.getURI().getKeyHash().getData()) +
"&" + ArchiveViewerBean.PARAM_ENTRY + "=" + _entry.getURI().getEntryId() +
"&" + ArchiveViewerBean.PARAM_ATTACHMENT + "=" + id;
}
public String getMetadataURL() {
if (_entry == null) return "unknown";
return getMetadataURL(_entry.getURI().getKeyHash());
}
public static String getMetadataURL(Hash blog) {
return "viewmetadata.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" +
Base64.encode(blog.getData());
}
public static String getPostURL(Hash blog) {
return "post.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" + Base64.encode(blog.getData());
}
public String getPostURL(Hash blog, boolean asReply) {
if (asReply && _entry != null) {
return "post.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" + Base64.encode(blog.getData())
+ "&" + ArchiveViewerBean.PARAM_IN_REPLY_TO + '='
+ Base64.encode("entry://" + _entry.getURI().getKeyHash().toBase64() + "/" + _entry.getURI().getEntryId());
} else {
return getPostURL(blog);
}
}
public String getPageURL(String selector) { return getPageURL(_user, selector); }
public static String getPageURL(User user, String selector) { return getPageURL(user, selector, -1, -1); }
public static String getPageURL(User user, String selector, int numPerPage, int pageNum) {
StringBuffer buf = new StringBuffer(128);
buf.append("index.jsp?");
buf.append("selector=").append(sanitizeTagParam(selector)).append("&");
if ( (pageNum >= 0) && (numPerPage > 0) ) {
buf.append(ArchiveViewerBean.PARAM_PAGE_NUMBER).append('=').append(pageNum).append('&');
buf.append(ArchiveViewerBean.PARAM_NUM_PER_PAGE).append('=').append(numPerPage).append('&');
}
buf.append(ArchiveViewerBean.PARAM_EXPAND_ENTRIES).append('=').append(user.getShowExpanded()).append('&');
buf.append(ArchiveViewerBean.PARAM_SHOW_IMAGES).append('=').append(user.getShowImages()).append('&');
return buf.toString();
}
public static String getPageURL(Hash blog, String tag, long entryId, int numPerPage, int pageNum, boolean expandEntries, boolean showImages) {
return getPageURL(blog, tag, entryId, null, numPerPage, pageNum, expandEntries, showImages);
}
public static String getPageURL(Hash blog, String tag, long entryId, String group, int numPerPage, int pageNum, boolean expandEntries, boolean showImages) {
StringBuffer buf = new StringBuffer(128);
buf.append("index.jsp?");
if (blog != null)
buf.append(ArchiveViewerBean.PARAM_BLOG).append('=').append(Base64.encode(blog.getData())).append('&');
if (tag != null)
buf.append(ArchiveViewerBean.PARAM_TAG).append('=').append(Base64.encode(DataHelper.getUTF8(tag))).append('&');
if (entryId >= 0)
buf.append(ArchiveViewerBean.PARAM_ENTRY).append('=').append(entryId).append('&');
if (group != null)
buf.append(ArchiveViewerBean.PARAM_GROUP).append('=').append(Base64.encode(DataHelper.getUTF8(group))).append('&');
if ( (pageNum >= 0) && (numPerPage > 0) ) {
buf.append(ArchiveViewerBean.PARAM_PAGE_NUMBER).append('=').append(pageNum).append('&');
buf.append(ArchiveViewerBean.PARAM_NUM_PER_PAGE).append('=').append(numPerPage).append('&');
}
buf.append(ArchiveViewerBean.PARAM_EXPAND_ENTRIES).append('=').append(expandEntries).append('&');
buf.append(ArchiveViewerBean.PARAM_SHOW_IMAGES).append('=').append(showImages).append('&');
return buf.toString();
}
public static String getArchiveURL(Hash blog, SafeURL archiveLocation) {
return "remote.jsp?"
//+ "action=Continue..." // should this be the case?
+ "&schema=" + sanitizeTagParam(archiveLocation.getSchema())
+ "&location=" + sanitizeTagParam(archiveLocation.getLocation());
}
}
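For orientation, here is a minimal sketch of driving the renderer above directly from raw SML. It is illustrative only: the SML text and the RenderSketch class name are invented, while render(), User, BlogManager and Archive are the ones that appear in the sources in this change.

import java.io.*;
import net.i2p.syndie.*;
import net.i2p.syndie.sml.*;

public class RenderSketch {
    public static void main(String args[]) throws IOException {
        String sml = "Subject: hello\n\n[b]Hi[/b] from [i]SML[/i][cut]more inside[/cut]the rest of the post";
        Writer out = new OutputStreamWriter(System.out, "UTF-8");
        // entry == null and cutBody == true: the body is truncated at the [cut] tag and a
        // "more inside" link is emitted instead of the remainder
        new HTMLRenderer().render(new User(), BlogManager.instance().getArchive(), null, sml, out, true, true);
        out.flush();
    }
}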


@@ -0,0 +1,442 @@
package net.i2p.syndie.sml;
import java.util.*;
import net.i2p.syndie.data.*;
/**
* Parse out the SML from the text, firing off info to the receiver whenever certain
* elements are available. This is a very simple parser, with no support for nested
* tags. A simple stack would be good to add, but DTSTTCPW (do the simplest thing that could possibly work).
*
*
*/
public class SMLParser {
private static final char TAG_BEGIN = '[';
private static final char TAG_END = ']';
private static final char LT = '<';
private static final char GT = '>';
private static final char EQ = '=';
private static final char DQUOTE = '"';
private static final char QUOTE = '\'';
private static final String WHITESPACE = " \t\n\r";
private static final char NL = '\n';
private static final char CR = '\r';
private static final char LF = '\f'; // actually the form feed character - treated as a line break below, like NL and CR
public void parse(String rawSML, EventReceiver receiver) {
receiver.receiveBegin();
int off = 0;
off = parseHeaders(rawSML, off, receiver);
receiver.receiveHeaderEnd();
parseBody(rawSML, off, receiver);
receiver.receiveEnd();
}
private int parseHeaders(String rawSML, int off, EventReceiver receiver) {
if (rawSML == null) return off;
int len = rawSML.length();
if (len == off) return off;
int keyBegin = off;
int valBegin = -1;
while (off < len) {
char c = rawSML.charAt(off);
if ( (c == ':') && (valBegin < 0) ) {
// moving on to the value
valBegin = off + 1;
} else if (c == '\n') {
if (valBegin < 0) {
// end of the headers
off++;
break;
} else {
String key = rawSML.substring(keyBegin, valBegin-1);
String val = rawSML.substring(valBegin, off);
receiver.receiveHeader(key.trim(), val.trim());
valBegin = -1;
keyBegin = off + 1;
}
}
off++;
}
if ( (off >= len) && (valBegin > 0) ) {
String key = rawSML.substring(keyBegin, valBegin-1);
String val = rawSML.substring(valBegin, len);
receiver.receiveHeader(key.trim(), val.trim());
}
return off;
}
private void parseBody(String rawSMLBody, int off, EventReceiver receiver) {
if (rawSMLBody == null) return;
int begin = off;
int len = rawSMLBody.length();
if (len <= off) return;
int openTagBegin = -1;
int openTagEnd = -1;
int closeTagBegin = -1;
int closeTagEnd = -1;
while (off < len) {
char c = rawSMLBody.charAt(off);
if ( (c == NL) || (c == CR) || (c == LF) ) {
if (openTagBegin < 0) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveNewline();
off++;
begin = off;
continue;
} else {
// ignore NL inside a tag or between tag blocks
}
} else if (c == TAG_BEGIN) {
if ( (off + 1 < len) && (TAG_BEGIN == rawSMLBody.charAt(off+1))) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveLeftBracket();
off += 2;
begin = off;
continue;
} else if (openTagBegin < 0) {
// push everything seen and not accounted for into a plain area
if (closeTagEnd < 0) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
} else {
if (closeTagEnd + 1 < off)
receiver.receivePlain(rawSMLBody.substring(closeTagEnd+1, off));
}
openTagBegin = off;
closeTagBegin = -1;
begin = off + 1;
} else {
// ok, we are at the end of the tag, process it
closeTagBegin = off;
while ( (c != TAG_END) && (off + 1 < len) ) { // don't run past the end of the text on an unterminated close tag
off++;
c = rawSMLBody.charAt(off);
}
parseTag(rawSMLBody, openTagBegin, openTagEnd, closeTagBegin, off, receiver);
begin = off + 1;
openTagBegin = -1;
openTagEnd = -1;
closeTagBegin = -1;
closeTagEnd = -1;
}
} else if (c == TAG_END) {
if ( (openTagBegin > 0) && (closeTagBegin < 0) ) {
openTagEnd = off;
} else if ( (off + 1 < len) && (TAG_END == rawSMLBody.charAt(off+1))) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveRightBracket();
off += 2;
begin = off;
continue;
}
} else if (c == LT) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveLT();
off++;
begin = off;
continue;
} else if (c == GT) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveGT();
off++;
begin = off;
continue;
}
off++;
}
if ( (off >= len) && (openTagBegin < 0) ) {
if (closeTagEnd < 0) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
} else {
if (closeTagEnd + 1 < off)
receiver.receivePlain(rawSMLBody.substring(closeTagEnd+1, off));
}
}
}
private void parseTag(String source, int openTagBegin, int openTagEnd, int closeTagBegin, int closeTagEnd, EventReceiver receiver) {
String tagName = getTagName(source, openTagBegin+1);
Map attributes = getAttributes(source, openTagBegin+1+tagName.length(), openTagEnd);
String body = null;
if (openTagEnd + 1 >= closeTagBegin)
body = "";
else
body = source.substring(openTagEnd+1, closeTagBegin);
//System.out.println("Receiving tag [" + tagName + "] w/ open [" + source.substring(openTagBegin+1, openTagEnd)
// + "], close [" + source.substring(closeTagBegin+1, closeTagEnd) + "] body ["
// + body + "] attributes: " + attributes);
parseTag(tagName, attributes, body, receiver);
}
private static final String T_BOLD = "b";
private static final String T_ITALIC = "i";
private static final String T_UNDERLINE = "u";
private static final String T_CUT = "cut";
private static final String T_IMAGE = "img";
private static final String T_QUOTE = "quote";
private static final String T_CODE = "code";
private static final String T_BLOG = "blog";
private static final String T_LINK = "link";
private static final String T_ADDRESS = "address";
private static final String T_H1 = "h1";
private static final String T_H2 = "h2";
private static final String T_H3 = "h3";
private static final String T_H4 = "h4";
private static final String T_H5 = "h5";
private static final String T_HR = "hr";
private static final String T_PRE = "pre";
private static final String T_ATTACHMENT = "attachment";
private static final String T_ARCHIVE = "archive";
private static final String P_ATTACHMENT = "attachment";
private static final String P_WHO_QUOTED = "author";
private static final String P_QUOTE_LOCATION = "location";
private static final String P_CODE_LOCATION = "location";
private static final String P_BLOG_NAME = "name";
private static final String P_BLOG_HASH = "bloghash";
private static final String P_BLOG_TAG = "blogtag";
private static final String P_BLOG_ENTRY = "blogentry";
private static final String P_LINK_LOCATION = "location";
private static final String P_LINK_SCHEMA = "schema";
private static final String P_ADDRESS_NAME = "name";
private static final String P_ADDRESS_LOCATION = "location";
private static final String P_ADDRESS_SCHEMA = "schema";
private static final String P_ATTACHMENT_ID = "id";
private static final String P_ARCHIVE_NAME = "name";
private static final String P_ARCHIVE_DESCRIPTION = "description";
private static final String P_ARCHIVE_LOCATION_SCHEMA = "schema";
private static final String P_ARCHIVE_LOCATION = "location";
private static final String P_ARCHIVE_POSTING_KEY = "postingkey";
private void parseTag(String tagName, Map attr, String body, EventReceiver receiver) {
tagName = tagName.toLowerCase();
if (T_BOLD.equals(tagName)) {
receiver.receiveBold(body);
} else if (T_ITALIC.equals(tagName)) {
receiver.receiveItalic(body);
} else if (T_UNDERLINE.equals(tagName)) {
receiver.receiveUnderline(body);
} else if (T_CUT.equals(tagName)) {
receiver.receiveCut(body);
} else if (T_IMAGE.equals(tagName)) {
receiver.receiveImage(body, getInt(P_ATTACHMENT, attr));
} else if (T_QUOTE.equals(tagName)) {
receiver.receiveQuote(body, getString(P_WHO_QUOTED, attr), getSchema(P_QUOTE_LOCATION, attr), getLocation(P_QUOTE_LOCATION, attr));
} else if (T_CODE.equals(tagName)) {
receiver.receiveCode(body, getSchema(P_CODE_LOCATION, attr), getLocation(P_CODE_LOCATION, attr));
} else if (T_BLOG.equals(tagName)) {
List locations = new ArrayList();
int i = 0;
while (true) {
String s = getString("archive" + i, attr);
if (s != null)
locations.add(new SafeURL(s));
else
break;
i++;
}
receiver.receiveBlog(getString(P_BLOG_NAME, attr), getString(P_BLOG_HASH, attr), getString(P_BLOG_TAG, attr),
getLong(P_BLOG_ENTRY, attr), locations, body);
} else if (T_ARCHIVE.equals(tagName)) {
receiver.receiveArchive(getString(P_ARCHIVE_NAME, attr), getString(P_ARCHIVE_DESCRIPTION, attr),
getString(P_ARCHIVE_LOCATION_SCHEMA, attr), getString(P_ARCHIVE_LOCATION, attr),
getString(P_ARCHIVE_POSTING_KEY, attr), body);
} else if (T_LINK.equals(tagName)) {
receiver.receiveLink(getString(P_LINK_SCHEMA, attr), getString(P_LINK_LOCATION, attr), body);
} else if (T_ADDRESS.equals(tagName)) {
receiver.receiveAddress(getString(P_ADDRESS_NAME, attr), getString(P_ADDRESS_SCHEMA, attr), getString(P_ADDRESS_LOCATION, attr), body);
} else if (T_H1.equals(tagName)) {
receiver.receiveH1(body);
} else if (T_H2.equals(tagName)) {
receiver.receiveH2(body);
} else if (T_H3.equals(tagName)) {
receiver.receiveH3(body);
} else if (T_H4.equals(tagName)) {
receiver.receiveH4(body);
} else if (T_H5.equals(tagName)) {
receiver.receiveH5(body);
} else if (T_HR.equals(tagName)) {
receiver.receiveHR();
} else if (T_PRE.equals(tagName)) {
receiver.receivePre(body);
} else if (T_ATTACHMENT.equals(tagName)) {
receiver.receiveAttachment((int)getLong(P_ATTACHMENT_ID, attr), body);
} else {
System.out.println("need to learn how to parse the tag [" + tagName + "]");
}
}
private String getString(String param, Map attributes) { return (String)attributes.get(param); }
private String getSchema(String param, Map attributes) {
String url = getString(param, attributes);
if (url != null) {
SafeURL u = new SafeURL(url);
return u.getSchema();
} else {
return null;
}
}
private String getLocation(String param, Map attributes) {
String url = getString(param, attributes);
if (url != null) {
SafeURL u = new SafeURL(url);
return u.getLocation();
} else {
return null;
}
}
private int getInt(String attributeName, Map attributes) {
String val = (String)attributes.get(attributeName.toLowerCase());
if (val != null) {
try {
return Integer.parseInt(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return -1;
}
} else {
return -1;
}
}
private long getLong(String attributeName, Map attributes) {
String val = (String)attributes.get(attributeName.toLowerCase());
if (val != null) {
try {
return Long.parseLong(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return -1;
}
} else {
return -1;
}
}
private String getTagName(String source, int nameStart) {
int off = nameStart;
while (true) {
char c = source.charAt(off);
if ( (c == TAG_END) || (WHITESPACE.indexOf(c) >= 0) )
return source.substring(nameStart, off);
off++;
}
}
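// getAttributes() below: given an open tag such as [a b='c' d="e"], it returns the map {b=c, d=e}.
// Attribute values must be single- or double-quoted, and both names and values are trimmed.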
private Map getAttributes(String source, int attributesStart, int openTagEnd) {
Map rv = new HashMap();
int off = attributesStart;
int nameStart = -1;
int nameEnd = -1;
int valStart = -1;
int valEnd = -1;
while (true) {
char c = source.charAt(off);
if ( (c == TAG_END) || (off >= openTagEnd) )
break;
if (WHITESPACE.indexOf(c) < 0) {
if (nameStart < 0) {
nameStart = off;
} else if (c == EQ) {
if (nameEnd < 0)
nameEnd = off;
} else if ( (c == QUOTE) || (c == DQUOTE) ) {
if (valStart < 0) {
valStart = off;
} else {
valEnd = off;
String name = source.substring(nameStart, nameEnd);
String val = source.substring(valStart+1, valEnd);
rv.put(name.trim(), val.trim());
nameStart = -1;
nameEnd = -1;
valStart = -1;
valEnd = -1;
}
}
}
off++;
}
return rv;
}
public interface EventReceiver {
public void receiveHeader(String header, String value);
public void receiveLink(String schema, String location, String text);
/** @param blogArchiveLocations list of SafeURL */
public void receiveBlog(String name, String blogKeyHash, String blogPath, long blogEntryId,
List blogArchiveLocations, String anchorText);
public void receiveArchive(String name, String description, String locationSchema, String location,
String postingKey, String anchorText);
public void receiveImage(String alternateText, int attachmentId);
public void receiveAddress(String name, String schema, String location, String anchorText);
public void receiveAttachment(int id, String anchorText);
public void receiveBold(String text);
public void receiveItalic(String text);
public void receiveUnderline(String text);
public void receiveH1(String text);
public void receiveH2(String text);
public void receiveH3(String text);
public void receiveH4(String text);
public void receiveH5(String text);
public void receivePre(String text);
public void receiveHR();
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation);
public void receiveCode(String text, String codeLocationSchema, String codeLocation);
public void receiveCut(String summaryText);
public void receivePlain(String text);
public void receiveNewline();
public void receiveLT();
public void receiveGT();
public void receiveLeftBracket();
public void receiveRightBracket();
public void receiveBegin();
public void receiveEnd();
public void receiveHeaderEnd();
}
public static void main(String args[]) {
test(null);
test("");
test("A: B");
test("A: B\n");
test("A: B\nC: D");
test("A: B\nC: D\n");
test("A: B\nC: D\n\n");
test("A: B\nC: D\n\nblah");
test("A: B\nC: D\n\nblah[[");
test("A: B\nC: D\n\nblah]]");
test("A: B\nC: D\n\nblah]]blah");
test("A: B\nC: D\n\nfoo[a]b[/a]bar");
test("A: B\nC: D\n\nfoo[a]b[/a]bar[b][/b]");
test("A: B\nC: D\n\nfoo[a]b[/a]bar[b][/b]baz");
test("A: B\nC: D\n\n<a href=\"http://odci.gov\">hi</a>");
test("A: B\n\n[a b='c']d[/a]");
test("A: B\n\n[a b='c' d='e' f='g']h[/a]");
test("A: B\n\n[a b='c' d='e' f='g']h[/a][a b='c' d='e' f='g']h[/a][a b='c' d='e' f='g']h[/a]");
test("A: B\n\n[a b='c' ]d[/a]");
test("A: B\n\n[a b=\"c\" ]d[/a]");
test("A: B\n\n[b]This[/b] is [i]special[/i][cut]why?[/cut][u]because I say so[/u].\neven if you dont care");
}
private static void test(String rawSML) {
SMLParser parser = new SMLParser();
parser.parse(rawSML, new EventReceiverImpl());
}
}
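To make the event flow above concrete, this is the callback sequence the parser would fire for a small post (the SML text is invented; the callbacks and attribute names are the ones defined above):

// input:  "Subject: test\n\nsee [link schema=\"eep\" location=\"http://forum.i2p\"]my forum[/link]\n"
// firing order:
//   receiveBegin()
//   receiveHeader("Subject", "test")
//   receiveHeaderEnd()
//   receivePlain("see ")
//   receiveLink("eep", "http://forum.i2p", "my forum")
//   receiveNewline()
//   receiveEnd()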


@@ -0,0 +1,183 @@
package net.i2p.syndie.web;
import java.io.*;
import java.util.*;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.ServletException;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
/**
*
*/
public class ArchiveServlet extends HttpServlet {
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
String path = req.getPathInfo();
if ( (path == null) || (path.trim().length() <= 1) ) {
renderRootIndex(resp);
return;
} else if (path.endsWith(Archive.INDEX_FILE)) {
renderSummary(resp);
} else if (path.endsWith("export.zip")) {
ExportServlet.export(req, resp);
} else {
String blog = getBlog(path);
if (path.endsWith(Archive.METADATA_FILE)) {
renderMetadata(blog, resp);
} else if (path.endsWith(".snd")) {
renderEntry(blog, getEntry(path), resp);
} else {
renderBlogIndex(blog, resp);
}
}
}
private String getBlog(String path) {
//System.err.println("Blog: [" + path + "]");
int start = 0;
int end = -1;
int len = path.length();
for (int i = 0; i < len; i++) {
if (path.charAt(i) != '/') {
start = i;
break;
}
}
for (int j = start + 1; j < len; j++) {
if (path.charAt(j) == '/') {
end = j;
break;
}
}
if (end < 0) end = len;
String rv = path.substring(start, end);
//System.err.println("Blog: [" + path + "] rv: [" + rv + "]");
return rv;
}
private long getEntry(String path) {
int start = path.lastIndexOf('/');
if (start < 0) return -1;
if (!(path.endsWith(".snd"))) return -1;
String rv = path.substring(start+1, path.length()-".snd".length());
//System.err.println("Entry: [" + path + "] rv: [" + rv + "]");
try {
return Long.parseLong(rv);
} catch (NumberFormatException nfe) {
return -1;
}
}
private void renderRootIndex(HttpServletResponse resp) throws ServletException, IOException {
resp.setContentType("text/html;charset=utf-8");
//resp.setCharacterEncoding("UTF-8");
OutputStream out = resp.getOutputStream();
out.write(DataHelper.getUTF8("<a href=\"archive.txt\">archive.txt</a><br />\n"));
ArchiveIndex index = BlogManager.instance().getArchive().getIndex();
Set blogs = index.getUniqueBlogs();
for (Iterator iter = blogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
String s = blog.toBase64();
out.write(DataHelper.getUTF8("<a href=\"" + s + "/\">" + s + "</a><br />\n"));
}
out.close();
}
private void renderSummary(HttpServletResponse resp) throws ServletException, IOException {
resp.setContentType("text/plain;charset=utf-8");
//resp.setCharacterEncoding("UTF-8");
OutputStream out = resp.getOutputStream();
ArchiveIndex index = BlogManager.instance().getArchive().getIndex();
out.write(DataHelper.getUTF8(index.toString()));
out.close();
}
private void renderMetadata(String blog, HttpServletResponse resp) throws ServletException, IOException {
byte b[] = Base64.decode(blog);
if ( (b == null) || (b.length != Hash.HASH_LENGTH) ) {
resp.sendError(404, "Invalid blog requested");
return;
}
Hash h = new Hash(b);
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(h);
if (info == null) {
resp.sendError(404, "Blog does not exist");
return;
}
resp.setContentType("application/x-syndie-meta");
OutputStream out = resp.getOutputStream();
info.write(out);
out.close();
}
private void renderBlogIndex(String blog, HttpServletResponse resp) throws ServletException, IOException {
byte b[] = Base64.decode(blog);
if ( (b == null) || (b.length != Hash.HASH_LENGTH) ) {
resp.sendError(404, "Invalid blog requested");
return;
}
Hash h = new Hash(b);
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(h);
if (info == null) {
resp.sendError(404, "Blog does not exist");
return;
}
resp.setContentType("text/html;charset=utf-8");
//resp.setCharacterEncoding("UTF-8");
OutputStream out = resp.getOutputStream();
out.write(DataHelper.getUTF8("<a href=\"..\">..</a><br />\n"));
out.write(DataHelper.getUTF8("<a href=\"" + Archive.METADATA_FILE + "\">" + Archive.METADATA_FILE + "</a><br />\n"));
List entries = new ArrayList(64);
BlogManager.instance().getArchive().getIndex().selectMatchesOrderByEntryId(entries, h, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI entry = (BlogURI)entries.get(i);
out.write(DataHelper.getUTF8("<a href=\"" + entry.getEntryId() + ".snd\">" + entry.getEntryId() + ".snd</a><br />\n"));
}
out.close();
}
private void renderEntry(String blog, long entryId, HttpServletResponse resp) throws ServletException, IOException {
byte b[] = Base64.decode(blog);
if ( (b == null) || (b.length != Hash.HASH_LENGTH) ) {
resp.sendError(404, "Invalid blog requested");
return;
}
Hash h = new Hash(b);
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(h);
if (info == null) {
resp.sendError(404, "Blog does not exist");
return;
}
File root = BlogManager.instance().getArchive().getArchiveDir();
File blogDir = new File(root, blog);
if (!blogDir.exists()) {
resp.sendError(404, "Blog does not exist");
return;
}
File entry = new File(blogDir, entryId + ".snd");
if (!entry.exists()) {
resp.sendError(404, "Entry does not exist");
return;
}
resp.setContentType("application/x-syndie-post");
dump(entry, resp);
}
private void dump(File source, HttpServletResponse resp) throws ServletException, IOException {
FileInputStream in = new FileInputStream(source);
OutputStream out = resp.getOutputStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
out.write(buf, 0, read);
out.close();
in.close();
}
}
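For reference, a sketch of the URL space this servlet serves. The /archive/ mount point and the literal file names behind Archive.INDEX_FILE and Archive.METADATA_FILE are assumptions, not taken from this change; the routing itself follows doGet() above.

// GET /archive/                                          -> HTML index linking archive.txt and each known blog hash
// GET /archive/<Archive.INDEX_FILE>                      -> plain-text archive summary (ArchiveIndex.toString())
// GET /archive/export.zip                                -> delegated to ExportServlet.export()
// GET /archive/<blogHashBase64>/                         -> HTML index of that blog's entries
// GET /archive/<blogHashBase64>/<Archive.METADATA_FILE>  -> application/x-syndie-meta (BlogInfo)
// GET /archive/<blogHashBase64>/<entryId>.snd            -> application/x-syndie-post (raw entry file)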


@@ -0,0 +1,625 @@
package net.i2p.syndie.web;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
*
*/
public class ArchiveViewerBean {
public static String getBlogName(String keyHash) {
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(new Hash(Base64.decode(keyHash)));
if (info == null)
return HTMLRenderer.sanitizeString(keyHash);
else
return HTMLRenderer.sanitizeString(info.getProperty("Name"));
}
public static String getEntryTitle(String keyHash, long entryId) {
String name = getBlogName(keyHash);
return getEntryTitleDate(name, entryId);
}
private static final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
public static final String getEntryTitleDate(String blogName, long when) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(when));
long dayBegin = _dateFormat.parse(str).getTime();
return blogName + ":<br /> <i>" + str + "-" + (when - dayBegin) + "</i>";
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return "unknown";
}
}
}
/** base64 encoded hash of the blog's public key, or null for no filtering by blog */
public static final String PARAM_BLOG = "blog";
/** base64 encoded tag to filter by, or blank for no filtering by tags */
public static final String PARAM_TAG = "tag";
/** entry id within the blog if we only want to see that one */
public static final String PARAM_ENTRY = "entry";
/** base64 encoded group within the user's filters */
public static final String PARAM_GROUP = "group";
/** how many entries per page to show at once */
public static final String PARAM_NUM_PER_PAGE = "pageSize";
/** which page of entries to render */
public static final String PARAM_PAGE_NUMBER = "pageNum";
/** should we expand each entry to show the full contents */
public static final String PARAM_EXPAND_ENTRIES = "expand";
/** should entries be rendered with the images shown inline */
public static final String PARAM_SHOW_IMAGES = "images";
/** should we regenerate an index to the archive before rendering */
public static final String PARAM_REGENERATE_INDEX = "regenerateIndex";
/** which attachment should we serve up raw */
public static final String PARAM_ATTACHMENT = "attachment";
/** we are replying to a particular blog/tag/entry/whatever (value == base64 encoded selector) */
public static final String PARAM_IN_REPLY_TO = "inReplyTo";
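// Illustrative only (concrete values invented): a fully-specified request built by HTMLRenderer.getPageURL()
// from the parameters above looks roughly like
//   index.jsp?blog=<base64 keyhash>&tag=<base64 tag>&entry=<entryId>&pageNum=0&pageSize=5&expand=true&images=false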
/**
* Drop down multichooser:
* blog://base64(key)
* tag://base64(tag)
* blogtag://base64(key)/base64(tag)
* entry://base64(key)/entryId
* group://base64(groupName)
* ALL
*/
public static final String PARAM_SELECTOR = "selector";
public static final String SEL_ALL = "ALL";
public static final String SEL_BLOG = "blog://";
public static final String SEL_TAG = "tag://";
public static final String SEL_BLOGTAG = "blogtag://";
public static final String SEL_ENTRY = "entry://";
public static final String SEL_GROUP = "group://";
/** submit field for the selector form */
public static final String PARAM_SELECTOR_ACTION = "action";
public static final String SEL_ACTION_SET_AS_DEFAULT = "Set as default";
public static void renderBlogSelector(User user, Map parameters, Writer out) throws IOException {
String sel = getString(parameters, PARAM_SELECTOR);
String action = getString(parameters, PARAM_SELECTOR_ACTION);
if ( (sel != null) && (action != null) && (SEL_ACTION_SET_AS_DEFAULT.equals(action)) ) {
user.setDefaultSelector(HTMLRenderer.sanitizeString(sel, false));
BlogManager.instance().saveUser(user);
}
out.write("<select name=\"");
out.write(PARAM_SELECTOR);
out.write("\">");
out.write("<option value=\"");
out.write(getDefaultSelector(user, parameters));
out.write("\">Default blog filter</option>\n");
out.write("\">");
out.write("<option value=\"");
out.write(SEL_ALL);
out.write("\">All posts from all blogs</option>\n");
Map groups = null;
if (user != null)
groups = user.getBlogGroups();
if (groups != null) {
for (Iterator iter = groups.keySet().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
out.write("<option value=\"group://" + Base64.encode(DataHelper.getUTF8(name)) + "\">" +
"Group: " + HTMLRenderer.sanitizeString(name) + "</option>\n");
}
}
Archive archive = BlogManager.instance().getArchive();
ArchiveIndex index = archive.getIndex();
for (int i = 0; i < index.getNewestBlogCount(); i++) {
Hash cur = index.getNewestBlog(i);
String blog = Base64.encode(cur.getData());
out.write("<option value=\"blog://" + blog + "\">");
out.write("New blog: ");
BlogInfo info = archive.getBlogInfo(cur);
String name = info.getProperty(BlogInfo.NAME);
if (name != null)
name = HTMLRenderer.sanitizeString(name);
else
name = Base64.encode(cur.getData());
out.write(name);
out.write("</option>\n");
}
List allTags = new ArrayList();
// perhaps sort this by name (even though it isn't unique...)
Set blogs = index.getUniqueBlogs();
for (Iterator iter = blogs.iterator(); iter.hasNext(); ) {
Hash cur = (Hash)iter.next();
String blog = Base64.encode(cur.getData());
out.write("<option value=\"blog://");
out.write(blog);
out.write("\">");
BlogInfo info = archive.getBlogInfo(cur);
String name = info.getProperty(BlogInfo.NAME);
if (name != null)
name = HTMLRenderer.sanitizeString(name);
else
name = Base64.encode(cur.getData());
out.write(name);
out.write("- all posts</option>\n");
List tags = index.getBlogTags(cur);
for (int j = 0; j < tags.size(); j++) {
String tag = (String)tags.get(j);
if (false) {
StringBuffer b = new StringBuffer(tag.length()*2);
for (int k = 0; k < tag.length(); k++) {
b.append((int)tag.charAt(k));
b.append(' ');
}
System.out.println("tag in select: " + tag + ": " + b.toString());
}
if (!allTags.contains(tag))
allTags.add(tag);
out.write("<option value=\"blogtag://");
out.write(blog);
out.write("/");
byte utf8tag[] = DataHelper.getUTF8(tag);
String encoded = Base64.encode(utf8tag);
if (false) {
byte utf8dec[] = Base64.decode(encoded);
String travel = DataHelper.getUTF8(utf8dec);
StringBuffer b = new StringBuffer();
for (int k = 0; k < travel.length(); k++) {
b.append((int)travel.charAt(k));
b.append(' ');
}
b.append(" encoded into: ");
for (int k = 0; k < encoded.length(); k++) {
b.append((int)encoded.charAt(k));
b.append(' ');
}
System.out.println("UTF8(unbase64(base64(UTF8(tag)))) == tag: " + b.toString());
}
out.write(encoded);
out.write("\">");
out.write(name);
out.write("- posts with the tag &quot;");
out.write(tag);
out.write("&quot;</option>\n");
}
}
for (int i = 0; i < allTags.size(); i++) {
String tag = (String)allTags.get(i);
out.write("<option value=\"tag://");
out.write(Base64.encode(DataHelper.getUTF8(tag)));
out.write("\">Posts in any blog with the tag &quot;");
out.write(tag);
out.write("&quot;</option>\n");
}
out.write("</select>");
int numPerPage = getInt(parameters, PARAM_NUM_PER_PAGE, 5);
int pageNum = getInt(parameters, PARAM_PAGE_NUMBER, 0);
boolean expandEntries = getBool(parameters, PARAM_EXPAND_ENTRIES, (user != null ? user.getShowExpanded() : false));
boolean showImages = getBool(parameters, PARAM_SHOW_IMAGES, (user != null ? user.getShowImages() : false));
out.write("<input type=\"hidden\" name=\"" + PARAM_NUM_PER_PAGE+ "\" value=\"" + numPerPage+ "\" />");
out.write("<input type=\"hidden\" name=\"" + PARAM_PAGE_NUMBER+ "\" value=\"" + pageNum+ "\" />");
out.write("<input type=\"hidden\" name=\"" + PARAM_EXPAND_ENTRIES+ "\" value=\"" + expandEntries+ "\" />");
out.write("<input type=\"hidden\" name=\"" + PARAM_SHOW_IMAGES + "\" value=\"" + showImages + "\" />");
}
private static String getDefaultSelector(User user, Map parameters) {
if ( (user == null) || (user.getDefaultSelector() == null) )
return BlogManager.instance().getArchive().getDefaultSelector();
else
return user.getDefaultSelector();
}
public static void renderBlogs(User user, Map parameters, Writer out, String afterPagination) throws IOException {
String blogStr = getString(parameters, PARAM_BLOG);
Hash blog = null;
if (blogStr != null) blog = new Hash(Base64.decode(blogStr));
String tag = getString(parameters, PARAM_TAG);
if (tag != null) tag = DataHelper.getUTF8(Base64.decode(tag));
long entryId = -1;
if (blogStr != null) {
String entryIdStr = getString(parameters, PARAM_ENTRY);
try {
entryId = Long.parseLong(entryIdStr);
} catch (NumberFormatException nfe) {}
}
String group = getString(parameters, PARAM_GROUP);
if (group != null) group = DataHelper.getUTF8(Base64.decode(group));
String sel = getString(parameters, PARAM_SELECTOR);
if ( (sel == null) && (blog == null) && (group == null) && (tag == null) )
sel = getDefaultSelector(user, parameters);
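// a combined selector string, when present, overrides the individual blog/tag/entry/group parameters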
if (sel != null) {
Selector s = new Selector(sel);
blog = s.blog;
tag = s.tag;
entryId = s.entry;
group = s.group;
}
int numPerPage = getInt(parameters, PARAM_NUM_PER_PAGE, 5);
int pageNum = getInt(parameters, PARAM_PAGE_NUMBER, 0);
boolean expandEntries = getBool(parameters, PARAM_EXPAND_ENTRIES, (user != null ? user.getShowExpanded() : false));
boolean showImages = getBool(parameters, PARAM_SHOW_IMAGES, (user != null ? user.getShowImages() : false));
boolean regenerateIndex = getBool(parameters, PARAM_REGENERATE_INDEX, false);
try {
renderBlogs(user, blog, tag, entryId, group, numPerPage, pageNum, expandEntries, showImages, regenerateIndex, sel, out, afterPagination);
} catch (IOException ioe) {
ioe.printStackTrace();
throw ioe;
} catch (RuntimeException re) {
re.printStackTrace();
throw re;
}
}
public static class Selector {
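// parsed form of a selector string (SEL_BLOG, SEL_BLOGTAG, SEL_TAG, SEL_ENTRY, or SEL_GROUP prefixed);
// blog hashes, tags, and groups are base64 encoded within it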
public Hash blog;
public String tag;
public long entry;
public String group;
public Selector(String selector) {
entry = -1;
blog = null;
tag = null;
if (selector != null) {
if (selector.startsWith(SEL_BLOG)) {
String blogStr = selector.substring(SEL_BLOG.length());
System.out.println("Selector [" + selector + "] blogString: [" + blogStr + "]");
blog = new Hash(Base64.decode(blogStr));
} else if (selector.startsWith(SEL_BLOGTAG)) {
int tagStart = selector.lastIndexOf('/');
String blogStr = selector.substring(SEL_BLOGTAG.length(), tagStart);
blog = new Hash(Base64.decode(blogStr));
tag = selector.substring(tagStart+1);
byte rawDecode[] = null;
if (tag != null) {
rawDecode = Base64.decode(tag);
tag = DataHelper.getUTF8(rawDecode);
}
System.out.println("Selector [" + selector + "] blogString: [" + blogStr + "] tag: [" + tag + "]");
} else if (selector.startsWith(SEL_TAG)) {
tag = selector.substring(SEL_TAG.length());
byte rawDecode[] = null;
if (tag != null) {
rawDecode = Base64.decode(tag);
tag = DataHelper.getUTF8(rawDecode);
}
System.out.println("Selector [" + selector + "] tag: [" + tag + "]");
} else if (selector.startsWith(SEL_ENTRY)) {
int entryStart = selector.lastIndexOf('/');
String blogStr = selector.substring(SEL_ENTRY.length(), entryStart);
String entryStr = selector.substring(entryStart+1);
try {
entry = Long.parseLong(entryStr);
blog = new Hash(Base64.decode(blogStr));
System.out.println("Selector [" + selector + "] blogString: [" + blogStr + "] entry: [" + entry + "]");
} catch (NumberFormatException nfe) {}
} else if (selector.startsWith(SEL_GROUP)) {
group = DataHelper.getUTF8(Base64.decode(selector.substring(SEL_GROUP.length())));
System.out.println("Selector [" + selector + "] group: [" + group + "]");
}
}
}
}
private static void renderBlogs(User user, Hash blog, String tag, long entryId, String group, int numPerPage, int pageNum,
boolean expandEntries, boolean showImages, boolean regenerateIndex, String selector, Writer out, String afterPagination) throws IOException {
Archive archive = BlogManager.instance().getArchive();
if (regenerateIndex)
archive.regenerateIndex();
ArchiveIndex index = archive.getIndex();
List entries = pickEntryURIs(user, index, blog, tag, entryId, group);
System.out.println("Searching for " + blog + "/" + tag + "/" + entryId + "/" + pageNum + "/" + numPerPage + "/" + group);
System.out.println("Entry URIs: " + entries);
HTMLRenderer renderer = new HTMLRenderer();
int start = pageNum * numPerPage;
int end = start + numPerPage;
int pages = 1;
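// clamp the [start, end) window to the available entries and, when actually paginating, emit "<< Page x of y >>" links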
if (entries.size() <= 1) {
// just one, so no pagination, etc
start = 0;
end = 1;
} else {
if (end >= entries.size())
end = entries.size();
if ( (pageNum < 0) || (numPerPage <= 0) ) {
start = 0;
end = entries.size();
} else {
pages = entries.size() / numPerPage;
if (numPerPage * pages < entries.size())
pages++;
out.write("<i>");
if (pageNum > 0) {
String prevURL = null;
if ( (selector == null) || (selector.trim().length() <= 0) )
prevURL = HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum-1, expandEntries, showImages);
else
prevURL = HTMLRenderer.getPageURL(user, selector, numPerPage, pageNum-1);
System.out.println("prevURL: " + prevURL);
out.write(" <a href=\"" + prevURL + "\">&lt;&lt;</a>");
} else {
out.write(" &lt;&lt; ");
}
out.write("Page " + (pageNum+1) + " of " + pages);
if (pageNum + 1 < pages) {
String nextURL = null;
if ( (selector == null) || (selector.trim().length() <= 0) )
nextURL = HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum+1, expandEntries, showImages);
else
nextURL = HTMLRenderer.getPageURL(user, selector, numPerPage, pageNum+1);
System.out.println("nextURL: " + nextURL);
out.write(" <a href=\"" + nextURL + "\">&gt;&gt;</a>");
} else {
out.write(" &gt;&gt;");
}
out.write("</i>");
}
}
/*
out.write(" <i>");
if (showImages)
out.write("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, expandEntries, false) +
"\">Hide images</a>");
else
out.write("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, expandEntries, true) +
"\">Show images</a>");
if (expandEntries)
out.write(" <a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, false, showImages) +
"\">Hide details</a>");
else
out.write(" <a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, true, showImages) +
"\">Expand details</a>");
out.write("</i>");
*/
if (afterPagination != null)
out.write(afterPagination);
if (entries.size() <= 0) end = -1;
System.out.println("Entries.size: " + entries.size() + " start=" + start + " end=" + end);
for (int i = start; i < end; i++) {
BlogURI uri = (BlogURI)entries.get(i);
EntryContainer c = archive.getEntry(uri);
try {
if (c == null)
renderer.renderUnknownEntry(user, archive, uri, out);
else
renderer.render(user, archive, c, out, !expandEntries, showImages);
} catch (RuntimeException e) {
e.printStackTrace();
throw e;
}
}
}
private static List pickEntryURIs(User user, ArchiveIndex index, Hash blog, String tag, long entryId, String group) {
List rv = new ArrayList(16);
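// precedence: a single requested entry, then the user's saved group of selectors, then a plain blog/tag match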
if ( (blog != null) && (entryId >= 0) ) {
rv.add(new BlogURI(blog, entryId));
return rv;
}
if ( (group != null) && (user != null) ) {
List selectors = (List)user.getBlogGroups().get(group);
if (selectors != null) {
System.out.println("Selectors for group " + group + ": " + selectors);
for (int i = 0; i < selectors.size(); i++) {
String sel = (String)selectors.get(i);
Selector s = new Selector(sel);
if ( (s.entry >= 0) && (s.blog != null) && (s.group == null) && (s.tag == null) )
rv.add(new BlogURI(s.blog, s.entry));
else
index.selectMatchesOrderByEntryId(rv, s.blog, s.tag);
}
return rv;
}
}
index.selectMatchesOrderByEntryId(rv, blog, tag);
return rv;
}
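// request parameter values may arrive as a String[] or a Collection; return the first value, or null if absent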
public static final String getString(Map parameters, String param) {
if ( (parameters == null) || (parameters.get(param) == null) )
return null;
Object vals = parameters.get(param);
if (vals.getClass().isArray()) {
String v[] = (String[])vals;
if (v.length > 0)
return v[0];
else
return null;
} else if (vals instanceof Collection) {
Collection c = (Collection)vals;
if (c.size() > 0)
return (String)c.iterator().next();
else
return null;
} else {
return null;
}
}
public static final String[] getStrings(Map parameters, String param) {
if ( (parameters == null) || (parameters.get(param) == null) )
return null;
Object vals = parameters.get(param);
if (vals.getClass().isArray()) {
return (String[])vals;
} else if (vals instanceof Collection) {
Collection c = (Collection)vals;
if (c.size() <= 0) return null;
String rv[] = new String[c.size()];
int i = 0;
for (Iterator iter = c.iterator(); iter.hasNext(); i++)
rv[i] = (String)iter.next();
return rv;
} else {
return null;
}
}
private static final int getInt(Map param, String key, int defaultVal) {
String val = getString(param, key);
if (val != null) {
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) {}
}
return defaultVal;
}
private static final boolean getBool(Map param, String key, boolean defaultVal) {
String val = getString(param, key);
if (val != null) {
return ("true".equals(val) || "yes".equals(val));
}
return defaultVal;
}
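// stream the raw bytes of the requested attachment, or a short error message if it cannot be found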
public static void renderAttachment(Map parameters, OutputStream out) throws IOException {
Attachment a = getAttachment(parameters);
if (a == null) {
renderInvalidAttachment(parameters, out);
} else {
InputStream data = a.getDataStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = data.read(buf)) != -1)
out.write(buf, 0, read);
data.close();
}
}
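// only image/* and text/plain content types are passed through; everything else is served as
// application/octet-stream, presumably so the browser never renders untrusted markup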
public static final String getAttachmentContentType(Map parameters) {
Attachment a = getAttachment(parameters);
if (a == null)
return "text/html";
String mime = a.getMimeType();
if ( (mime != null) && ((mime.startsWith("image/") || mime.startsWith("text/plain"))) )
return mime;
return "application/octet-stream";
}
public static final int getAttachmentContentLength(Map parameters) {
Attachment a = getAttachment(parameters);
if (a != null)
return a.getDataLength();
else
return -1;
}
private static final Attachment getAttachment(Map parameters) {
String blogStr = getString(parameters, PARAM_BLOG);
Hash blog = null;
if (blogStr != null) blog = new Hash(Base64.decode(blogStr));
long entryId = -1;
if (blogStr != null) {
String entryIdStr = getString(parameters, PARAM_ENTRY);
try {
entryId = Long.parseLong(entryIdStr);
} catch (NumberFormatException nfe) {}
}
int attachment = getInt(parameters, PARAM_ATTACHMENT, -1);
Archive archive = BlogManager.instance().getArchive();
EntryContainer entry = archive.getEntry(new BlogURI(blog, entryId));
if ( (entry != null) && (attachment >= 0) && (attachment < entry.getAttachments().length) ) {
return entry.getAttachments()[attachment];
}
return null;
}
private static void renderInvalidAttachment(Map parameters, OutputStream out) throws IOException {
out.write(DataHelper.getUTF8("<b>No such entry, or no such attachment</b>"));
}
public static void renderMetadata(Map parameters, Writer out) throws IOException {
String blogStr = getString(parameters, PARAM_BLOG);
if (blogStr != null) {
Hash blog = new Hash(Base64.decode(blogStr));
Archive archive = BlogManager.instance().getArchive();
BlogInfo info = archive.getBlogInfo(blog);
if (info == null) {
out.write("Blog " + blog.toBase64() + " does not exist");
return;
}
String props[] = info.getProperties();
out.write("<table border=\"0\">");
for (int i = 0; i < props.length; i++) {
if (props[i].equals(BlogInfo.OWNER_KEY)) {
out.write("<tr><td><b>Blog:</b></td><td>");
String blogURL = HTMLRenderer.getPageURL(blog, null, -1, -1, -1, false, false);
out.write("<a href=\"" + blogURL + "\">" + Base64.encode(blog.getData()) + "</td></tr>\n");
} else if (props[i].equals(BlogInfo.SIGNATURE)) {
continue;
} else if (props[i].equals(BlogInfo.POSTERS)) {
SigningPublicKey keys[] = info.getPosters();
if ( (keys != null) && (keys.length > 0) ) {
out.write("<tr><td><b>Allowed authors:</b></td><td>");
for (int j = 0; j < keys.length; j++) {
out.write(keys[j].calculateHash().toBase64());
if (j + 1 < keys.length)
out.write("<br />\n");
}
out.write("</td></tr>\n");
}
} else {
out.write("<tr><td>" + HTMLRenderer.sanitizeString(props[i]) + ":</td><td>" +
HTMLRenderer.sanitizeString(info.getProperty(props[i])) + "</td></tr>\n");
}
}
List tags = BlogManager.instance().getArchive().getIndex().getBlogTags(blog);
if ( (tags != null) && (tags.size() > 0) ) {
out.write("<tr><td>Known tags:</td><td>");
for (int i = 0; i < tags.size(); i++) {
String tag = (String)tags.get(i);
out.write("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, false, false) + "\">" +
HTMLRenderer.sanitizeString(tag) + "</a> ");
}
out.write("</td></tr>");
}
out.write("</table>");
} else {
out.write("Blog not specified");
}
}
}

View File

@ -0,0 +1,100 @@
package net.i2p.syndie.web;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.ServletException;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
/**
* Dump out the requested blog metadata and entries as a zip stream.  All metadata
* is written before any entries, so the stream can be processed in order safely.
*
* HTTP parameters:
* = meta (multiple values): base64 hash of the blog for which metadata is requested
* = entry (multiple values): blog URI of an entry being requested
*/
public class ExportServlet extends HttpServlet {
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
export(req, resp);
}
public static void export(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
String meta[] = req.getParameterValues("meta");
String entries[] = req.getParameterValues("entry");
resp.setContentType("application/x-syndie-zip");
resp.setStatus(200);
OutputStream out = resp.getOutputStream();
ZipOutputStream zo = new ZipOutputStream(out);
List metaFiles = getMetaFiles(meta);
ZipEntry ze = null;
byte buf[] = new byte[1024];
int read = -1;
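// metadata entries ("meta0", "meta1", ...) are written before any posts so an importer can process the stream in order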
for (int i = 0; metaFiles != null && i < metaFiles.size(); i++) {
ze = new ZipEntry("meta" + i);
ze.setTime(0);
zo.putNextEntry(ze);
FileInputStream in = new FileInputStream((File)metaFiles.get(i));
while ( (read = in.read(buf)) != -1)
zo.write(buf, 0, read);
in.close();
zo.closeEntry();
}
List entryFiles = getEntryFiles(entries);
for (int i = 0; entryFiles != null && i < entryFiles.size(); i++) {
ze = new ZipEntry("entry" + i);
ze.setTime(0);
zo.putNextEntry(ze);
FileInputStream in = new FileInputStream((File)entryFiles.get(i));
while ( (read = in.read(buf)) != -1)
zo.write(buf, 0, read);
in.close();
zo.closeEntry();
}
zo.finish();
zo.close();
}
private static List getMetaFiles(String blogHashes[]) {
if ( (blogHashes == null) || (blogHashes.length <= 0) ) return null;
File dir = BlogManager.instance().getArchive().getArchiveDir();
List rv = new ArrayList(blogHashes.length);
for (int i = 0; i < blogHashes.length; i++) {
byte hv[] = Base64.decode(blogHashes[i]);
if ( (hv == null) || (hv.length != Hash.HASH_LENGTH) )
continue;
File blogDir = new File(dir, blogHashes[i]);
File metaFile = new File(blogDir, Archive.METADATA_FILE);
if (metaFile.exists())
rv.add(metaFile);
}
return rv;
}
private static List getEntryFiles(String blogURIs[]) {
if ( (blogURIs == null) || (blogURIs.length <= 0) ) return null;
File dir = BlogManager.instance().getArchive().getArchiveDir();
List rv = new ArrayList(blogURIs.length);
for (int i = 0; i < blogURIs.length; i++) {
BlogURI uri = new BlogURI(blogURIs[i]);
if (uri.getEntryId() < 0)
continue;
File blogDir = new File(dir, uri.getKeyHash().toBase64());
File entryFile = new File(blogDir, uri.getEntryId() + ".snd");
if (entryFile.exists())
rv.add(entryFile);
}
return rv;
}
}

View File

@ -0,0 +1,136 @@
package net.i2p.syndie.web;
import java.io.*;
import java.util.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.BlogURI;
import net.i2p.syndie.sml.HTMLPreviewRenderer;
/**
 * Hold the state of a blog post being composed - subject, tags, headers, SML text,
 * and attachments cached locally so the post can be previewed before it is published.
 */
public class PostBean {
private User _user;
private String _subject;
private String _tags;
private String _headers;
private String _text;
private List _filenames;
private List _fileStreams;
private List _localFiles;
private List _fileTypes;
private boolean _previewed;
public PostBean() { reinitialize(); }
public void reinitialize() {
System.out.println("Reinitializing " + (_text != null ? "(with " + _text.length() + " bytes of sml!)" : ""));
_user = null;
_subject = null;
_tags = null;
_text = null;
_headers = null;
_filenames = new ArrayList();
_fileStreams = new ArrayList();
_fileTypes = new ArrayList();
if (_localFiles != null)
for (int i = 0; i < _localFiles.size(); i++)
((File)_localFiles.get(i)).delete();
_localFiles = new ArrayList();
_previewed = false;
}
public User getUser() { return _user; }
public String getSubject() { return (_subject != null ? _subject : ""); }
public String getTags() { return (_tags != null ? _tags : ""); }
public String getText() { return (_text != null ? _text : ""); }
public String getHeaders() { return (_headers != null ? _headers : ""); }
public void setUser(User user) { _user = user; }
public void setSubject(String subject) { _subject = subject; }
public void setTags(String tags) { _tags = tags; }
public void setText(String text) { _text = text; }
public void setHeaders(String headers) { _headers = headers; }
public String getContentType(int id) {
if ( (id >= 0) && (id < _fileTypes.size()) )
return (String)_fileTypes.get(id);
return "application/octet-stream";
}
public void writeAttachmentData(int id, OutputStream out) throws IOException {
FileInputStream in = new FileInputStream((File)_localFiles.get(id));
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
out.write(buf, 0, read);
in.close();
}
public void addAttachment(String filename, InputStream fileStream, String mimeType) {
_filenames.add(filename);
_fileStreams.add(fileStream);
_fileTypes.add(mimeType);
}
public int getAttachmentCount() { return (_filenames != null ? _filenames.size() : 0); }
public BlogURI postEntry() throws IOException {
if (!_previewed) return null;
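// re-open the locally cached attachment files and hand the post to the BlogManager to build the entry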
List localStreams = new ArrayList(_localFiles.size());
for (int i = 0; i < _localFiles.size(); i++) {
File f = (File)_localFiles.get(i);
localStreams.add(new FileInputStream(f));
}
return BlogManager.instance().createBlogEntry(_user, _subject, _tags, _headers, _text,
_filenames, localStreams, _fileTypes);
}
public void renderPreview(Writer out) throws IOException {
System.out.println("Subject: " + _subject);
System.out.println("Text: " + _text);
System.out.println("Headers: " + _headers);
// cache all the _fileStreams into temporary files, storing those files in _localFiles
// then render the page accordingly with an HTMLRenderer, altered to use a different
// 'view attachment'
cacheAttachments();
String smlContent = renderSMLContent();
HTMLPreviewRenderer r = new HTMLPreviewRenderer(_filenames, _fileTypes, _localFiles);
r.render(_user, BlogManager.instance().getArchive(), null, smlContent, out, false, true);
_previewed = true;
}
private String renderSMLContent() {
StringBuffer raw = new StringBuffer();
raw.append("Subject: ").append(_subject).append('\n');
raw.append("Tags: ");
StringTokenizer tok = new StringTokenizer(_tags, " \t\n");
while (tok.hasMoreTokens())
raw.append(tok.nextToken()).append('\t');
raw.append('\n');
raw.append(_headers.trim());
raw.append("\n\n");
raw.append(_text.trim());
return raw.toString();
}
private void cacheAttachments() throws IOException {
File postCacheDir = new File(BlogManager.instance().getTempDir(), _user.getBlog().toBase64());
if (!postCacheDir.exists())
postCacheDir.mkdirs();
for (int i = 0; i < _fileStreams.size(); i++) {
InputStream in = (InputStream)_fileStreams.get(i);
File f = File.createTempFile("attachment", ".dat", postCacheDir);
FileOutputStream o = new FileOutputStream(f);
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
o.write(buf, 0, read);
o.close();
in.close();
_localFiles.add(f);
System.out.println("Caching attachment " + i + " temporarily in "
+ f.getAbsolutePath() + " w/ " + f.length() + "bytes");
}
_fileStreams.clear();
}
}

View File

@ -0,0 +1,656 @@
package net.i2p.syndie.web;
import java.io.*;
import java.text.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.util.EepGet;
import net.i2p.util.EepGetScheduler;
import net.i2p.util.EepPost;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
import net.i2p.syndie.*;
/**
 * Fetch a remote Syndie archive's index, pull down selected blog metadata and posts
 * (individually or via a bulk export.zip), and push local posts the remote archive lacks.
 */
public class RemoteArchiveBean {
private String _remoteSchema;
private String _remoteLocation;
private String _proxyHost;
private int _proxyPort;
private ArchiveIndex _remoteIndex;
private List _statusMessages;
private boolean _fetchIndexInProgress;
public RemoteArchiveBean() {
reinitialize();
}
public void reinitialize() {
_remoteSchema = null;
_remoteLocation = null;
_remoteIndex = null;
_fetchIndexInProgress = false;
_proxyHost = null;
_proxyPort = -1;
_statusMessages = new ArrayList();
}
public String getRemoteSchema() { return _remoteSchema; }
public String getRemoteLocation() { return _remoteLocation; }
public ArchiveIndex getRemoteIndex() { return _remoteIndex; }
public String getProxyHost() { return _proxyHost; }
public int getProxyPort() { return _proxyPort; }
public boolean getFetchIndexInProgress() { return _fetchIndexInProgress; }
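// draining read: returns the accumulated status lines and clears them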
public String getStatus() {
StringBuffer buf = new StringBuffer();
while (_statusMessages.size() > 0)
buf.append(_statusMessages.remove(0)).append("\n");
return buf.toString();
}
public void fetchMetadata(User user, Map parameters) {
String meta = ArchiveViewerBean.getString(parameters, "blog");
if (meta == null) return;
Set blogs = new HashSet();
if ("ALL".equals(meta)) {
Set localBlogs = BlogManager.instance().getArchive().getIndex().getUniqueBlogs();
Set remoteBlogs = _remoteIndex.getUniqueBlogs();
for (Iterator iter = remoteBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
if (!localBlogs.contains(blog)) {
blogs.add(blog);
}
}
} else {
blogs.add(new Hash(Base64.decode(meta.trim())));
}
List urls = new ArrayList(blogs.size());
List tmpFiles = new ArrayList(blogs.size());
for (Iterator iter = blogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
urls.add(buildMetaURL(blog));
try {
tmpFiles.add(File.createTempFile("fetchMeta", ".txt", BlogManager.instance().getTempDir()));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + blog.toBase64() + ": " + ioe.getMessage());
}
}
for (int i = 0; i < urls.size(); i++)
_statusMessages.add("Scheduling up metadata fetches for " + HTMLRenderer.sanitizeString((String)urls.get(i)));
fetch(urls, tmpFiles, user, new MetadataStatusListener());
}
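// the remote URLs below are derived from the archive index location by replacing its last path element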
private String buildMetaURL(Hash blog) {
String loc = _remoteLocation.trim();
int root = loc.lastIndexOf('/');
return loc.substring(0, root + 1) + blog.toBase64() + "/" + Archive.METADATA_FILE;
}
public void fetchSelectedEntries(User user, Map parameters) {
String entries[] = ArchiveViewerBean.getStrings(parameters, "entry");
if ( (entries == null) || (entries.length <= 0) ) return;
List urls = new ArrayList(entries.length);
List tmpFiles = new ArrayList(entries.length);
for (int i = 0; i < entries.length; i++) {
urls.add(buildEntryURL(new BlogURI(entries[i])));
try {
tmpFiles.add(File.createTempFile("fetchBlog", ".txt", BlogManager.instance().getTempDir()));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + HTMLRenderer.sanitizeString(entries[i]) + ": " + ioe.getMessage());
}
}
for (int i = 0; i < urls.size(); i++)
_statusMessages.add("Scheduling blog post fetching for " + HTMLRenderer.sanitizeString(entries[i]));
fetch(urls, tmpFiles, user, new BlogStatusListener());
}
public void fetchSelectedBulk(User user, Map parameters) {
String entries[] = ArchiveViewerBean.getStrings(parameters, "entry");
String action = ArchiveViewerBean.getString(parameters, "action");
if ("Fetch all new entries".equals(action)) {
ArchiveIndex localIndex = BlogManager.instance().getArchive().getIndex();
List uris = new ArrayList();
List matches = new ArrayList();
for (Iterator iter = _remoteIndex.getUniqueBlogs().iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
_remoteIndex.selectMatchesOrderByEntryId(matches, blog, null);
for (int i = 0; i < matches.size(); i++) {
BlogURI uri = (BlogURI)matches.get(i);
if (!localIndex.getEntryIsKnown(uri))
uris.add(uri);
}
matches.clear();
}
entries = new String[uris.size()];
for (int i = 0; i < uris.size(); i++)
entries[i] = ((BlogURI)uris.get(i)).toString();
}
if ( (entries == null) || (entries.length <= 0) ) return;
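// build a single export.zip request listing every wanted entry plus its blog metadata, fetched as one bulk zip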
StringBuffer url = new StringBuffer(512);
url.append(buildExportURL());
Set meta = new HashSet();
for (int i = 0; i < entries.length; i++) {
BlogURI uri = new BlogURI(entries[i]);
if (uri.getEntryId() >= 0) {
url.append("entry=").append(uri.toString()).append('&');
meta.add(uri.getKeyHash());
_statusMessages.add("Scheduling blog post fetching for " + HTMLRenderer.sanitizeString(entries[i]));
}
}
for (Iterator iter = meta.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
url.append("meta=").append(blog.toBase64()).append('&');
_statusMessages.add("Scheduling blog metadata fetching for " + blog.toBase64());
}
List urls = new ArrayList(1);
urls.add(url.toString());
List tmpFiles = new ArrayList(1);
try {
File tmp = File.createTempFile("fetchBulk", ".zip", BlogManager.instance().getTempDir());
tmpFiles.add(tmp);
fetch(urls, tmpFiles, user, new BulkFetchListener(tmp));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + HTMLRenderer.sanitizeString(url.toString()) + ": " + ioe.getMessage());
}
}
private String buildExportURL() {
String loc = _remoteLocation.trim();
int root = loc.lastIndexOf('/');
return loc.substring(0, root + 1) + "export.zip?";
}
private String buildEntryURL(BlogURI uri) {
String loc = _remoteLocation.trim();
int root = loc.lastIndexOf('/');
return loc.substring(0, root + 1) + uri.getKeyHash().toBase64() + "/" + uri.getEntryId() + ".snd";
}
public void fetchAllEntries(User user, Map parameters) {
ArchiveIndex localIndex = BlogManager.instance().getArchive().getIndex();
List uris = new ArrayList();
List entries = new ArrayList();
for (Iterator iter = _remoteIndex.getUniqueBlogs().iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
_remoteIndex.selectMatchesOrderByEntryId(entries, blog, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
if (!localIndex.getEntryIsKnown(uri))
uris.add(uri);
}
entries.clear();
}
List urls = new ArrayList(uris.size());
List tmpFiles = new ArrayList(uris.size());
for (int i = 0; i < uris.size(); i++) {
urls.add(buildEntryURL((BlogURI)uris.get(i)));
try {
tmpFiles.add(File.createTempFile("fetchBlog", ".txt", BlogManager.instance().getTempDir()));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + HTMLRenderer.sanitizeString(uris.get(i).toString()) + ": " + ioe.getMessage());
}
}
for (int i = 0; i < urls.size(); i++)
_statusMessages.add("Fetch all entries: " + HTMLRenderer.sanitizeString((String)urls.get(i)));
fetch(urls, tmpFiles, user, new BlogStatusListener());
}
private void fetch(List urls, List tmpFiles, User user, EepGet.StatusListener lsnr) {
EepGetScheduler scheduler = new EepGetScheduler(I2PAppContext.getGlobalContext(), urls, tmpFiles, _proxyHost, _proxyPort, lsnr);
scheduler.fetch();
}
public void fetchIndex(User user, String schema, String location, String proxyHost, String proxyPort) {
_fetchIndexInProgress = true;
_remoteIndex = null;
_remoteLocation = location;
_remoteSchema = schema;
_proxyHost = null;
_proxyPort = -1;
if ( (schema == null) || (schema.trim().length() <= 0) ||
(location == null) || (location.trim().length() <= 0) ) {
_statusMessages.add("Location must be specified");
_fetchIndexInProgress = false;
return;
}
if ("web".equals(schema)) {
if ( (proxyHost != null) && (proxyHost.trim().length() > 0) &&
(proxyPort != null) && (proxyPort.trim().length() > 0) ) {
_proxyHost = proxyHost;
try {
_proxyPort = Integer.parseInt(proxyPort);
} catch (NumberFormatException nfe) {
_statusMessages.add("Proxy port " + HTMLRenderer.sanitizeString(proxyPort) + " is invalid");
_fetchIndexInProgress = false;
return;
}
}
} else {
_statusMessages.add("Remote schema " + HTMLRenderer.sanitizeString(schema) + " currently not supported");
_fetchIndexInProgress = false;
return;
}
_statusMessages.add("Fetching index from " + HTMLRenderer.sanitizeString(_remoteLocation) +
(_proxyHost != null ? " via " + HTMLRenderer.sanitizeString(_proxyHost) + ":" + _proxyPort : ""));
File archiveFile = new File(BlogManager.instance().getTempDir(), user.getBlog().toBase64() + "_remoteArchive.txt");
archiveFile.delete();
EepGet eep = new EepGet(I2PAppContext.getGlobalContext(), ((_proxyHost != null) && (_proxyPort > 0)),
_proxyHost, _proxyPort, 0, archiveFile.getAbsolutePath(), location);
eep.addStatusListener(new IndexFetcherStatusListener(archiveFile));
eep.fetch();
}
private class IndexFetcherStatusListener implements EepGet.StatusListener {
private File _archiveFile;
public IndexFetcherStatusListener(File file) {
_archiveFile = file;
}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " successful");
_fetchIndexInProgress = false;
ArchiveIndex i = new ArchiveIndex(false);
try {
i.load(_archiveFile);
_statusMessages.add("Archive fetched and loaded");
_remoteIndex = i;
} catch (IOException ioe) {
_statusMessages.add("Archive is corrupt: " + ioe.getMessage());
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);
_fetchIndexInProgress = false;
}
}
private class MetadataStatusListener implements EepGet.StatusListener {
public MetadataStatusListener() {}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " successful");
File info = new File(outputFile);
FileInputStream in = null;
try {
BlogInfo i = new BlogInfo();
in = new FileInputStream(info);
i.load(in);
boolean ok = BlogManager.instance().getArchive().storeBlogInfo(i);
if (ok) {
_statusMessages.add("Blog info for " + HTMLRenderer.sanitizeString(i.getProperty(BlogInfo.NAME)) + " imported");
BlogManager.instance().getArchive().reloadInfo();
} else {
_statusMessages.add("Blog info at " + HTMLRenderer.sanitizeString(url) + " was corrupt / invalid / forged");
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
info.delete();
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);;
}
}
private class BlogStatusListener implements EepGet.StatusListener {
public BlogStatusListener() {}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " successful");
File file = new File(outputFile);
FileInputStream in = null;
try {
EntryContainer c = new EntryContainer();
in = new FileInputStream(file);
c.load(in);
BlogURI uri = c.getURI();
if ( (uri == null) || (uri.getKeyHash() == null) ) {
_statusMessages.add("Blog post at " + HTMLRenderer.sanitizeString(url) + " was corrupt - no URI");
return;
}
Archive a = BlogManager.instance().getArchive();
BlogInfo info = a.getBlogInfo(uri);
if (info == null) {
_statusMessages.add("Blog post " + uri.toString() + " cannot be imported, as we don't have their blog metadata");
return;
}
boolean ok = a.storeEntry(c);
if (!ok) {
_statusMessages.add("Blog post at " + url + ": " + uri.toString() + " has an invalid signature");
return;
} else {
_statusMessages.add("Blog post " + uri.toString() + " imported");
BlogManager.instance().getArchive().regenerateIndex();
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
file.delete();
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);
}
}
/**
* Receive the status of a fetch for the zip containing blogs and metadata (as generated by
* the ExportServlet)
*/
private class BulkFetchListener implements EepGet.StatusListener {
private File _tmp;
public BulkFetchListener(File tmp) {
_tmp = tmp;
}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url.substring(0, url.indexOf('?'))) + " successful, importing the data");
File file = new File(outputFile);
ZipInputStream zi = null;
try {
zi = new ZipInputStream(new FileInputStream(file));
while (true) {
ZipEntry entry = zi.getNextEntry();
if (entry == null)
break;
ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
int read = -1;
while ( (read = zi.read(buf)) != -1)
out.write(buf, 0, read);
if (entry.getName().startsWith("meta")) {
BlogInfo i = new BlogInfo();
i.load(new ByteArrayInputStream(out.toByteArray()));
boolean ok = BlogManager.instance().getArchive().storeBlogInfo(i);
if (ok) {
_statusMessages.add("Blog info for " + HTMLRenderer.sanitizeString(i.getProperty(BlogInfo.NAME)) + " imported");
} else {
_statusMessages.add("Blog info at " + HTMLRenderer.sanitizeString(url) + " was corrupt / invalid / forged");
}
} else if (entry.getName().startsWith("entry")) {
EntryContainer c = new EntryContainer();
c.load(new ByteArrayInputStream(out.toByteArray()));
BlogURI uri = c.getURI();
if ( (uri == null) || (uri.getKeyHash() == null) ) {
_statusMessages.add("Blog post " + HTMLRenderer.sanitizeString(entry.getName()) + " was corrupt - no URI");
continue;
}
Archive a = BlogManager.instance().getArchive();
BlogInfo info = a.getBlogInfo(uri);
if (info == null) {
_statusMessages.add("Blog post " + HTMLRenderer.sanitizeString(entry.getName()) + " cannot be imported, as we don't have their blog metadata");
continue;
}
boolean ok = a.storeEntry(c);
if (!ok) {
_statusMessages.add("Blog post " + uri.toString() + " has an invalid signature");
continue;
} else {
_statusMessages.add("Blog post " + uri.toString() + " imported");
}
}
}
BlogManager.instance().getArchive().regenerateIndex();
} catch (IOException ioe) {
ioe.printStackTrace();
_statusMessages.add("Error importing from " + HTMLRenderer.sanitizeString(url) + ": " + ioe.getMessage());
} finally {
if (zi != null) try { zi.close(); } catch (IOException ioe) {}
file.delete();
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);
_tmp.delete();
}
}
public void postSelectedEntries(User user, Map parameters) {
String entries[] = ArchiveViewerBean.getStrings(parameters, "localentry");
if ( (entries == null) || (entries.length <= 0) ) return;
List uris = new ArrayList(entries.length);
for (int i = 0; i < entries.length; i++)
uris.add(new BlogURI(entries[i]));
post(uris, user);
}
private void post(List blogURIs, User user) {
Set meta = new HashSet(4);
Map uploads = new HashMap(blogURIs.size()+1);
String importURL = getImportURL();
_statusMessages.add("Uploading through " + HTMLRenderer.sanitizeString(importURL));
for (int i = 0; i < blogURIs.size(); i++) {
BlogURI uri = (BlogURI)blogURIs.get(i);
File blogDir = new File(BlogManager.instance().getArchive().getArchiveDir(), uri.getKeyHash().toBase64());
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(uri);
if (!meta.contains(uri.getKeyHash())) {
uploads.put("blogmeta" + meta.size(), new File(blogDir, Archive.METADATA_FILE));
meta.add(uri.getKeyHash());
_statusMessages.add("Scheduling upload of the blog metadata for " + HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.NAME)));
}
uploads.put("blogpost" + i, new File(blogDir, uri.getEntryId() + ".snd"));
_statusMessages.add("Scheduling upload of " + HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.NAME))
+ ": " + getEntryDate(uri.getEntryId()));
}
EepPost post = new EepPost();
post.postFiles(importURL, _proxyHost, _proxyPort, uploads, new Runnable() { public void run() { _statusMessages.add("Upload complete"); } });
}
private String getImportURL() {
String loc = _remoteLocation.trim();
int archiveRoot = loc.lastIndexOf('/');
int syndieRoot = loc.lastIndexOf('/', archiveRoot-1);
return loc.substring(0, syndieRoot + 1) + "import.jsp";
}
public void renderDeltaForm(User user, ArchiveIndex localIndex, Writer out) throws IOException {
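// render the syndication delta: new remote blogs, remote posts we lack, and local posts the remote archive lacks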
Archive archive = BlogManager.instance().getArchive();
StringBuffer buf = new StringBuffer(512);
buf.append("<b>New blogs:</b> <select name=\"blog\"><option value=\"ALL\">All</option>\n");
Set localBlogs = archive.getIndex().getUniqueBlogs();
Set remoteBlogs = _remoteIndex.getUniqueBlogs();
int newBlogs = 0;
for (Iterator iter = remoteBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
if (!localBlogs.contains(blog)) {
buf.append("<option value=\"" + blog.toBase64() + "\">" + blog.toBase64() + "</option>\n");
newBlogs++;
}
}
if (newBlogs > 0) {
out.write(buf.toString());
out.write("</select> <input type=\"submit\" name=\"action\" value=\"Fetch metadata\" /><br />\n");
}
int newEntries = 0;
int localNew = 0;
out.write("<table border=\"1\" width=\"100%\">\n");
List entries = new ArrayList();
for (Iterator iter = remoteBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
buf.setLength(0);
int shownEntries = 0;
buf.append("<tr><td colspan=\"5\" align=\"left\" valign=\"top\">\n");
BlogInfo info = archive.getBlogInfo(blog);
if (info != null) {
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, null, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\"><b>" + HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.NAME)) + "</b></a>: " +
HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.DESCRIPTION)) + "\n");
} else {
buf.append("<b>" + blog.toBase64() + "</b>\n");
}
buf.append("</td></tr>\n");
buf.append("<tr><td>&nbsp;</td><td nowrap=\"true\"><b>Posted on</b></td><td nowrap=\"true\"><b>#</b></td><td nowrap=\"true\"><b>Size</b></td><td width=\"90%\" nowrap=\"true\"><b>Tags</b></td></tr>\n");
entries.clear();
_remoteIndex.selectMatchesOrderByEntryId(entries, blog, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
buf.append("<tr>\n");
if (!archive.getIndex().getEntryIsKnown(uri)) {
buf.append("<td><input type=\"checkbox\" name=\"entry\" value=\"" + uri.toString() + "\" /></td>\n");
newEntries++;
shownEntries++;
} else {
String page = HTMLRenderer.getPageURL(blog, null, uri.getEntryId(), -1, -1,
user.getShowExpanded(), user.getShowImages());
buf.append("<td><a href=\"" + page + "\">(local)</a></td>\n");
}
buf.append("<td>" + getDate(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + getId(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + _remoteIndex.getBlogEntrySizeKB(uri) + "KB</td>\n");
buf.append("<td>");
for (Iterator titer = new TreeSet(_remoteIndex.getBlogEntryTags(uri)).iterator(); titer.hasNext(); ) {
String tag = (String)titer.next();
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\">" + tag + "</a> \n");
}
buf.append("</td>\n");
buf.append("</tr>\n");
}
if (shownEntries > 0) {
out.write(buf.toString());
buf.setLength(0);
}
int remote = shownEntries;
// now for posts in known blogs that we have and they don't
entries.clear();
localIndex.selectMatchesOrderByEntryId(entries, blog, null);
buf.append("<tr><td colspan=\"5\">Entries we have, but the remote Syndie doesn't:</td></tr>\n");
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
if (!_remoteIndex.getEntryIsKnown(uri)) {
buf.append("<tr>\n");
buf.append("<td><input type=\"checkbox\" name=\"localentry\" value=\"" + uri.toString() + "\" /></td>\n");
shownEntries++;
newEntries++;
localNew++;
buf.append("<td>" + getDate(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + getId(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + localIndex.getBlogEntrySizeKB(uri) + "KB</td>\n");
buf.append("<td>");
for (Iterator titer = new TreeSet(localIndex.getBlogEntryTags(uri)).iterator(); titer.hasNext(); ) {
String tag = (String)titer.next();
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\">" + tag + "</a> \n");
}
buf.append("</td>\n");
buf.append("</tr>\n");
}
}
if (shownEntries > remote) // skip blogs we have already syndicated
out.write(buf.toString());
}
// now for posts in blogs we have and they don't
int newBefore = localNew;
buf.setLength(0);
buf.append("<tr><td colspan=\"5\">Blogs the remote Syndie doesn't have</td></tr>\n");
for (Iterator iter = localBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
if (remoteBlogs.contains(blog)) {
//System.err.println("Remote index has " + blog.toBase64());
continue;
}
entries.clear();
localIndex.selectMatchesOrderByEntryId(entries, blog, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
buf.append("<tr>\n");
buf.append("<td><input type=\"checkbox\" name=\"localentry\" value=\"" + uri.toString() + "\" /></td>\n");
buf.append("<td>" + getDate(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + getId(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + localIndex.getBlogEntrySizeKB(uri) + "KB</td>\n");
buf.append("<td>");
for (Iterator titer = new TreeSet(localIndex.getBlogEntryTags(uri)).iterator(); titer.hasNext(); ) {
String tag = (String)titer.next();
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\">" + tag + "</a> \n");
}
buf.append("</td>\n");
buf.append("</tr>\n");
localNew++;
}
}
if (localNew > newBefore)
out.write(buf.toString());
out.write("</table>\n");
if (newEntries > 0) {
out.write("<input type=\"submit\" name=\"action\" value=\"Fetch selected entries\" /> \n");
out.write("<input type=\"submit\" name=\"action\" value=\"Fetch all new entries\" /> \n");
} else {
out.write(HTMLRenderer.sanitizeString(_remoteLocation) + " has no new posts to offer us\n");
}
if (localNew > 0) {
out.write("<input type=\"submit\" name=\"action\" value=\"Post selected entries\" /> \n");
}
out.write("<hr />\n");
}
private final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
private String getDate(long when) {
synchronized (_dateFormat) {
return _dateFormat.format(new Date(when));
}
}
private final String getEntryDate(long when) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(when));
long dayBegin = _dateFormat.parse(str).getTime();
return str + "." + (when - dayBegin);
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return "unknown";
}
}
}
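// entry ids are millisecond timestamps; the "#" shown is the entry's offset in ms from the start of its day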
private long getId(long id) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(id));
long dayBegin = _dateFormat.parse(str).getTime();
return (id - dayBegin);
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return id;
}
}
}
}

View File

@ -0,0 +1,9 @@
<%@page contentType="text/html; charset=UTF-8" import="net.i2p.syndie.web.ArchiveViewerBean, net.i2p.syndie.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" /><table border="0" width="100%">
<tr><form action="index.jsp"><td nowrap="true">
<b>Blogs:</b> <%ArchiveViewerBean.renderBlogSelector(user, request.getParameterMap(), out);%>
<input type="submit" value="Refresh" />
<input type="submit" name="action" value="<%=ArchiveViewerBean.SEL_ACTION_SET_AS_DEFAULT%>" />
<!-- char encoding: [<%=response.getCharacterEncoding()%>] content type [<%=response.getContentType()%>] Locale [<%=response.getLocale()%>] -->
<%ArchiveViewerBean.renderBlogs(user, request.getParameterMap(), out, "</td></form></tr><tr><td align=\"left\" valign=\"top\">");%></td></tr></table>

View File

@ -0,0 +1,3 @@
<%@page import="net.i2p.syndie.web.ArchiveViewerBean, net.i2p.syndie.*, net.i2p.data.Base64" %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<jsp:useBean scope="session" class="net.i2p.syndie.data.TransparentArchiveIndex" id="archive" />

View File

@ -0,0 +1 @@
<!-- nada -->

View File

@ -0,0 +1,5 @@
<%@page import="net.i2p.syndie.BlogManager" %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<!--
<center>[syndiemedia]</center>
-->

View File

@ -0,0 +1,42 @@
<%@page import="net.i2p.syndie.*, net.i2p.syndie.sml.*, net.i2p.syndie.web.*" %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<td valign="top" align="left" class="syndieTopNavBlogsCell" height="10"><a href="index.jsp">Home</a></td>
<td valign="top" align="left" class="syndieTopNavRemoteCell" height="10">
<a href="remote.jsp">Remote archives</a>
<a href="import.jsp">Import</a>
</td>
<form action="<%=request.getRequestURI() + "?" + (request.getQueryString() != null ? request.getQueryString() : "")%>">
<td nowrap="true" valign="top" align="right" class="syndieTopNavManageCell" height="10"><%
if ("true".equals(request.getParameter("logout"))) {
user.invalidate();
}
String login = request.getParameter("login");
String pass = request.getParameter("password");
String loginSubmit = request.getParameter("Login");
if ( (login != null) && (pass != null) && (loginSubmit != null) && (loginSubmit.equals("Login")) ) {
String loginResult = BlogManager.instance().login(user, login, pass);
if (!user.getAuthenticated())
out.write("<b>" + loginResult + "</b>");
}
%>
<% if (user.getAuthenticated()) { %>
Logged in as: <b><jsp:getProperty property="username" name="user" />:</b>
<a href="<%=HTMLRenderer.getPageURL(user.getBlog(), null, -1, -1, -1, user.getShowExpanded(), user.getShowImages())%>"><%=HTMLRenderer.sanitizeString(ArchiveViewerBean.getBlogName(user.getBlogStr()))%></a>
<a href="<%=HTMLRenderer.getPostURL(user.getBlog())%>">Post</a>
<a href="<%=HTMLRenderer.getMetadataURL(user.getBlog())%>">Metadata</a>
<a href="index.jsp?logout=true">Logout</a><br />
<%} else {%>
Login: <input type="text" name="login" size="8" />
Pass: <input type="password" name="password" size="8" /><%
java.util.Enumeration params = request.getParameterNames();
while (params.hasMoreElements()) {
String p = (String)params.nextElement();
String val = request.getParameter(p);
%><input type="hidden" name="<%=p%>" value="<%=val%>" /><%
}%>
<input type="submit" name="Login" value="Login" />
<a href="register.jsp">Register</a>
<% } %>
</td>
</form>

View File

@ -0,0 +1,47 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.data.*, net.i2p.syndie.web.*, net.i2p.syndie.sml.*, net.i2p.syndie.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3"><%
String nameStr = request.getParameter("name");
String locStr = request.getParameter("location");
String schemaStr = request.getParameter("schema");
String name = null;
String location = null;
String schema = null;
try {
name = DataHelper.getUTF8(Base64.decode(nameStr));
location = DataHelper.getUTF8(Base64.decode(locStr));
schema = DataHelper.getUTF8(Base64.decode(schemaStr));
} catch (NullPointerException npe) {
// ignore
}
if ( (name == null) || (location == null) || (schema == null) ) {
out.write("<b>No location specified</b>");
} else if (user.getAuthenticated() && ("Add".equals(request.getParameter("action"))) ) {
out.write("<b>" + BlogManager.instance().addAddress(user, name, location, schema) + "</b>");
} else { %>Are you sure you really want to add the
addressbook mapping of <%=HTMLRenderer.sanitizeString(name)%> to
<input type="text" size="20" value="<%=HTMLRenderer.sanitizeString(location)%>" />, applicable within the
schema <%=HTMLRenderer.sanitizeString(schema)%>?
<% if (!user.getAuthenticated()) { %>
<p />If so, add the line
<input type="text" size="20" value="<%=HTMLRenderer.sanitizeString(name)%>=<%=HTMLRenderer.sanitizeString(location)%>" />
to your <code>userhosts.txt</code>.
<% } else { %><br />
<a href="addaddress.jsp?name=<%=HTMLRenderer.sanitizeURL(name)%>&location=<%=HTMLRenderer.sanitizeURL(location)%>&schema=<%=HTMLRenderer.sanitizeURL(schema)%>&action=Add">Yes, add it</a>.
<% }
} %></td></tr>
</table>
</body>

View File

@ -0,0 +1,34 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.data.*, net.i2p.syndie.web.*, net.i2p.syndie.sml.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3">Are you sure you really want to go to
<%
String loc = request.getParameter("location");
String schema = request.getParameter("schema");
String desc = request.getParameter("description");
if (loc != null) loc = HTMLRenderer.sanitizeString(DataHelper.getUTF8(Base64.decode(loc)));
if (schema != null) schema = HTMLRenderer.sanitizeString(DataHelper.getUTF8(Base64.decode(schema)));
if (desc != null) desc = HTMLRenderer.sanitizeString(DataHelper.getUTF8(Base64.decode(desc)));
if ( (loc != null) && (schema != null) ) {
out.write(loc + " (" + schema + ")");
if (desc != null)
out.write(": " + desc);
out.write("? ");
out.write("<a href=\"" + loc + "\">yes</a>");
} else {
out.write("(some unspecified location...)");
}
%></td></tr>
</table>
</body>

View File

@ -0,0 +1,67 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.data.Base64, net.i2p.syndie.web.*, net.i2p.syndie.sml.*, net.i2p.syndie.data.*, net.i2p.syndie.*, org.mortbay.servlet.MultiPartRequest, java.util.*, java.io.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.data.TransparentArchiveIndex" id="archive" />
<html>
<head>
<title>SyndieMedia import</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3"><%
String contentType = request.getContentType();
if ((contentType != null) && (contentType.indexOf("boundary=") != -1) ) {
MultiPartRequest req = new MultiPartRequest(request);
int metaId = 0;
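// read blogmeta0, blogmeta1, ... until a field is missing, then do the same for the blogpostN fields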
while (true) {
InputStream meta = req.getInputStream("blogmeta" + metaId);
if (meta == null)
break;
if (!BlogManager.instance().importBlogMetadata(meta)) {
System.err.println("blog meta " + metaId + " failed to be imported");
break;
}
metaId++;
}
int entryId = 0;
while (true) {
InputStream entry = req.getInputStream("blogpost" + entryId);
if (entry == null)
break;
if (!BlogManager.instance().importBlogEntry(entry)) {
System.err.println("blog entry " + entryId + " failed to be imported");
break;
}
entryId++;
}
if ( (entryId > 0) || (metaId > 0) ) {
BlogManager.instance().getArchive().regenerateIndex();
session.setAttribute("index", BlogManager.instance().getArchive().getIndex());
}
%>Imported <%=entryId%> posts and <%=metaId%> blog metadata files.
<%
} else { %><form action="import.jsp" method="POST" enctype="multipart/form-data">
Blog metadata 0: <input type="file" name="blogmeta0" /><br />
Blog metadata 1: <input type="file" name="blogmeta1" /><br />
Post 0: <input type="file" name="blogpost0" /><br />
Post 1: <input type="file" name="blogpost1" /><br />
Post 2: <input type="file" name="blogpost2" /><br />
Post 3: <input type="file" name="blogpost3" /><br />
Post 4: <input type="file" name="blogpost4" /><br />
Post 5: <input type="file" name="blogpost5" /><br />
Post 6: <input type="file" name="blogpost6" /><br />
Post 7: <input type="file" name="blogpost7" /><br />
Post 8: <input type="file" name="blogpost8" /><br />
Post 9: <input type="file" name="blogpost9" /><br />
<hr />
<input type="submit" name="Post" value="Post entry" /> <input type="reset" value="Cancel" />
<% } %>
</td></tr>
</table>
</body>

apps/syndie/jsp/index.jsp Normal file
View File

@ -0,0 +1,16 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.syndie.web.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3"><jsp:include page="_bodyindex.jsp" /></td></tr>
</table>
</body>

apps/syndie/jsp/post.jsp Normal file
View File

@ -0,0 +1,133 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.data.Base64, net.i2p.syndie.web.*, net.i2p.syndie.sml.*, net.i2p.syndie.data.*, net.i2p.syndie.*, org.mortbay.servlet.MultiPartRequest, java.util.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<jsp:useBean scope="session" class="net.i2p.syndie.web.PostBean" id="post" />
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3"><%
if (!user.getAuthenticated()) {
%>You must be logged in to post<%
} else {
String confirm = request.getParameter("confirm");
if ( (confirm != null) && (confirm.equalsIgnoreCase("true")) ) {
BlogURI uri = post.postEntry();
if (uri != null) {
%>Blog entry <a href="<%=HTMLRenderer.getPageURL(user.getBlog(), null, uri.getEntryId(), -1, -1,
user.getShowExpanded(), user.getShowImages())%>">posted</a>!<%
} else {
%>There was an unknown error posting the entry...<%
}
post.reinitialize();
post.setUser(user);
} else {
// logged in but not confirmed...
String contentType = request.getContentType();
if ((contentType != null) && (contentType.indexOf("boundary=") != -1) ) {
// not confirmed but they posted stuff... gobble up what they give
// and display it as a preview (then we show the confirm form)
post.reinitialize();
post.setUser(user);
MultiPartRequest req = new MultiPartRequest(request);
String entrySubject = req.getString("entrysubject");
String entryTags = req.getString("entrytags");
String entryText = req.getString("entrytext");
String entryHeaders = req.getString("entryheaders");
String replyTo = req.getString(ArchiveViewerBean.PARAM_IN_REPLY_TO);
if ( (replyTo != null) && (replyTo.trim().length() > 0) ) {
byte r[] = Base64.decode(replyTo);
if (r != null) {
if (entryHeaders == null) entryHeaders = HTMLRenderer.HEADER_IN_REPLY_TO + ": " + new String(r);
else entryHeaders = entryHeaders + '\n' + HTMLRenderer.HEADER_IN_REPLY_TO + ": " + new String(r);
} else {
replyTo = null;
}
}
post.setSubject(entrySubject);
post.setTags(entryTags);
post.setText(entryText);
post.setHeaders(entryHeaders);
for (int i = 0; i < 32; i++) {
String filename = req.getFilename("entryfile" + i);
if ( (filename != null) && (filename.trim().length() > 0) ) {
Hashtable params = req.getParams("entryfile" + i);
String type = "application/octet-stream";
for (Iterator iter = params.keySet().iterator(); iter.hasNext(); ) {
String cur = (String)iter.next();
if ("content-type".equalsIgnoreCase(cur)) {
type = (String)params.get(cur);
break;
}
}
post.addAttachment(filename.trim(), req.getInputStream("entryfile" + i), type);
}
}
post.renderPreview(out);
%><hr />Please <a href="post.jsp?confirm=true">confirm</a> that this is ok. Otherwise, just go back and make changes.<%
} else {
// logged in and not confirmed because they didn't send us anything!
// give 'em a new form
%><form action="post.jsp" method="POST" enctype="multipart/form-data">
Post subject: <input type="text" size="80" name="entrysubject" value="<%=post.getSubject()%>" /><br />
Post tags: <input type="text" size="20" name="entrytags" value="<%=post.getTags()%>" /><br />
Post content (in raw SML, no headers):<br />
<textarea rows="6" cols="80" name="entrytext"><%=post.getText()%></textarea><br />
<b>SML cheatsheet:</b><br /><textarea rows="6" cols="80" readonly="true">
* newlines are newlines are newlines.
* all &lt; and &gt; are replaced with their &amp;symbol;
* [b][/b] = <b>bold</b>
* [i][/i] = <i>italics</i>
* [u][/u] = <u>underline</u>
* [cut]more inside[/cut] = [<a href="#">more inside...</a>]
* [img attachment="1"]alt[/img] = use attachment 1 as an image with 'alt' as the alt text
* [blog name="name" bloghash="base64hash"]description[/blog] = link to all posts in the blog
* [blog name="name" bloghash="base64hash" blogentry="1234"]description[/blog] = link to the specified post in the blog
* [blog name="name" bloghash="base64hash" blogtag="tag"]description[/blog] = link to all posts in the blog with the specified tag
* [blog name="name" blogtag="tag"]description[/blog] = link to all posts in all blogs with the specified tag
* [link schema="eep" location="http://forum.i2p"]text[/link] = offer a link to an external resource (accessible with the given schema)
* [archive name="name" description="they have good stuff" schema="eep" location="http://syndiemedia.i2p/archive/archive.txt"]foo![/archive] = offer an easy way to sync up with a new Syndie archive
SML headers are newline delimited key=value pairs. Example keys are:
* bgcolor = background color of the post (e.g. bgcolor=#ffccaa or bgcolor=red)
* bgimage = attachment number to place as the background image for the post (only shown if images are enabled) (e.g. bgimage=1)
* textfont = font to put most text into
</textarea><br />
SML post headers:<br />
<textarea rows="3" cols="80" name="entryheaders"><%=post.getHeaders()%></textarea><br /><%
String s = request.getParameter(ArchiveViewerBean.PARAM_IN_REPLY_TO);
if ( (s != null) && (s.trim().length() > 0) ) {%>
<input type="hidden" name="<%=ArchiveViewerBean.PARAM_IN_REPLY_TO%>" value="<%=request.getParameter(ArchiveViewerBean.PARAM_IN_REPLY_TO)%>" />
<% } %>
Attachment 0: <input type="file" name="entryfile0" /><br />
Attachment 1: <input type="file" name="entryfile1" /><br />
Attachment 2: <input type="file" name="entryfile2" /><br />
Attachment 3: <input type="file" name="entryfile3" /><br /><!--
Attachment 4: <input type="file" name="entryfile4" /><br />
Attachment 5: <input type="file" name="entryfile5" /><br />
Attachment 6: <input type="file" name="entryfile6" /><br />
Attachment 7: <input type="file" name="entryfile7" /><br />
Attachment 8: <input type="file" name="entryfile8" /><br />
Attachment 9: <input type="file" name="entryfile9" /><br />-->
<hr />
<input type="submit" name="Post" value="Preview..." /> <input type="reset" value="Cancel" />
<%
} // end of the 'logged in, not confirmed, nothing posted' section
} // end of the 'logged in, not confirmed' section
} // end of the 'logged in' section
%></td></tr>
</table>
</body>


@ -0,0 +1,49 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.data.Base64, net.i2p.syndie.web.*, net.i2p.syndie.sml.*, net.i2p.syndie.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3"><%
String regLogin = request.getParameter("login");
boolean showForm = true;
if ( (regLogin != null) && ("Register".equals(request.getParameter("Register"))) ) {
String regUserPass = request.getParameter("password");
String regPass = request.getParameter("registrationpassword");
String blogName = request.getParameter("blogname");
String desc = request.getParameter("description");
String url = request.getParameter("contacturl");
String regResult = BlogManager.instance().register(user, regLogin, regUserPass, regPass, blogName, desc, url);
if (User.LOGIN_OK.equals(regResult)) {
out.print("<b>Registration successful.</b> <a href=\"index.jsp\">Continue...</a>\n");
showForm = false;
} else {
out.print("<b>" + regResult + "</b>");
}
}
if (showForm) {%><form action="register.jsp" method="POST">
<p>To create a new blog (and Syndie user account), please fill out the following form.
If this Syndie instance's operator requires a registration password, enter it below;
otherwise, leave that field blank.</p>
<p>
<b>Syndie login:</b> <input type="text" size="8" name="login" /><br />
<b>New password:</b> <input type="password" size="8" name="password" /><br />
<b>Registration password:</b> <input type="password" size="8" name="registrationpassword" /><br />
<b>Blog name:</b> <input type="text" size="32" name="blogname" /><br />
<b>Brief description:</b> <input type="text" size="60" name="description" /><br />
<b>Contact URL:</b> <input type="text" size="20" name="contacturl" /> <i>(e.g. mailto://user@mail.i2p, http://foo.i2p/, etc)</i><br />
<input type="submit" name="Register" value="Register" />
</p>
</form><% } %>
</td></tr>
</table>
</body>
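The form above funnels into a single BlogManager call, so registration could equally be scripted. A rough sketch, assuming the same API the page uses (the login, password, and blog details here are made up):

import net.i2p.syndie.BlogManager;
import net.i2p.syndie.User;

public class RegisterExample {
    public static void main(String args[]) {
        User user = new User();
        // login, user password, registration password (blank if the operator set none),
        // blog name, description, contact URL
        String result = BlogManager.instance().register(user, "alice", "s3cr3t", "",
                                                        "alice's blog", "assorted notes",
                                                        "mailto://alice@mail.i2p");
        if (User.LOGIN_OK.equals(result))
            System.out.println("registration successful");
        else
            System.out.println("registration failed: " + result);
    }
}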


@ -0,0 +1,66 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.syndie.web.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.web.RemoteArchiveBean" id="remote" />
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<jsp:useBean scope="session" class="net.i2p.syndie.data.TransparentArchiveIndex" id="archive" />
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><form action="remote.jsp" method="POST"><td valign="top" align="left" colspan="3">
<%
if (!user.getAuthenticated() || !user.getAllowAccessRemote()) {
%>Sorry, you are not allowed to access remote archives from here. Perhaps you should install Syndie yourself?<%
} else { %>Import from:
<select name="schema">
<option value="web" <%=("web".equals(request.getParameter("schema")) ? "selected=\"true\"" : "")%>>I2P/TOR/Freenet</option>
<option value="mnet" <%=("mnet".equals(request.getParameter("schema")) ? "selected=\"true\"" : "")%>>MNet</option>
<option value="feedspace" <%=("feedspace".equals(request.getParameter("schema")) ? "selected=\"true\"" : "")%>>Feedspace</option>
<option value="usenet" <%=("usenet".equals(request.getParameter("schema")) ? "selected=\"true\"" : "")%>>Usenet</option>
</select>
Proxy <input type="text" size="10" name="proxyhost" value="localhost" />:<input type="text" size="4" name="proxyport" value="4444" />
<input name="location" size="40" value="<%=(request.getParameter("location") != null ? request.getParameter("location") : "")%>" />
<input type="submit" name="action" value="Continue..." /><br />
<%
String action = request.getParameter("action");
if ("Continue...".equals(action)) {
remote.fetchIndex(user, request.getParameter("schema"), request.getParameter("location"), request.getParameter("proxyhost"), request.getParameter("proxyport"));
} else if ("Fetch metadata".equals(action)) {
remote.fetchMetadata(user, request.getParameterMap());
} else if ("Fetch selected entries".equals(action)) {
//remote.fetchSelectedEntries(user, request.getParameterMap());
remote.fetchSelectedBulk(user, request.getParameterMap());
} else if ("Fetch all new entries".equals(action)) {
//remote.fetchAllEntries(user, request.getParameterMap());
remote.fetchSelectedBulk(user, request.getParameterMap());
} else if ("Post selected entries".equals(action)) {
remote.postSelectedEntries(user, request.getParameterMap());
}
String msgs = remote.getStatus();
if ( (msgs != null) && (msgs.length() > 0) ) { %><pre><%=msgs%>
<a href="remote.jsp">Refresh</a></pre><br /><%
}
if (remote.getFetchIndexInProgress()) { %><b>Please wait while the index is being fetched
from <%=remote.getRemoteLocation()%></b>. <%
} else if (remote.getRemoteIndex() != null) {
// remote index is NOT null!
%><b><%=remote.getRemoteLocation()%></b>
<a href="remote.jsp?schema=<%=remote.getRemoteSchema()%>&location=<%=remote.getRemoteLocation()%><%
if (remote.getProxyHost() != null && remote.getProxyPort() > 0) {
%>&proxyhost=<%=remote.getProxyHost()%>&proxyport=<%=remote.getProxyPort()%><%
} %>&action=Continue...">(refetch)</a>:<br />
<%remote.renderDeltaForm(user, archive, out);%>
<textarea style="font-size:8pt" rows="5" cols="120"><%=remote.getRemoteIndex()%></textarea><%
}
}
%>
</td></form></tr>
</table>
</body>
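Behind this form, RemoteArchiveBean does the actual work; pressing "Continue..." with the default proxy settings is roughly equivalent to the sketch below, assuming the same bean API the page uses (the archive URL is the one quoted in the SML cheatsheet earlier, and the polling loop is only illustrative):

import net.i2p.syndie.User;
import net.i2p.syndie.web.RemoteArchiveBean;

public class RemoteFetchExample {
    public static void main(String args[]) {
        User user = new User();                    // would normally be an authenticated session user
        RemoteArchiveBean remote = new RemoteArchiveBean();
        // schema "web" corresponds to the I2P/Tor/Freenet option in the form above,
        // fetched through the local eepproxy on port 4444
        remote.fetchIndex(user, "web", "http://syndiemedia.i2p/archive/archive.txt",
                          "localhost", "4444");
        while (remote.getFetchIndexInProgress()) {
            try { Thread.sleep(1000); } catch (InterruptedException ie) {}
        }
        System.out.println(remote.getStatus());    // accumulated progress / error messages
    }
}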


@ -0,0 +1,3 @@
<%@page contentType="text/css; charset=UTF-8" pageEncoding="UTF-8" %>
<% request.setCharacterEncoding("UTF-8"); %>
<%@include file="syndie.css" %>


@ -0,0 +1,67 @@
.syndieEntrySubjectCell {
background-color: #999999;
font-size: 12px;
font-weight: bold;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieEntryMetaCell {
background-color: #888888;
font-size: 10px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieEntryAttachmentsCell {
background-color: #aaaaaa;
font-size: 12px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieEntrySummaryCell {
background-color: #eeeeee;
font-size: 12px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieEntryBodyCell {
background-color: #eeeeee;
font-size: 12px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieTopNavBlogsCell {
background-color: #888888;
font-size: 14px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieTopNavRemoteCell {
background-color: #888888;
font-size: 14px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
.syndieTopNavManageCell {
background-color: #888888;
font-size: 14px;
margin: 0px 0px 0px 0px;
padding: 0px 0px 0px 0px;
border: 0px;
}
body {
margin : 0px;
padding : 0px;
text-align : center;
font-family: Arial, Helvetica, sans-serif;
background-color : #FFFFFF;
color: #000000;
font-size: 12px;
}


@ -0,0 +1 @@
<%response.sendRedirect("../index.jsp");%>


@ -0,0 +1,9 @@
<% request.setCharacterEncoding("UTF-8"); %>
<%
java.util.Map params = request.getParameterMap();
response.setContentType(net.i2p.syndie.web.ArchiveViewerBean.getAttachmentContentType(params));
int len = net.i2p.syndie.web.ArchiveViewerBean.getAttachmentContentLength(params);
if (len >= 0)
response.setContentLength(len);
net.i2p.syndie.web.ArchiveViewerBean.renderAttachment(params, response.getOutputStream());
%>


@ -0,0 +1,18 @@
<%@page contentType="text/html; charset=UTF-8" pageEncoding="UTF-8" import="net.i2p.syndie.web.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<html>
<head>
<title>SyndieMedia</title>
<link href="style.jsp" rel="stylesheet" type="text/css" />
</head>
<body>
<table border="1" cellpadding="0" cellspacing="0" width="100%">
<tr><td colspan="5" valign="top" align="left"><jsp:include page="_toplogo.jsp" /></td></tr>
<tr><td valign="top" align="left" rowspan="2"><jsp:include page="_leftnav.jsp" /></td>
<jsp:include page="_topnav.jsp" />
<td valign="top" align="left" rowspan="2"><jsp:include page="_rightnav.jsp" /></td></tr>
<tr><td valign="top" align="left" colspan="3"><%
ArchiveViewerBean.renderMetadata(request.getParameterMap(), out);
%></td></tr>
</table>
</body>


@ -0,0 +1,16 @@
<%@page import="net.i2p.syndie.web.ArchiveViewerBean" %><jsp:useBean
scope="session" class="net.i2p.syndie.web.PostBean" id="post" /><%
request.setCharacterEncoding("UTF-8");
String id = request.getParameter(ArchiveViewerBean.PARAM_ATTACHMENT);
if (id != null) {
try {
int attachmentId = Integer.parseInt(id);
if ( (attachmentId < 0) || (attachmentId >= post.getAttachmentCount()) ) {
%>Attachment <%=attachmentId%> does not exist<%
} else {
response.setContentType(post.getContentType(attachmentId));
post.writeAttachmentData(attachmentId, response.getOutputStream());
}
} catch (NumberFormatException nfe) {}
}
%>

32
apps/syndie/jsp/web.xml Normal file

@ -0,0 +1,32 @@
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE web-app
PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.2//EN"
"http://java.sun.com/j2ee/dtds/web-app_2.2.dtd">
<web-app>
<servlet>
<servlet-name>net.i2p.syndie.web.ArchiveServlet</servlet-name>
<servlet-class>net.i2p.syndie.web.ArchiveServlet</servlet-class>
</servlet>
<!-- precompiled servlets -->
<servlet-mapping>
<servlet-name>net.i2p.syndie.jsp.index_jsp</servlet-name>
<url-pattern>/</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>net.i2p.syndie.web.ArchiveServlet</servlet-name>
<url-pattern>/archive/*</url-pattern>
</servlet-mapping>
<session-config>
<session-timeout>
30
</session-timeout>
</session-config>
<welcome-file-list>
<welcome-file>index.html</welcome-file>
<welcome-file>index.jsp</welcome-file>
</welcome-file-list>
</web-app>


@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="ISO-8859-1"?>
<project basedir="." default="all" name="i2p">
<target name="all" >
<echo message="Useful targets: " />
<echo message=" dist: distclean then package everything up (installer, clean tarball, update tarball)" />
@ -28,6 +28,7 @@
<ant dir="apps/routerconsole/java/" target="jar" />
<ant dir="apps/addressbook/" target="war" />
<ant dir="apps/susimail/" target="war" />
<ant dir="apps/syndie/java/" target="jar" /> <!-- not pushed in the update... yet -->
</target>
<target name="buildWEB">
<ant dir="apps/jetty" target="fetchJettylib" />
@ -58,6 +59,9 @@
<copy file="installer/lib/jbigi/jbigi.jar" todir="build" />
<copy file="apps/addressbook/dist/addressbook.war" todir="build/" />
<copy file="apps/susimail/susimail.war" todir="build/" />
<copy file="apps/syndie/syndie.war" todir="build/" />
<copy file="apps/syndie/java/build/syndie.jar" todir="build/" />
<copy file="apps/syndie/syndie.war" todir="build/" />
</target>
<target name="javadoc">
<mkdir dir="./build" />
@ -185,6 +189,7 @@
<copy file="build/routerconsole.war" todir="pkg-temp/webapps/" />
<copy file="build/addressbook.war" todir="pkg-temp/webapps/" />
<copy file="build/susimail.war" todir="pkg-temp/webapps/" />
<copy file="build/syndie.war" todir="pkg-temp/webapps/" />
<copy file="installer/resources/clients.config" todir="pkg-temp/" />
<copy file="installer/resources/eepget" todir="pkg-temp/" />
<copy file="installer/resources/i2prouter" todir="pkg-temp/" />
@ -283,6 +288,7 @@
<copy file="build/routerconsole.war" todir="pkg-temp/webapps/" />
<copy file="build/addressbook.war" todir="pkg-temp/webapps/" />
<copy file="build/susimail.war" todir="pkg-temp/webapps/" />
<copy file="build/syndie.war" todir="pkg-temp/webapps/" />
<copy file="history.txt" todir="pkg-temp/" />
<mkdir dir="pkg-temp/docs/" />
<copy file="news.xml" todir="pkg-temp/docs/" />


@ -14,8 +14,8 @@ package net.i2p;
*
*/
public class CoreVersion {
public final static String ID = "$Revision: 1.36 $ $Date: 2005/07/27 14:04:49 $";
public final static String VERSION = "0.6.0.1";
public final static String ID = "$Revision: 1.39 $ $Date: 2005/08/21 13:39:05 $";
public final static String VERSION = "0.6.0.4";
public static void main(String args[]) {
System.out.println("I2P Core version: " + VERSION);


@ -44,19 +44,19 @@ public class Base64 {
/** added by aum */
public static String encode(String source) {
return encode(source.getBytes());
return (source != null ? encode(source.getBytes()) : "");
}
public static String encode(byte[] source) {
return encode(source, 0, (source != null ? source.length : 0));
return (source != null ? encode(source, 0, (source != null ? source.length : 0)) : "");
}
public static String encode(byte[] source, int off, int len) {
return encode(source, off, len, false);
return (source != null ? encode(source, off, len, false) : "");
}
public static String encode(byte[] source, boolean useStandardAlphabet) {
return encode(source, 0, (source != null ? source.length : 0), useStandardAlphabet);
return (source != null ? encode(source, 0, (source != null ? source.length : 0), useStandardAlphabet) : "");
}
public static String encode(byte[] source, int off, int len, boolean useStandardAlphabet) {
return safeEncode(source, off, len, useStandardAlphabet);
return (source != null ? safeEncode(source, off, len, useStandardAlphabet) : "");
}
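// Note (illustrative, not from the original file): with the null guards above,
// every encode() variant now returns "" for a null source, e.g.
//   Base64.encode((byte[])null)  and  Base64.encode((String)null)
// both yield "", and safeDecode() (further down in this diff) likewise
// returns null when handed a null string.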
public static byte[] decode(String s) {
@ -142,7 +142,7 @@ public class Base64 {
}
public static void main(String[] args) {
test();
//test();
if (args.length == 0) {
help();
return;
@ -152,6 +152,10 @@ public class Base64 {
private static void runApp(String args[]) {
try {
if ("encodestring".equalsIgnoreCase(args[0])) {
System.out.println(encode(args[1].getBytes()));
return;
}
InputStream in = System.in;
OutputStream out = System.out;
if (args.length >= 3) {
@ -399,6 +403,7 @@ public class Base64 {
* replacing / with ~, and + with -
*/
private static byte[] safeDecode(String source, boolean useStandardAlphabet) {
if (source == null) return null;
String toDecode = null;
if (useStandardAlphabet) {
toDecode = source;


@ -21,6 +21,7 @@ import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.io.UnsupportedEncodingException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
@ -880,5 +881,35 @@ public class DataHelper {
ReusableGZIPInputStream.release(in);
return rv;
}
public static byte[] getUTF8(String orig) {
if (orig == null) return null;
try {
return orig.getBytes("UTF-8");
} catch (UnsupportedEncodingException uee) {
throw new RuntimeException("no utf8!?");
}
}
public static byte[] getUTF8(StringBuffer orig) {
if (orig == null) return null;
return getUTF8(orig.toString());
}
public static String getUTF8(byte orig[]) {
if (orig == null) return null;
try {
return new String(orig, "UTF-8");
} catch (UnsupportedEncodingException uee) {
throw new RuntimeException("no utf8!?");
}
}
public static String getUTF8(byte orig[], int offset, int len) {
if (orig == null) return null;
try {
return new String(orig, offset, len, "UTF-8");
} catch (UnsupportedEncodingException uee) {
throw new RuntimeException("No utf8!?");
}
}
}
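The new getUTF8() overloads wrap the checked UnsupportedEncodingException so callers get simple, symmetric String/byte[] conversions; a short usage sketch:

import net.i2p.data.DataHelper;

public class UTF8Example {
    public static void main(String args[]) {
        byte utf8[] = DataHelper.getUTF8("blé & späm");              // String -> UTF-8 bytes
        String whole = DataHelper.getUTF8(utf8);                      // bytes -> String
        String slice = DataHelper.getUTF8(utf8, 0, utf8.length);      // same, with offset/length
        System.out.println(whole.equals(slice));                      // prints "true"
        System.out.println(DataHelper.getUTF8((String)null) == null); // nulls pass through as null
    }
}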


@ -50,6 +50,7 @@ public class RouterInfo extends DataStructureImpl {
private volatile boolean _hashCodeInitialized;
public static final String PROP_NETWORK_ID = "netId";
public static final String PROP_CAPABILITIES = "caps";
public RouterInfo() {
setIdentity(null);
@ -298,6 +299,33 @@ public class RouterInfo extends DataStructureImpl {
}
return -1;
}
/**
* what special capabilities this router offers
*
*/
public String getCapabilities() {
if (_options == null) return "";
String capabilities = null;
synchronized (_options) {
capabilities = _options.getProperty(PROP_CAPABILITIES);
}
if (capabilities != null)
return capabilities;
else
return "";
}
public void addCapability(char cap) {
if (_options == null) _options = new OrderedProperties();
synchronized (_options) {
String caps = _options.getProperty(PROP_CAPABILITIES);
if (caps == null)
_options.setProperty(PROP_CAPABILITIES, ""+cap);
else if (caps.indexOf(cap) == -1)
_options.setProperty(PROP_CAPABILITIES, caps + cap);
}
}
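/*
 * Illustrative usage (not part of this change): a capability is just a
 * character appended to the "caps" option, e.g.
 *   info.addCapability('X');                                  // 'X' stands in for a real code
 *   boolean has = info.getCapabilities().indexOf('X') != -1;
 * Only the option name (PROP_CAPABILITIES = "caps") is fixed here; the
 * meaning of individual characters is up to the publishing and consuming code.
 */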
/**
* Get the routing key for the structure using the current modifier in the RoutingKeyGenerator.


@ -66,6 +66,15 @@ public class RateStat {
return rv;
}
public double getLifetimeAverageValue() {
if ( (_rates == null) || (_rates.length <= 0) ) return 0;
return _rates[0].getLifetimeAverageValue();
}
public double getLifetimeEventCount() {
if ( (_rates == null) || (_rates.length <= 0) ) return 0;
return _rates[0].getLifetimeEventCount();
}
public Rate getRate(long period) {
for (int i = 0; i < _rates.length; i++) {
if (_rates[i].getPeriod() == period) return _rates[i];


@ -0,0 +1,72 @@
package net.i2p.util;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import net.i2p.I2PAppContext;
/**
*
*/
public class EepGetScheduler implements EepGet.StatusListener {
private I2PAppContext _context;
private List _urls;
private List _localFiles;
private String _proxyHost;
private int _proxyPort;
private int _curURL;
private EepGet.StatusListener _listener;
public EepGetScheduler(I2PAppContext ctx, List urls, List localFiles, String proxyHost, int proxyPort, EepGet.StatusListener lsnr) {
_context = ctx;
_urls = urls;
_localFiles = localFiles;
_proxyHost = proxyHost;
_proxyPort = proxyPort;
_curURL = -1;
_listener = lsnr;
}
public void fetch() {
I2PThread t = new I2PThread(new Runnable() { public void run() { fetchNext(); } }, "EepGetScheduler");
t.setDaemon(true);
t.start();
}
private void fetchNext() {
_curURL++;
if (_curURL >= _urls.size()) return;
String url = (String)_urls.get(_curURL);
String out = EepGet.suggestName(url);
if ( (_localFiles != null) && (_localFiles.size() > _curURL) ) {
File f = (File)_localFiles.get(_curURL);
out = f.getAbsolutePath();
} else {
if (_localFiles == null)
_localFiles = new ArrayList(_urls.size());
_localFiles.add(new File(out));
}
EepGet get = new EepGet(_context, ((_proxyHost != null) && (_proxyPort > 0)), _proxyHost, _proxyPort, 0, out, url);
get.addStatusListener(this);
get.fetch();
}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_listener.attemptFailed(url, bytesTransferred, bytesRemaining, currentAttempt, numRetries, cause);
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {
_listener.bytesTransferred(alreadyTransferred, currentWrite, bytesTransferred, bytesRemaining, url);
}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_listener.transferComplete(alreadyTransferred, bytesTransferred, bytesRemaining, url, outputFile);
fetchNext();
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_listener.transferFailed(url, bytesTransferred, bytesRemaining, currentAttempt);
fetchNext();
}
}
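EepGetScheduler fetches its URLs strictly one at a time: fetch() spawns a single daemon thread, and each transferComplete()/transferFailed() callback kicks off the next download. A rough caller sketch follows; it assumes the four callbacks above make up the whole EepGet.StatusListener interface and that the default eepproxy listens on localhost:4444 (the second URL is hypothetical).

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import net.i2p.I2PAppContext;
import net.i2p.util.EepGet;
import net.i2p.util.EepGetScheduler;

public class SchedulerExample implements EepGet.StatusListener {
    public static void main(String args[]) {
        List urls = new ArrayList();
        urls.add("http://syndiemedia.i2p/archive/archive.txt");
        urls.add("http://syndiemedia.i2p/archive/index.html");      // hypothetical second file
        List files = new ArrayList();
        files.add(new File("archive.txt"));
        files.add(new File("index.html"));
        new EepGetScheduler(I2PAppContext.getGlobalContext(), urls, files,
                            "localhost", 4444, new SchedulerExample()).fetch();
        // fetch() returns immediately; keep the JVM alive while the daemon thread works
        try { Thread.sleep(60*1000); } catch (InterruptedException ie) {}
    }
    public void transferComplete(long already, long transferred, long remaining, String url, String outputFile) {
        System.out.println("fetched " + url + " -> " + outputFile);
    }
    public void transferFailed(String url, long transferred, long remaining, int currentAttempt) {
        System.out.println("gave up on " + url);
    }
    public void attemptFailed(String url, long transferred, long remaining, int currentAttempt, int numRetries, Exception cause) {}
    public void bytesTransferred(long already, int currentWrite, long transferred, long remaining, String url) {}
}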


@ -0,0 +1,189 @@
package net.i2p.util;
import java.io.*;
import java.net.*;
import java.util.*;
import net.i2p.I2PAppContext;
/**
* Simple helper for uploading files and such via HTTP POST (rfc 1867)
*
*/
public class EepPost {
public static void main(String args[]) {
EepPost e = new EepPost();
Map fields = new HashMap();
fields.put("key", "value");
fields.put("key1", "value1");
fields.put("key2", "value2");
fields.put("blogpost0", new File("/home/i2p/1.snd"));
fields.put("blogpost1", new File("/home/i2p/2.snd"));
fields.put("blogpost2", new File("/home/i2p/2.snd"));
fields.put("blogpost3", new File("/home/i2p/2.snd"));
fields.put("blogpost4", new File("/home/i2p/2.snd"));
fields.put("blogpost5", new File("/home/i2p/2.snd"));
e.postFiles("http://localhost:7653/import.jsp", null, -1, fields, null);
//e.postFiles("http://localhost/cgi-bin/read.pl", null, -1, fields, null);
//e.postFiles("http://localhost:2001/import.jsp", null, -1, fields, null);
}
/**
* Submit an HTTP POST to the given URL (using the proxy if specified),
* uploading the given fields. If the field's value is a File object, then
* that file is uploaded, and if the field's value is a String object, the
* value is posted for that particular field. Multiple values for one
* field name are not currently supported.
*
*/
public void postFiles(String url, String proxyHost, int proxyPort, Map fields, Runnable onCompletion) {
I2PThread postThread = new I2PThread(new Runner(url, proxyHost, proxyPort, fields, onCompletion));
postThread.start();
}
private class Runner implements Runnable {
private String _url;
private String _proxyHost;
private int _proxyPort;
private Map _fields;
private Runnable _onCompletion;
public Runner(String url, String proxy, int port, Map fields, Runnable onCompletion) {
_url = url;
_proxyHost = proxy;
_proxyPort = port;
_fields = fields;
_onCompletion = onCompletion;
}
public void run() {
try {
URL u = new URL(_url);
String h = u.getHost();
int p = u.getPort();
if (p <= 0)
p = 80;
String path = u.getPath();
boolean isProxy = true;
if ( (_proxyHost == null) || (_proxyPort <= 0) ) {
isProxy = false;
_proxyHost = h;
_proxyPort = p;
}
Socket s = new Socket(_proxyHost, _proxyPort);
OutputStream out = s.getOutputStream();
String sep = getSeparator();
long length = calcContentLength(sep, _fields);
String header = getHeader(isProxy, path, h, p, sep, length);
System.out.println("Header: \n" + header);
out.write(header.getBytes());
out.flush();
if (false) {
out.write(("--" + sep + "\ncontent-disposition: form-data; name=\"field1\"\n\nStuff goes here\n--" + sep + "--\n").getBytes());
} else {
sendFields(out, sep, _fields);
}
out.flush();
BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
String line = null;
while ( (line = in.readLine()) != null)
System.out.println("recv: [" + line + "]");
out.close();
} catch (Exception e) {
e.printStackTrace();
} finally {
if (_onCompletion != null)
_onCompletion.run();
}
}
}
private long calcContentLength(String sep, Map fields) {
long len = 0;
for (Iterator iter = fields.keySet().iterator(); iter.hasNext(); ) {
String key = (String)iter.next();
Object val = fields.get(key);
if (val instanceof File) {
File f = (File)val;
len += ("--" + sep + "\nContent-Disposition: form-data; name=\"" + key + "\"; filename=\"" + f.getName() + "\"\n").length();
//len += ("Content-length: " + f.length() + "\n").length();
len += ("Content-Type: application/octet-stream\n\n").length();
len += f.length();
len += 1; // nl
} else {
len += ("--" + sep + "\nContent-Disposition: form-data; name=\"" + key + "\"\n\n").length();
len += val.toString().length();
len += 1; // nl
}
}
len += 2 + sep.length() + 2;
//len += 2;
return len;
}
private void sendFields(OutputStream out, String separator, Map fields) throws IOException {
for (Iterator iter = fields.keySet().iterator(); iter.hasNext(); ) {
String field = (String)iter.next();
Object val = fields.get(field);
if (val instanceof File)
sendFile(out, separator, field, (File)val);
else
sendField(out, separator, field, val.toString());
}
out.write(("--" + separator + "--\n").getBytes());
}
private void sendFile(OutputStream out, String separator, String field, File file) throws IOException {
long len = file.length();
out.write(("--" + separator + "\n").getBytes());
out.write(("Content-Disposition: form-data; name=\"" + field + "\"; filename=\"" + file.getName() + "\"\n").getBytes());
//out.write(("Content-length: " + len + "\n").getBytes());
out.write(("Content-Type: application/octet-stream\n\n").getBytes());
FileInputStream in = new FileInputStream(file);
byte buf[] = new byte[1024];
int read = -1;
while ( (read = in.read(buf)) != -1)
out.write(buf, 0, read);
out.write("\n".getBytes());
in.close();
}
private void sendField(OutputStream out, String separator, String field, String val) throws IOException {
out.write(("--" + separator + "\n").getBytes());
out.write(("Content-Disposition: form-data; name=\"" + field + "\"\n\n").getBytes());
out.write(val.getBytes());
out.write("\n".getBytes());
}
private String getHeader(boolean isProxy, String path, String host, int port, String separator, long length) {
StringBuffer buf = new StringBuffer(512);
buf.append("POST ");
if (isProxy) {
buf.append("http://").append(host);
if (port != 80)
buf.append(":").append(port);
}
buf.append(path);
buf.append(" HTTP/1.1\n");
buf.append("Host: ").append(host);
if (port != 80)
buf.append(":").append(port);
buf.append("\n");
buf.append("Connection: close\n");
buf.append("Content-length: ").append(length).append("\n");
buf.append("Content-type: multipart/form-data, boundary=").append(separator);
buf.append("\n");
buf.append("\n");
return buf.toString();
}
private String getSeparator() {
if (false)
return "ABCDEFG";
if (false)
return "------------------------" + new java.util.Random().nextLong();
byte separator[] = new byte[32]; // 2^-128 chance of this being a problem
I2PAppContext.getGlobalContext().random().nextBytes(separator);
StringBuffer sep = new StringBuffer(48);
for (int i = 0; i < separator.length; i++)
sep.append((char)((int)'a' + (int)(separator[i]&0x0F))).append((char)((int)'a' + (int)((separator[i] >>> 4) & 0x0F)));
return sep.toString();
}
}
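For reference, the request body that sendFields()/sendFile() produce looks roughly like the sample below. Note that the helper uses bare \n line endings and announces the boundary as "Content-type: multipart/form-data, boundary=..." (comma rather than semicolon); stricter servers may reject those deviations from RFC 1867, though the MultiPartRequest-based import.jsp above appears to accept them. The &lt;separator&gt; is the 64-character lowercase string produced by getSeparator(), and &lt;raw file bytes&gt; is a placeholder for the attachment contents.

--<separator>
Content-Disposition: form-data; name="key"

value
--<separator>
Content-Disposition: form-data; name="blogpost0"; filename="1.snd"
Content-Type: application/octet-stream

<raw file bytes>
--<separator>--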


@ -141,14 +141,17 @@ public class LogManager {
public Log getLog(Class cls, String name) {
Log rv = null;
String scope = Log.getScope(name, cls);
boolean isNew = false;
synchronized (_logs) {
rv = (Log)_logs.get(scope);
if (rv == null) {
rv = new Log(this, cls, name);
_logs.put(scope, rv);
isNew = true;
}
}
updateLimit(rv);
if (isNew)
updateLimit(rv);
return rv;
}
public List getLogs() {
@ -642,7 +645,7 @@ public class LogManager {
public void shutdown() {
_log.log(Log.WARN, "Shutting down logger");
_writer.flushRecords();
_writer.flushRecords(false);
}
private static int __id = 0;


@ -59,7 +59,8 @@ class LogWriter implements Runnable {
}
}
public void flushRecords() {
public void flushRecords() { flushRecords(true); }
public void flushRecords(boolean shouldWait) {
try {
List records = _manager._removeAll();
if (records == null) return;
@ -77,11 +78,13 @@ class LogWriter implements Runnable {
} catch (Throwable t) {
t.printStackTrace();
} finally {
try {
synchronized (this) {
this.wait(10*1000);
if (shouldWait) {
try {
synchronized (this) {
this.wait(10*1000);
}
} catch (InterruptedException ie) { // nop
}
} catch (InterruptedException ie) { // nop
}
}
}


@ -1,4 +1,123 @@
$Id: history.txt,v 1.221 2005/08/01 22:26:51 duck Exp $
$Id: history.txt,v 1.234 2005/09/01 01:55:02 jrandom Exp $
* 2005-09-01 0.6.0.4 released
2005-09-01 jrandom
* Don't send out a netDb store of a router if it is more than a few hours
old, even if someone asked us for it.
2005-08-31 jrandom
* Don't publish leaseSets to the netDb if they will never be looked for -
namely, if they are for destinations that only establish outbound
streams. I2PTunnel's 'client' and 'httpclient' proxies have been
modified to tell the router that it doesn't need to publish their
leaseSet (by setting the I2CP config option 'i2cp.dontPublishLeaseSet'
to 'true').
* Don't publish the top 10 peer rankings of each router in the netdb, as
it isn't being watched right now.
2005-08-29 jrandom
* Added the new test Floodfill netDb
2005-08-27 jrandom
* Minor logging and optimization tweaks in the router and SDK
* Use ISO-8859-1 in the XML files (thanks redzara!)
* The consolePassword config property can now be used to bypass the router
console's nonce checking, allowing CLI restarts
2005-08-24 jrandom
* Catch errors with corrupt tunnel messages more gracefully (no need to
kill the thread and cause an OOM...)
* Don't skip shitlisted peers for netDb store messages, as they aren't
necessarily shitlisted by other people (though they probably are).
* Adjust the netDb store per-peer timeout based on each particular peer's
profile (timeout = 4x their average netDb store response time)
* Don't republish leaseSets to *failed* peers - send them to peers who
replied but just didn't know the value.
* Set a 5 second timeout on the I2PTunnelHTTPServer reading the client's
HTTP headers, rather than blocking indefinitely. HTTP headers should be
sent entirely within the first streaming packet anyway, so this won't be
a problem.
* Don't use the I2PTunnel*Server handler thread pool by default, as it may
prevent any clients from accessing the server if the handlers get
blocked by the streaming lib or other issues.
* Don't overwrite a known status (OK/ERR-Reject/ERR-SymmetricNAT) with
Unknown.
2005-08-23 jrandom
* Removed the concept of "no bandwidth limit" - if none is specified, it's
16KBps in/out.
* Include ack packets in the per-peer cwin throttle (they were part of the
bandwidth limit though).
* Tweak the SSU cwin operation to get more accurate estimates under
congestion.
* SSU improvements to resend more efficiently.
* Added a basic scheduler to eepget to fetch multiple files sequentially.
* 2005-08-21 0.6.0.3 released
2005-08-21 jrandom
* If we already have an established SSU session with the Charlie helping
test us, cancel the test with the status of "unknown".
2005-08-17 jrandom
* Revise the SSU peer testing protocol so that Bob verifies Charlie's
viability before agreeing to Alice's request. This doesn't work with
older SSU peer test builds, but is backwards compatible (older nodes
won't ask newer nodes to participate in tests, and newer nodes won't
ask older nodes to either).
2005-08-12 jrandom
* Keep detailed stats on the peer testing, publishing the results in the
netDb.
* Don't overwrite the status with 'unknown' unless we haven't had a valid
status in a while.
* Make sure to avoid shitlisted peers for peer testing.
* When we get an unknown result to a peer test, try again soon afterwards.
* When a peer tells us that our address is different from what we expect,
if we've done a recent peer test with a result of OK, fire off a peer
test to make sure our IP/port is still valid. If our test is old or the
result was not OK, accept their suggestion, but queue up a peer test for
later.
* Don't try to do a netDb store to a shitlisted peer, and adjust the way
we monitor netDb store progress (to clear up the high netDb.storePeers
stat)
2005-08-10 jrandom
* Deployed the peer testing implementation to be run every few minutes on
each router, as well as any time the user requests a test manually. The
tests do not reconfigure the ports at the moment, merely determine under
what conditions the local router is reachable. The status shown in the
top left will be "ERR-SymmetricNAT" if the user's IP and port show up
differently for different peers, "ERR-Reject" if the router cannot
receive unsolicited packets or the peer helping test could not find a
collaborator, "Unknown" if the test has not been run or the test
participants were unreachable, or "OK" if the router can receive
unsolicited connections and those connections use the same IP and port.
* 2005-08-08 0.6.0.2 released
2005-08-08 jrandom
* Add a configurable throttle to the number of concurrent outbound SSU
connection negotiations (via i2np.udp.maxConcurrentEstablish=4). This
may help those with slow connections to get integrated at the start.
* Further fixlets to the streaming lib
2005-08-07 Complication
* Display the average clock skew for both SSU and TCP connections
2005-08-07 jrandom
* Fixed the long standing streaming lib bug where we could lose the first
packet on retransmission.
* Avoid an NPE when a message expires on the SSU queue.
* Adjust the streaming lib's window growth factor with an additional
Vegas-esque congestion detection algorithm.
* Removed an unnecessary SSU session drop
* Reduced the MTU (until we get a working PMTU lib)
* Defer tunnel acceptance until we know how to reach the next hop,
rejecting it if we can't find them in time.
* If our netDb store of our leaseSet fails, give it a few seconds before
republishing.
* 2005-08-03 0.6.0.1 released


@ -1,6 +1,13 @@
; TC's hosts.txt guaranteed freshness
; $Id: hosts.txt,v 1.149 2005/07/11 18:23:22 jrandom Exp $
; $Id: hosts.txt,v 1.156 2005/08/19 20:19:51 cervantes Exp $
; changelog:
; (1.178) added syndie.i2p and syndiemedia.i2p
; (1.177) added irc.freshcoffee.i2p
; (1.176) added surrender.adab.i2p
; (1.175) added terror.i2p
; (1.174) added irc.arcturus.i2p
; (1.173) added tracker.postman.i2p, hq.postman.i2p
; (1.172) added i2p-bt.postman.i2p
; (1.171) added luckypunk.i2p
; (1.170) added bash.i2p, stats.i2p
; (1.169) added archive.i2p, www.fr.i2p, romster.i2p, marshmallow.i2p, openforums.i2p
@ -404,4 +411,14 @@ marshmallow.i2p=99AaeKrGXmKPUaX256IeRqjvjOLmN4xZnUemTEwFAdanLDa0eMI2UitjrSoq6gy~
openforums.i2p=9fTKl1CC1o2WrxBbE3fvr4xOUGse~qMjmhfn0lpDGXZ0Ohk55ocODIg0AFlPnQd4AchS0VFk0UyWqoZ7B93tyKfOypuVqHl7stucR6PSYMda9QKRlJugX9vwOKKMFOLnu5Xf5T7s7cKqSX8qUDdqeUjDPXyme~xYGhFyn4GftWURF0EesuhY4qP-GTQPEQef2f4DQmTKB1G62~WrRLd6AbvRBmZ4l0qSaKMd7FRjPhwmiXI9JhS~3gsKZ2DUP1Y7th4Lw77lnIdYVejxZ5ZU-PIYd2xnJK~WsazvRp2QOBPa8Vn8X6302IyAbR52zZEB50jucwK4oQ7FbP-Ws4bLA0WoTe1ZxYtDYLo1R5BUqie0cGP~fdc14oR-0NhAEz8VBnTQ4m9CMoz-oaT2R6xWaG4q5MH4671vULudbFeixz2NaGYjuiTO2rCcsvliCibn8sAA~SfRIk~hZF0ZkjyG0SAysVjO~qwjzDBVirSz5YOPp~ZdntbI5hQzjzMIhgxFAAAA
bash.i2p=MsTeG5WIguhorKQ~9xUHVAMHOpia7Y09oid-7vCKS0titHeArGGoal48tkxVD30sXZjxTVBu4lkH9j4oPiQV30wpO0fkSMh1ryiIU1uOTLaAcbNtKCdCixpDchjjJKmxvnTa7LOc1Zft~XkS0ydsxwxsXa9uPU2G3H7WBacJ2v0zxLWyjjJnYIQoJCEHFZrzr~nJ58csCJP-TbJDRegz-0J5LrhZsjbNBFckGrLFtqbRmPwJkS2RU1Kx7SQ0hfN4eFvO7NvratDIo9j3q-OuXyHQm-1RS5XzL5ZxJFHYeaPWgKcLouWHp0MX1OKBnGitGGe1EPzmqf3LwgU8RIohLoFi3CaSc4eO3hYbKLShwz0fzNWFYUeQhLrtLPYA0~fq~lflXpeaXIPOHcJLST0GJd3uWGleS7iV4eXG4LvVlVEEDSYyNNoACmovQRtZOo-DHdUN~iRWy~skNBddX5-hTdAaQyk0ZsfLIF0-pBoWnFW8bFjOcXTF0nNOOgRzXSz9AAAA
stats.i2p=Okd5sN9hFWx-sr0HH8EFaxkeIMi6PC5eGTcjM1KB7uQ0ffCUJ2nVKzcsKZFHQc7pLONjOs2LmG5H-2SheVH504EfLZnoB7vxoamhOMENnDABkIRGGoRisc5AcJXQ759LraLRdiGSR0WTHQ0O1TU0hAz7vAv3SOaDp9OwNDr9u902qFzzTKjUTG5vMTayjTkLo2kOwi6NVchDeEj9M7mjj5ySgySbD48QpzBgcqw1R27oIoHQmjgbtbmV2sBL-2Tpyh3lRe1Vip0-K0Sf4D-Zv78MzSh8ibdxNcZACmZiVODpgMj2ejWJHxAEz41RsfBpazPV0d38Mfg4wzaS95R5hBBo6SdAM4h5vcZ5ESRiheLxJbW0vBpLRd4mNvtKOrcEtyCvtvsP3FpA-6IKVswyZpHgr3wn6ndDHiVCiLAQZws4MsIUE1nkfxKpKtAnFZtPrrB8eh7QO9CkH2JBhj7bG0ED6mV5~X5iqi52UpsZ8gnjZTgyG5pOF8RcFrk86kHxAAAA
luckypunk.i2p=cWFSoPjjQbtISkSzz2-dZf~ZlN1ZwTqG1DDEpcS3pOW8VzrOR6pQaUIQpzdx6MWa8tHplPlFiw~Y-KX6UlXfr557X7L9s-mld9hlQwAy4FGc6eGAe3XmtwKLLCHhmsTozea3nRE9nOIsPwntz7n~f1yu8PrkehzrMJfeqOIPA5teyyfxmeKwa3Jnr4YC8qc2EWSACgh-tAORnjVpMrByuFRaW-J~6cFBJ7gobC4aZfhIlcDFIt96VT4caUlsqKQjgQQ9k8oPrlbFsQHTmNB7331aapstQZ8WJsRIqkSLOUIGrE1XOzUFGthmk5urwcNcsOZmunlMtMRS2RN4rJp2aDFfr~wPGkh5QtueRAT1mSbn116qqRRo6PQvARZQk0oqtm97bqfqdb5SbELHJpbqpJLoAiNQywIgXwgFLP8LYJnQO64EreVIvOjhevZUav5kbQZVE41NjJcT5ZmDtQLHVA~gVsHiYe0KxTRoAOwrlsx~Z3vExq3I1Yd-vhS6ATA5AAAA
i2p-bt.postman.i2p=pL4Xjp3RFu4trp35Z9kLrsOX~pqizwk6x6iydmx1JAjNdhzppPxUvix8NRLEu~n-LURObrsMX6mTF~VHWEDTBBCaxUw6y9NGvKeTRRtCirK6NXFQQIAo-STGIA3z6DPN4G5IGtVsOOKx0ucjh7rkHH0k7p4g4rxnbQ0S3XWSAOqeIpnK-pNQDCr~p9rd2PAGCQVsWLSlOJzUITTrQe2~w2by-eysIJXcO59iiXloj9O3JLY3Yzy3fjjjXHnXH5WeP5kbWOuufXBUF2A9mrEzQo1a0hZNb0bnVFevtdNLMuiB4cNxWE6UfBLZXmqwW2JNwFs799rxgpVEjUwJiYttv2tZoMdESOivlc3IZmYESWa~2LsO-23nOYx55X5nfpJ~HBqipgecTgS37F4GNa5m-d9FaMeAcBy23X-nBDFxCob3PMxBAQ9adpTZj-IE04RvSbara28tu4~cQ9KpzYeRQ7Mt-PPuKo4gzxMZUo4IL1aGXRBUb1U~Ph7Lh4r8LZc4AAAA
tracker.postman.i2p=YRgrgTLGnbTq2aZOZDJQ~o6Uk5k6TK-OZtx0St9pb0G-5EGYURZioxqYG8AQt~LgyyI~NCj6aYWpPO-150RcEvsfgXLR~CxkkZcVpgt6pns8SRc3Bi-QSAkXpJtloapRGcQfzTtwllokbdC-aMGpeDOjYLd8b5V9Im8wdCHYy7LRFxhEtGb~RL55DA8aYOgEXcTpr6RPPywbV~Qf3q5UK55el6Kex-6VCxreUnPEe4hmTAbqZNR7Fm0hpCiHKGoToRcygafpFqDw5frLXToYiqs9d4liyVB-BcOb0ihORbo0nS3CLmAwZGvdAP8BZ7cIYE3Z9IU9D1G8JCMxWarfKX1pix~6pIA-sp1gKlL1HhYhPMxwyxvuSqx34o3BqU7vdTYwWiLpGM~zU1~j9rHL7x60pVuYaXcFQDR4-QVy26b6Pt6BlAZoFmHhPcAuWfu-SFhjyZYsqzmEmHeYdAwa~HojSbofg0TMUgESRXMw6YThK1KXWeeJVeztGTz25sL8AAAA
hq.postman.i2p=P0T4cDiNYzHWcroPCZrH9Co9kvvHeZlLJ0jNn4qwO5rEDTZbNDzAHTuU2GdgUDd9yi~vLTiPC7T6BUMQz2y-Spg-L0D8qiGCBcFZ5ZPWKTDi-Am3WJIbeM~9psndc2X-sG09LrKb7Da3K1hd7WFA37tYXpeiavo5C1LbRZU0H0X2BrGI5fVt~NmUH0D1Vih2P0zY1vx9pb4pjobPyzL~34u5sO~YsLZjfWHm0dK8MUaypFm9bRDNOOKLO5Q32qHPzrDJ9jx6OLZf4g2UkDrJF3F3lzcdO84oFEP0xsYyre5UoMWvtxTQNWc-NdSq5cvvNJeudys3gudhDJft31ypwRPDmI3Z8Jx1~qS1KOQosCtbd3vsArz-ghC9hGFQsbw~leN0XgCZhqHrakNCQ5d7jMXarBtgFYCd51DDS8NbXkli7~C2Yn7-UGMGTlbHgmPB8dsQXT1PfcGVOw3tpHvliMZVUzHX0wjgfWyDzLQzIRKkB1G28e-oDqrA88vEdzz6AAAA
irc.arcturus.i2p=8d5AIidKU7JudLbGi94VCS7Gu~SrSb4rxBp6ADbc9o0MiOp0vrKGVTdrWiDDLmlneRLqVTPIRmBkxB9PAtRZYmdvstiae9zat5twU7T~xA-mf95t1HZXPTMEnxOB6ZX~fSlxLWR3robKDq7L1DOF9aBXT5fw5KcdZmAw2pDeslobozNkB~38siDW3VEKQrNK6SlgkbcQ2ob5fQjUEN3IPnLbyhP4HAap0CppUfX3ix0YU1EP86XHig8ZUgmq2YQ8LpBOGZ21yHQ3pGDHuvgHHGl7bQqz2TV5MnyUaHkCAHu6d~agqdJc4ooORJZMUkWnFIbL4ioJbzZ9zPIPDam81Qw04MuTY5vPEz1Hx9egdWzJX-kCjFv~3-SPX0QVGYAY-cg~fIJfxH0G3jrjXOfGO6NelDiuyTGhvhCR1Y2O6jTqFyVyUc-WZAHAs2qRacfR-TtEpP2-s7fY191aWwxycD3tbXx1F0FG6AYJcnFhUFFp2uoUCryrY7HA6NA5lIDfAAAA
terror.i2p=WJ23KjbgLR19NidOl-0TU4kZDfIljbWoTTen07d9Wi-ZFZhaJpF6Kxid-60vhBmEjzoFId1IKuaCZtLK~42whQ7yvsYD0fMda9gZE-FmpWGGwneRHQxtYRU0Tx8sraSipCZ3Sq9i~cjTbyHpFt9YtObvhWOOi1T2AjWRewvTGWKKcxRL1Bfny2ZpN84ADnqZwJp-gmbhgDPB0CFYi67BzIRTYnRseVILepokUHzPWLiIaJM9GYVgyNp18XpFxkjGZ8NYP6phdz2oJVmsHate6W-3LIAyo5mfIw95nhr5PHezQz1W-AeCubvl2gQ04knFY28DAMuxMyWeyIfDROxuPgcwzUMdw6q8QnEPJKr9vag-dnPh8C8U4Ur~DlS5BgHoJvv5saFkU~I1hvEK9YdPxNx4IOSnHsZZdjCXA8d4zAd7X-kAZ56cV4EABkGzEx9SXwyroA--BT2GeywNGf2HvWTV-ElQ~QkqD-rFgHZspjI-q8iKrcXp4Oz-8CJzKkzjAAAA
surrender.adab.i2p=ReLukCcBIuGvQwm0~oXMYDjIjRSsQXHdRf7ANaJpUoCx5qRPg340308vKWW0DiijdnRNINf-C4SoMxK7mjx5itttBP~AwAK2wJc9ZFX-9qJ9Gg8EBcvxSRn6taasjw5zQCvEukmsPdVwCBQPN~5NGotDv~OZ5TsRtTmAT3ovAnfFFDx~Cu1QsLW3u~K9TwR989H-iVvZyGFV-V~L5V3WISRqx-qnjZ3-fOWfhj6aNT0q3fWMptMKQAqYjt6rCjAwzt0ppDFfIEVrBDc3ik2nziefRiFKv~79tHNjZmMEDrKEiLu~bXn5HNy0hF24dsCtqmoGZ5Dfzst1XriIkiVLvF-cxw6HqdNhsGi9AdoAeNhpVVaYPKYoEiobT31VAGvTMoodbgajT-W3gYMWs74enUlHXJ7eJd31OISV5XWDH-NgsbhpcG1HK8GPjlBeYsGUs4V-lcpig8k8Pibe7Y~Q8eyqjWHGSlcTW7TehRrxZRMskp3lAG-oFLCapHaMVKMYAAAA
irc.freshcoffee.i2p=VM42PrZcVJyDFV6Eqt1WkqZ6G260D72u6wsU8Bxt0oSDd5UtCkcrduYMl-~9bgc6AZJpJ1absO5-opUaSqA~0ypGox6gdlvKVCHqHhxR89VLy8nO-kS-cVBXb8TeUq5MdH3djZIQ8zKv9YL5Q9lGW2Nd1od~re~w2F5-AWM25y9P91Pu6wymokYlZoaIffG3O8aXA~0jBweqnE7epuSK5e~kZ~5omDBnfYlNC2MoNxgEiNbuyQdfXbLf5QjHzEIlIv1-BSym-OY9fCwqRuJc4eCKaXrMg6hy2U-HEkMz80Gq-2gEI-uxqTJHnNG4h276rU7ej1FpCPrsnXS0zSj7ppbksOBCrkqlNEYhy1wrCjoREfbBN9A1kHDTfT9cR73Ym8S2-incCzoQrcyJds-2KmXa5vfr5Pvt2v3SYXkrTKJzZXMhXotLP7CAzItVh~SXYMOQtiVd4NKXTgSmXVarewsHcbxnZuQUr0qimjAsTEJZPZMppQFNkfPAAqIoqz0wAAAA
syndie.i2p=n-9clS0jBJTzYl7QStNR5A34sN2Xq-HmIjDE~MRLpUkMtw-0KSnqYb11S581~~hx0xyby5v9dSETx84Hbkv2g7h3SdjswMlLe-8Z0~S3ssR-bp5Y5L3cpg~fHlgxjEqAC~SGZFCWR~ZyYlRbdd~tnOJ5Fij57acIpuTRRnRjImDWJeboFfKeXqDHeMhcyMi8JTxcCCo3vwAZ6G1qffpGbAgnktriYRnMn97H7BBK~KGCFjqwDo172kUORXpDNHUstCwT-Oi5oCb7EN6vPEwYbyiHVuApOFGzyu0IyBZ5SZvfifrQLUbtXj2enmgrcJByq1XIUy7-E3pqMnNgzdT2~34RzNX4aGVFBe0Rb~zEA3cdb9koLyVgLXYUbWqOmn~blbFDHTPAIyNDiFL1vM~8ZorBYEVTdeD8bvMvQgoPHZRcBIDFrtAmjaT0XXzaHL9T8sr6FYt2j1giLUrWllpWbjYb5eWdMwkPa0ke2PHCOO0lF1JKDjK9G3YiYM9z2MyUAAAA
syndiemedia.i2p=Rj9ZbkNHJEbbzrsuWGVR7jaBrpMvgw5D92WMtkbBDoDLyX8co17tJYJ58YpJ~cMaC9h8tnpwg15ws05U4Bb4lYvFyamnc9wfBSNwwdz2hgbIGHtBRMXh8Lb5NQZJLZdbFQlmd7jVSjpJc8vNVa-8lWGgy4ExryXa0Ps~HQLckIy6Fr0Fc18DS~G7aZk-6kKpvLQheN0ZRyzcMBDoniSG2z5MJlh6MMA0I5ZLAuUT9Ugg-cy9y05eTeIL6uVILc1LG5Dqc-5xbAGdIPi5d0~Ij2nqOA80PIyRR6zUKt4zbFmlO5bm3mndKYmoeL-XcKkw3BvEvXUBK4q6L43pjgH9PQHszGy-2rJXC7h41rxTxTo7Sn3NV04UI0Ixt2zm2ozaCiZdsLcbfrzwm3IER~2Jvsr4QJFCfuJhQKWEayPqKeRU5gRDnxtph3afYvYJ6Syp~OFTRIFEhkHxfg6WorKfuwVJZ~GbY8~Ptt4yMSa2u7RPsjEnarBru-tfIbDTpPt3AAAA


@ -1,14 +1,14 @@
<i2p.news date="$Date: 2005/07/27 15:16:44 $">
<i2p.release version="0.6.0.1" date="2005/07/27" minVersion="0.6"
<i2p.news date="$Date: 2005/08/09 13:55:31 $">
<i2p.release version="0.6.0.4" date="2005/09/01" minVersion="0.6"
anonurl="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/i2p/i2pupdate.sud"
publicurl="http://dev.i2p.net/i2p/i2pupdate.sud"
anonannouncement="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-July/000824.html"
publicannouncement="http://dev.i2p.net/pipermail/i2p/2005-July/000824.html" />
<i2p.notes date="2005/07/26"
anonurl="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-July/000823.html"
publicurl="http://dev.i2p.net/pipermail/i2p/2005-July/000823.html"
anonlogs="http://i2p/Nf3ab-ZFkmI-LyMt7GjgT-jfvZ3zKDl0L96pmGQXF1B82W2Bfjf0n7~288vafocjFLnQnVcmZd~-p0-Oolfo9aW2Rm-AhyqxnxyLlPBqGxsJBXjPhm1JBT4Ia8FB-VXt0BuY0fMKdAfWwN61-tj4zIcQWRxv3DFquwEf035K~Ra4SWOqiuJgTRJu7~o~DzHVljVgWIzwf8Z84cz0X33pv-mdG~~y0Bsc2qJVnYwjjR178YMcRSmNE0FVMcs6f17c6zqhMw-11qjKpY~EJfHYCx4lBWF37CD0obbWqTNUIbL~78vxqZRT3dgAgnLixog9nqTO-0Rh~NpVUZnoUi7fNR~awW5U3Cf7rU7nNEKKobLue78hjvRcWn7upHUF45QqTDuaM3yZa7OsjbcH-I909DOub2Q0Dno6vIwuA7yrysccN1sbnkwZbKlf4T6~iDdhaSLJd97QCyPOlbyUfYy9QLNExlRqKgNVJcMJRrIual~Lb1CLbnzt0uvobM57UpqSAAAA/meeting138"
publiclogs="http://www.i2p.net/meeting138" />
anonannouncement="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-September/000878.html"
publicannouncement="http://dev.i2p.net/pipermail/i2p/2005-September/000878.html" />
<i2p.notes date="2005/08/02"
anonurl="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-July/000826.html"
publicurl="http://dev.i2p.net/pipermail/i2p/2005-July/000826.html"
anonlogs="http://i2p/Nf3ab-ZFkmI-LyMt7GjgT-jfvZ3zKDl0L96pmGQXF1B82W2Bfjf0n7~288vafocjFLnQnVcmZd~-p0-Oolfo9aW2Rm-AhyqxnxyLlPBqGxsJBXjPhm1JBT4Ia8FB-VXt0BuY0fMKdAfWwN61-tj4zIcQWRxv3DFquwEf035K~Ra4SWOqiuJgTRJu7~o~DzHVljVgWIzwf8Z84cz0X33pv-mdG~~y0Bsc2qJVnYwjjR178YMcRSmNE0FVMcs6f17c6zqhMw-11qjKpY~EJfHYCx4lBWF37CD0obbWqTNUIbL~78vxqZRT3dgAgnLixog9nqTO-0Rh~NpVUZnoUi7fNR~awW5U3Cf7rU7nNEKKobLue78hjvRcWn7upHUF45QqTDuaM3yZa7OsjbcH-I909DOub2Q0Dno6vIwuA7yrysccN1sbnkwZbKlf4T6~iDdhaSLJd97QCyPOlbyUfYy9QLNExlRqKgNVJcMJRrIual~Lb1CLbnzt0uvobM57UpqSAAAA/meeting141"
publiclogs="http://www.i2p.net/meeting141" />
<h1>Congratulations on getting I2P installed!</h1>
</i2p.news>


@ -4,7 +4,7 @@
<info>
<appname>i2p</appname>
<appversion>0.6.0.1</appversion>
<appversion>0.6.0.4</appversion>
<authors>
<author name="I2P" email="support@i2p.net"/>
</authors>


@ -20,7 +20,7 @@ tunnel.1.type=client
tunnel.1.sharedClient=true
tunnel.1.interface=127.0.0.1
tunnel.1.listenPort=6668
tunnel.1.targetDestination=irc.duck.i2p,irc.baffled.i2p,irc.postman.i2p
tunnel.1.targetDestination=irc.postman.i2p,irc.freshcoffee.i2p,irc.arcturus.i2p
tunnel.1.i2cpHost=127.0.0.1
tunnel.1.i2cpPort=7654
tunnel.1.option.inbound.nickname=shared clients


@ -1,15 +1,16 @@
<i2p.news date="$Date: 2005/07/28 15:33:27 $">
<i2p.release version="0.6.0.1" date="2005/07/27" minVersion="0.6"
<i2p.news date="$Date: 2005/08/22 08:03:11 $">
<i2p.release version="0.6.0.4" date="2005/09/01" minVersion="0.6"
anonurl="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/i2p/i2pupdate.sud"
publicurl="http://dev.i2p.net/i2p/i2pupdate.sud"
anonannouncement="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-April/000824.html"
publicannouncement="http://dev.i2p.net/pipermail/i2p/2005-July/000824.html" />
<i2p.notes date="2005/07/27"
anonurl="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-July/000823.html"
publicurl="http://dev.i2p.net/pipermail/i2p/2005-July/000823.html"
anonlogs="http://i2p/Nf3ab-ZFkmI-LyMt7GjgT-jfvZ3zKDl0L96pmGQXF1B82W2Bfjf0n7~288vafocjFLnQnVcmZd~-p0-Oolfo9aW2Rm-AhyqxnxyLlPBqGxsJBXjPhm1JBT4Ia8FB-VXt0BuY0fMKdAfWwN61-tj4zIcQWRxv3DFquwEf035K~Ra4SWOqiuJgTRJu7~o~DzHVljVgWIzwf8Z84cz0X33pv-mdG~~y0Bsc2qJVnYwjjR178YMcRSmNE0FVMcs6f17c6zqhMw-11qjKpY~EJfHYCx4lBWF37CD0obbWqTNUIbL~78vxqZRT3dgAgnLixog9nqTO-0Rh~NpVUZnoUi7fNR~awW5U3Cf7rU7nNEKKobLue78hjvRcWn7upHUF45QqTDuaM3yZa7OsjbcH-I909DOub2Q0Dno6vIwuA7yrysccN1sbnkwZbKlf4T6~iDdhaSLJd97QCyPOlbyUfYy9QLNExlRqKgNVJcMJRrIual~Lb1CLbnzt0uvobM57UpqSAAAA/meeting139"
publiclogs="http://www.i2p.net/meeting138" />
Welcome to the new 0.6 series of releases, using the new SSU transport!
anonannouncement="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-September/000878.html"
publicannouncement="http://dev.i2p.net/pipermail/i2p/2005-September/000878.html" />
<i2p.notes date="2005/08/08"
anonurl="http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/pipermail/i2p/2005-July/000826.html"
publicurl="http://dev.i2p.net/pipermail/i2p/2005-July/000826.html"
anonlogs="http://i2p/Nf3ab-ZFkmI-LyMt7GjgT-jfvZ3zKDl0L96pmGQXF1B82W2Bfjf0n7~288vafocjFLnQnVcmZd~-p0-Oolfo9aW2Rm-AhyqxnxyLlPBqGxsJBXjPhm1JBT4Ia8FB-VXt0BuY0fMKdAfWwN61-tj4zIcQWRxv3DFquwEf035K~Ra4SWOqiuJgTRJu7~o~DzHVljVgWIzwf8Z84cz0X33pv-mdG~~y0Bsc2qJVnYwjjR178YMcRSmNE0FVMcs6f17c6zqhMw-11qjKpY~EJfHYCx4lBWF37CD0obbWqTNUIbL~78vxqZRT3dgAgnLixog9nqTO-0Rh~NpVUZnoUi7fNR~awW5U3Cf7rU7nNEKKobLue78hjvRcWn7upHUF45QqTDuaM3yZa7OsjbcH-I909DOub2Q0Dno6vIwuA7yrysccN1sbnkwZbKlf4T6~iDdhaSLJd97QCyPOlbyUfYy9QLNExlRqKgNVJcMJRrIual~Lb1CLbnzt0uvobM57UpqSAAAA/meeting141"
publiclogs="http://www.i2p.net/meeting141" />
&#149; 0.6.0.3 now brings you the new peer testing code - please verify your reachability!
<br />
&#149; A new Irc2p server is now online: irc.freshcoffee.i2p [<a href="http://forum.i2p.net/viewtopic.php?t=911" title="Connect using the vanilla internet">external link</a>] | [<a href="http://forum.i2p/viewtopic.php?t=911" title="Connect via the I2P Network">i2p link</a>]
<br />
</i2p.news>

View File

@ -5,9 +5,8 @@ the number of "Active: " peers rise, and you should see some local "destinations
listed (if not, <a href="#trouble">see below</a>). Once those are up, you can:</p>
<ul>
<li><b>chat anonymously</b> - fire up your own IRC client and connect to the
server at <b>localhost port 6668</b>. This points at one of two anonymously hosted
IRC servers (irc.duck.i2p and irc.baffled.i2p), but neither you nor they know
where the other is.</li>
server at <b>localhost port 6668</b>. This points at one of three anonymously hosted
IRC servers, but neither you nor they know where the other is.</li>
<li><b>browse "eepsites"</b> - on I2P there are anonymously hosted websites -
tell your browser to use the <b>HTTP proxy at localhost port 4444</b>, then
browse to an eepsite -
@ -65,8 +64,7 @@ hand side of the page will show up to help you when necessary).</p>
<p>If you are still having problems, you may want to review the information on the
<a href="http://www.i2p.net/">I2P website</a>, post up messages to the
<a href="http://forum.i2p.net/">I2P discussion forum</a>, or swing by #i2p or
#i2p-chat on IRC at <a href="irc://irc.freenode.net/#i2p">irc.freenode.net</a>,
<a href="http://www.invisiblechat.com/">invisiblechat/IIP</a>, or irc.duck.i2p (they're all
linked together).</p>
#i2p-chat on IRC at <a href="irc://irc.freenode.net/#i2p">irc.freenode.net</a>, irc.postman.i2p, irc.freshcoffee.i2p or
irc.arcturus.i2p (they're all linked together).</p>
<p><b>As a note, you can change this page by editing the file "docs/readme.html"</b></p>
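The readme above points local apps at the router's HTTP proxy on port 4444. A minimal sketch of fetching an eepsite through that proxy from Java, assuming a router is already running with the default proxy port (the target eepsite is just an example):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;

public class EepsiteFetch {
    public static void main(String[] args) throws Exception {
        // The router's HTTP proxy tunnel listens on localhost:4444 by default.
        Proxy i2pProxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("localhost", 4444));
        // A .i2p hostname is resolved inside the network by the proxy, not by DNS.
        URL url = new URL("http://forum.i2p/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection(i2pProxy);
        conn.setConnectTimeout(60 * 1000);   // tunnel setup can be slow
        conn.setReadTimeout(120 * 1000);
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        String line;
        while ((line = in.readLine()) != null)
            System.out.println(line);
        in.close();
    }
}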

View File

@ -1,4 +1,4 @@
<code>$Id: udp.html,v 1.14 2005/07/27 14:04:07 jrandom Exp $</code>
<code>$Id: udp.html,v 1.15 2005/08/03 13:58:13 jrandom Exp $</code>
<h1>Secure Semireliable UDP (SSU)</h1>
<b>DRAFT</b>
@ -573,8 +573,10 @@ quite simple:</p>
<pre>
Alice Bob Charlie
PeerTest ------------------&gt;
&lt;-------------PeerTest PeerTest-------------&gt;
PeerTest -------------------&gt;
PeerTest--------------------&gt;
&lt;-------------------PeerTest
&lt;-------------------PeerTest
&lt;------------------------------------------PeerTest
PeerTest------------------------------------------&gt;
&lt;------------------------------------------PeerTest
@ -592,7 +594,8 @@ that may be reached are as follows:</p>
up to a certain number of times, but if no response ever arrives,
she will know that her firewall or NAT is somehow misconfigured,
rejecting all inbound UDP packets even in direct response to an
outbound packet. Alternately, Bob may be down.</li>
outbound packet. Alternately, Bob may be down or unable to get
Charlie to reply.</li>
<li>If Alice doesn't receive a PeerTest message with the
expected nonce from a third party (Charlie), she will retransmit
@ -713,4 +716,4 @@ with either Bob or Charlie, but it is not required.</p>
<dd>If the peer address contains the 'B' capability, that means
they are willing and able to serve as an introducer - serving
as a Bob for an otherwise unreachable Alice.</dd>
</dl>
</dl>
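The peer test exchange and its failure modes shown above reduce to which replies Alice eventually sees. A hedged, self-contained sketch of how Alice might classify the outcome; the constants mirror the reachability statuses added to CommSystemFacade further down, and the method is illustrative rather than the router's actual SSU logic:

public class PeerTestOutcome {
    // Values mirror the new CommSystemFacade STATUS_* constants.
    static final short STATUS_OK = 0;                 // Charlie's reply reached our advertised address
    static final short STATUS_DIFFERENT = 1;          // Charlie saw a different source address (symmetric NAT)
    static final short STATUS_REJECT_UNSOLICITED = 2; // Bob answered, but Charlie's unsolicited packet never arrived
    static final short STATUS_UNKNOWN = 3;            // nothing usable came back (everything blocked, or Bob/Charlie failed us)

    static short classify(boolean bobReplied, boolean charlieReplied, boolean addressMatched) {
        if (!bobReplied)
            return STATUS_UNKNOWN;            // no reply even in direct response to our outbound packet
        if (!charlieReplied)
            return STATUS_REJECT_UNSOLICITED; // solicited traffic works, unsolicited traffic does not
        return addressMatched ? STATUS_OK : STATUS_DIFFERENT;
    }

    public static void main(String[] args) {
        System.out.println(classify(true, true, true));    // 0: fully reachable
        System.out.println(classify(true, false, false));  // 2: firewalled / NAT rejecting unsolicited UDP
        System.out.println(classify(false, false, false)); // 3: no conclusion possible
    }
}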

View File

@ -156,11 +156,13 @@ public class DatabaseStoreMessage extends I2NPMessageImpl {
int compressedSize = (int)DataHelper.fromLong(data, curIndex, 2);
curIndex += 2;
byte decompressed[] = DataHelper.decompress(data, curIndex, compressedSize);
try {
byte decompressed[] = DataHelper.decompress(data, curIndex, compressedSize);
_info.readBytes(new ByteArrayInputStream(decompressed));
} catch (DataFormatException dfe) {
throw new I2NPMessageException("Error reading the routerInfo", dfe);
} catch (IOException ioe) {
throw new I2NPMessageException("Compressed routerInfo was corrupt", ioe);
}
} else {
throw new I2NPMessageException("Invalid type of key read from the structure - " + _type);

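The change above wraps decompression and parsing in a try/catch so a corrupt payload surfaces as an I2NPMessageException instead of an unchecked error. A self-contained sketch of the same defensive pattern using plain java.util.zip; the exception type here is a stand-in, not the router's class:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;

public class SafeDecompress {
    /** Stand-in for I2NPMessageException: one checked type for "this message is garbage, drop it". */
    static class CorruptMessageException extends Exception {
        CorruptMessageException(String msg, Throwable cause) { super(msg, cause); }
    }

    static byte[] decompress(byte[] data, int offset, int length) throws CorruptMessageException {
        try {
            GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(data, offset, length));
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int read;
            while ((read = in.read(buf)) != -1)
                out.write(buf, 0, read);
            return out.toByteArray();
        } catch (IOException ioe) {
            // Truncated or corrupt payload: report it to the caller rather than killing the handler thread.
            throw new CorruptMessageException("Compressed payload was corrupt", ioe);
        }
    }

    public static void main(String[] args) throws Exception {
        try {
            decompress(new byte[] { 1, 2, 3 }, 0, 3); // not valid gzip data
        } catch (CorruptMessageException expected) {
            System.out.println("dropped corrupt message: " + expected.getMessage());
        }
    }
}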
View File

@ -25,6 +25,7 @@ import net.i2p.data.i2cp.SessionConfig;
* @author jrandom
*/
public abstract class ClientManagerFacade implements Service {
public static final String PROP_CLIENT_ONLY = "i2cp.dontPublishLeaseSet";
/**
* Request that a particular client authorize the Leases contained in the
@ -71,6 +72,10 @@ public abstract class ClientManagerFacade implements Service {
public abstract void messageReceived(ClientMessage msg);
public boolean verifyClientLiveliness() { return true; }
/**
* Does the client specified want their leaseSet published?
*/
public boolean shouldPublishLeaseSet(Hash destinationHash) { return true; }
/**

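PROP_CLIENT_ONLY names the I2CP option an outbound-only destination (such as I2PTunnel's client proxies) can set so the router skips publishing its leaseSet. A minimal sketch of assembling such options on the client side with java.util.Properties; how the options travel over I2CP is omitted:

import java.util.Properties;

public class ClientOnlyOptions {
    public static void main(String[] args) {
        // Options that would be handed to the I2CP session for an outbound-only client.
        Properties opts = new Properties();
        opts.setProperty("i2cp.dontPublishLeaseSet", "true"); // same key as ClientManagerFacade.PROP_CLIENT_ONLY
        // ... any other session or tunnel options would go here ...
        System.out.println("session options: " + opts);
    }
}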
View File

@ -30,6 +30,34 @@ public abstract class CommSystemFacade implements Service {
public int countActivePeers() { return 0; }
public List getMostRecentErrorMessages() { return Collections.EMPTY_LIST; }
/**
* Determine under what conditions we are remotely reachable.
*
*/
public short getReachabilityStatus() { return STATUS_OK; }
public void recheckReachability() {}
/**
* We are able to receive unsolicited connections
*/
public static final short STATUS_OK = 0;
/**
* We are behind a symmetric NAT which will make our 'from' address look
* different when we talk to multiple people
*
*/
public static final short STATUS_DIFFERENT = 1;
/**
* We are able to talk to peers that we initiate communication with, but
* cannot receive unsolicited connections
*/
public static final short STATUS_REJECT_UNSOLICITED = 2;
/**
* Our reachability is unknown
*/
public static final short STATUS_UNKNOWN = 3;
}
class DummyCommSystemFacade extends CommSystemFacade {

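With getReachabilityStatus() and the four STATUS_* constants in place, callers such as the router console can turn the short code into something readable. A hedged sketch using literal values that mirror the constants above rather than the real class:

public class ReachabilityLabel {
    // Values mirror CommSystemFacade.STATUS_OK / STATUS_DIFFERENT / STATUS_REJECT_UNSOLICITED / STATUS_UNKNOWN.
    static String describe(short status) {
        switch (status) {
            case 0:  return "OK - unsolicited connections are reaching us";
            case 1:  return "Symmetric NAT - our source address looks different to different peers";
            case 2:  return "Firewalled - we can dial out but cannot receive unsolicited connections";
            case 3:
            default: return "Unknown - reachability has not been determined yet";
        }
    }

    public static void main(String[] args) {
        for (short s = 0; s <= 3; s++)
            System.out.println(s + ": " + describe(s));
    }
}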
View File

@ -59,6 +59,7 @@ public abstract class NetworkDatabaseFacade implements Service {
public abstract void fail(Hash dbEntry);
public int getKnownRouters() { return 0; }
public int getKnownLeaseSets() { return 0; }
}

View File

@ -35,6 +35,7 @@ import net.i2p.data.i2np.GarlicMessage;
//import net.i2p.data.i2np.TunnelMessage;
import net.i2p.router.message.GarlicMessageHandler;
//import net.i2p.router.message.TunnelMessageHandler;
import net.i2p.router.networkdb.kademlia.FloodfillNetworkDatabaseFacade;
import net.i2p.router.startup.StartupJob;
import net.i2p.stat.Rate;
import net.i2p.stat.RateStat;
@ -291,6 +292,8 @@ public class Router {
stats.setProperty(RouterInfo.PROP_NETWORK_ID, NETWORK_ID+"");
ri.setOptions(stats);
ri.setAddresses(_context.commSystem().createAddresses());
if (FloodfillNetworkDatabaseFacade.floodfillEnabled(_context))
ri.addCapability(FloodfillNetworkDatabaseFacade.CAPACITY_FLOODFILL);
SigningPrivateKey key = _context.keyManager().getSigningPrivateKey();
if (key == null) {
_log.log(Log.CRIT, "Internal error - signing private key not known? wtf");

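Router.java now appends FloodfillNetworkDatabaseFacade.CAPACITY_FLOODFILL to the published RouterInfo when floodfill mode is enabled, and StatisticsManager later checks for it with isFloodfill(). A minimal sketch of that kind of capability-flag test on a plain string; the 'f' character is only an assumption for illustration, as the real value is defined by the router:

public class CapabilityCheck {
    // Assumed stand-in for FloodfillNetworkDatabaseFacade.CAPACITY_FLOODFILL.
    static final char CAPACITY_FLOODFILL = 'f';

    /** Does the advertised capability string include the floodfill flag? */
    static boolean isFloodfill(String capabilities) {
        return capabilities != null && capabilities.indexOf(CAPACITY_FLOODFILL) >= 0;
    }

    public static void main(String[] args) {
        System.out.println(isFloodfill("fX"));  // true  - floodfill plus some other capability
        System.out.println(isFloodfill("X"));   // false - ordinary router
        System.out.println(isFloodfill(null));  // false - nothing advertised
    }
}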
View File

@ -9,6 +9,7 @@ import net.i2p.data.Hash;
import net.i2p.router.admin.AdminManager;
import net.i2p.router.client.ClientManagerFacadeImpl;
import net.i2p.router.networkdb.kademlia.KademliaNetworkDatabaseFacade;
import net.i2p.router.networkdb.kademlia.FloodfillNetworkDatabaseFacade;
import net.i2p.router.peermanager.Calculator;
import net.i2p.router.peermanager.CapacityCalculator;
import net.i2p.router.peermanager.IntegrationCalculator;
@ -97,7 +98,7 @@ public class RouterContext extends I2PAppContext {
_messageHistory = new MessageHistory(this);
_messageRegistry = new OutboundMessageRegistry(this);
_messageStateMonitor = new MessageStateMonitor(this);
_netDb = new KademliaNetworkDatabaseFacade(this);
_netDb = new FloodfillNetworkDatabaseFacade(this); // new KademliaNetworkDatabaseFacade(this);
_keyManager = new KeyManager(this);
if ("false".equals(getProperty("i2p.vmCommSystem", "false")))
_commSystem = new CommSystemFacadeImpl(this);

View File

@ -15,8 +15,8 @@ import net.i2p.CoreVersion;
*
*/
public class RouterVersion {
public final static String ID = "$Revision: 1.210 $ $Date: 2005/07/31 16:35:27 $";
public final static String VERSION = "0.6.0.1";
public final static String ID = "$Revision: 1.223 $ $Date: 2005/09/01 01:55:01 $";
public final static String VERSION = "0.6.0.4";
public final static long BUILD = 0;
public static void main(String args[]) {
System.out.println("I2P Router version: " + VERSION);

View File

@ -19,6 +19,7 @@ import net.i2p.data.DataHelper;
import net.i2p.stat.Rate;
import net.i2p.stat.RateStat;
import net.i2p.util.Log;
import net.i2p.router.networkdb.kademlia.FloodfillNetworkDatabaseFacade;
/**
* Maintain the statistics about the router
@ -99,10 +100,11 @@ public class StatisticsManager implements Service {
stats.setProperty("core.id", CoreVersion.ID);
if (_includePeerRankings) {
stats.putAll(_context.profileManager().summarizePeers(_publishedStats));
if (false)
stats.putAll(_context.profileManager().summarizePeers(_publishedStats));
includeThroughput(stats);
//includeRate("router.invalidMessageTime", stats, new long[] { 10*60*1000 });
includeRate("router.invalidMessageTime", stats, new long[] { 10*60*1000 });
includeRate("router.duplicateMessageId", stats, new long[] { 24*60*60*1000 });
//includeRate("tunnel.duplicateIV", stats, new long[] { 24*60*60*1000 });
includeRate("tunnel.fragmentedDropped", stats, new long[] { 10*60*1000, 3*60*60*1000 });
@ -122,6 +124,14 @@ public class StatisticsManager implements Service {
//includeRate("router.throttleTunnelProcessingTime1m", stats, new long[] { 60*60*1000 });
includeRate("router.fastPeers", stats, new long[] { 60*60*1000 });
includeRate("udp.statusOK", stats, new long[] { 20*60*1000 });
includeRate("udp.statusDifferent", stats, new long[] { 20*60*1000 });
includeRate("udp.statusReject", stats, new long[] { 20*60*1000 });
includeRate("udp.statusUnknown", stats, new long[] { 20*60*1000 });
includeRate("udp.statusKnownharlie", stats, new long[] { 1*60*1000, 10*60*1000 });
includeRate("udp.addressUpdated", stats, new long[] { 1*60*1000 });
includeRate("udp.addressTestInsteadOfUpdate", stats, new long[] { 1*60*1000 });
includeRate("clock.skew", stats, new long[] { 10*60*1000, 3*60*60*1000, 24*60*60*1000 });
@ -139,6 +149,12 @@ public class StatisticsManager implements Service {
//includeRate("stream.con.receiveDuplicateSize", stats, new long[] { 60*60*1000 });
stats.setProperty("stat_uptime", DataHelper.formatDuration(_context.router().getUptime()));
stats.setProperty("stat__rateKey", "avg;maxAvg;pctLifetime;[sat;satLim;maxSat;maxSatLim;][num;lifetimeFreq;maxFreq]");
if (FloodfillNetworkDatabaseFacade.isFloodfill(_context.router().getRouterInfo())) {
stats.setProperty("netdb.knownRouters", ""+_context.netDb().getKnownRouters());
stats.setProperty("netdb.knownLeaseSets", ""+_context.netDb().getKnownLeaseSets());
}
_log.debug("Publishing peer rankings");
} else {
_log.debug("Not publishing peer rankings");

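The includeRate() calls above give their averaging windows as millisecond literals (20*60*1000, 24*60*60*1000, and so on). A small sketch that renders such period arrays in readable form, which can help when scanning which windows a stat is published over; the helper is purely illustrative:

public class RatePeriods {
    static String format(long ms) {
        if (ms % (24 * 60 * 60 * 1000L) == 0) return (ms / (24 * 60 * 60 * 1000L)) + "d";
        if (ms % (60 * 60 * 1000L) == 0)      return (ms / (60 * 60 * 1000L)) + "h";
        if (ms % (60 * 1000L) == 0)           return (ms / (60 * 1000L)) + "m";
        return ms + "ms";
    }

    public static void main(String[] args) {
        long[][] periods = {
            { 20 * 60 * 1000 },                                           // udp.statusOK and friends
            { 1 * 60 * 1000, 10 * 60 * 1000 },                            // udp.statusKnownCharlie
            { 10 * 60 * 1000, 3 * 60 * 60 * 1000, 24 * 60 * 60 * 1000 },  // clock.skew
        };
        for (long[] set : periods) {
            StringBuilder buf = new StringBuilder();
            for (long p : set) buf.append(format(p)).append(' ');
            System.out.println(buf.toString().trim()); // prints "20m", "1m 10m", "10m 3h 1d"
        }
    }
}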
View File

@ -146,7 +146,7 @@ public class ClientConnectionRunner {
/** current client's sessionId */
SessionId getSessionId() { return _sessionId; }
void setSessionId(SessionId id) { _sessionId = id; }
void setSessionId(SessionId id) { if (id != null) _sessionId = id; }
/** data for the current leaseRequest, or null if there is no active leaseSet request */
LeaseRequestState getLeaseRequest() { return _leaseRequest; }
void setLeaseRequest(LeaseRequestState req) { _leaseRequest = req; }

View File

@ -25,6 +25,7 @@ import net.i2p.data.TunnelId;
import net.i2p.data.i2cp.MessageId;
import net.i2p.data.i2cp.SessionConfig;
import net.i2p.router.ClientMessage;
import net.i2p.router.ClientManagerFacade;
import net.i2p.router.Job;
import net.i2p.router.JobImpl;
import net.i2p.router.RouterContext;
@ -269,6 +270,16 @@ public class ClientManager {
return false;
}
public boolean shouldPublishLeaseSet(Hash destHash) {
if (destHash == null) return true;
ClientConnectionRunner runner = getRunner(destHash);
if (runner == null) return true;
String dontPublish = runner.getConfig().getOptions().getProperty(ClientManagerFacade.PROP_CLIENT_ONLY);
if ( (dontPublish != null) && ("true".equals(dontPublish)) )
return false;
return true;
}
public Set listClients() {
Set rv = new HashSet();
synchronized (_runners) {

View File

@ -164,6 +164,8 @@ public class ClientManagerFacadeImpl extends ClientManagerFacade {
return false;
}
}
public boolean shouldPublishLeaseSet(Hash destinationHash) { return _manager.shouldPublishLeaseSet(destinationHash); }
public void messageDeliveryStatusUpdate(Destination fromDest, MessageId id, boolean delivered) {
if (_manager != null)

View File

@ -110,7 +110,7 @@ public class GarlicMessageBuilder {
msg.setMessageExpiration(config.getExpiration());
long timeFromNow = config.getExpiration() - ctx.clock().now();
if (timeFromNow < 10*1000)
if (timeFromNow < 1*1000)
log.error("Building a message expiring in " + timeFromNow + "ms: " + config, new Exception("created by"));
if (log.shouldLog(Log.WARN))

View File

@ -71,9 +71,6 @@ public class GarlicMessageReceiver {
handleClove(clove);
}
} else {
if (_log.shouldLog(Log.ERROR))
_log.error("CloveMessageParser failed to decrypt the message [" + message.getUniqueId()
+ "]");
if (_log.shouldLog(Log.WARN))
_log.warn("CloveMessageParser failed to decrypt the message [" + message.getUniqueId()
+ "]", new Exception("Decrypt garlic failed"));

View File

@ -44,6 +44,13 @@ public class HandleDatabaseLookupMessageJob extends JobImpl {
private final static int REPLY_TIMEOUT = 60*1000;
private final static int MESSAGE_PRIORITY = 300;
/**
* If a routerInfo structure isn't updated within an hour, drop it
* and search for a later version. This value should be large enough
* to deal with the Router.CLOCK_FUDGE_FACTOR.
*/
public final static long EXPIRE_DELAY = 60*60*1000;
public HandleDatabaseLookupMessageJob(RouterContext ctx, DatabaseLookupMessage receivedMessage, RouterIdentity from, Hash fromHash) {
super(ctx);
_log = getContext().logManager().getLog(HandleDatabaseLookupMessageJob.class);
@ -59,6 +66,8 @@ public class HandleDatabaseLookupMessageJob extends JobImpl {
_fromHash = fromHash;
}
protected boolean answerAllQueries() { return false; }
public void runJob() {
if (_log.shouldLog(Log.DEBUG))
_log.debug("Handling database lookup message for " + _message.getSearchKey());
@ -73,9 +82,11 @@ public class HandleDatabaseLookupMessageJob extends JobImpl {
LeaseSet ls = getContext().netDb().lookupLeaseSetLocally(_message.getSearchKey());
if (ls != null) {
boolean publish = getContext().clientManager().shouldPublishLeaseSet(_message.getSearchKey());
// only answer a request for a LeaseSet if it has been published
// to us, or, if it's local, if we would have published to ourselves
if (ls.getReceivedAsPublished()) {
if (publish && (answerAllQueries() || ls.getReceivedAsPublished())) {
getContext().statManager().addRateData("netDb.lookupsMatchedReceivedPublished", 1, 0);
sendData(_message.getSearchKey(), ls, fromKey, _message.getReplyTunnel());
} else {
@ -83,7 +94,7 @@ public class HandleDatabaseLookupMessageJob extends JobImpl {
CLOSENESS_THRESHOLD,
_message.getDontIncludePeers());
if (getContext().clientManager().isLocal(ls.getDestination())) {
if (weAreClosest(routerInfoSet)) {
if (publish && weAreClosest(routerInfoSet)) {
getContext().statManager().addRateData("netDb.lookupsMatchedLocalClosest", 1, 0);
sendData(_message.getSearchKey(), ls, fromKey, _message.getReplyTunnel());
} else {
@ -97,7 +108,7 @@ public class HandleDatabaseLookupMessageJob extends JobImpl {
}
} else {
RouterInfo info = getContext().netDb().lookupRouterInfoLocally(_message.getSearchKey());
if (info != null) {
if ( (info != null) && (info.isCurrent(EXPIRE_DELAY)) ) {
// send that routerInfo to the _message.getFromHash peer
if (_log.shouldLog(Log.DEBUG))
_log.debug("We do have key " + _message.getSearchKey().toBase64()

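The lookup handler now only returns a stored routerInfo if info.isCurrent(EXPIRE_DELAY) holds, i.e. it was published recently enough. A small sketch of what such a freshness check amounts to; the fudge-factor value is an assumption, and the real check lives in the RouterInfo class:

public class FreshnessCheck {
    static final long EXPIRE_DELAY = 60 * 60 * 1000;       // one hour, as in the job above
    static final long CLOCK_FUDGE_FACTOR = 10 * 60 * 1000; // assumed allowance for clock skew between routers

    /** Is a routerInfo published at 'publishedOn' still fresh enough to hand out at 'now'? */
    static boolean isCurrent(long publishedOn, long now, long maxAge) {
        return publishedOn + maxAge + CLOCK_FUDGE_FACTOR >= now;
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        System.out.println(isCurrent(now - 30 * 60 * 1000, now, EXPIRE_DELAY));     // true  - half an hour old
        System.out.println(isCurrent(now - 3 * 60 * 60 * 1000, now, EXPIRE_DELAY)); // false - three hours old
    }
}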
View File

@ -17,6 +17,7 @@ import net.i2p.data.SigningPrivateKey;
import net.i2p.router.JobImpl;
import net.i2p.router.RouterContext;
import net.i2p.router.Router;
import net.i2p.router.networkdb.kademlia.FloodfillNetworkDatabaseFacade;
import net.i2p.util.Log;
/**
@ -44,6 +45,8 @@ public class PublishLocalRouterInfoJob extends JobImpl {
ri.setPublished(getContext().clock().now());
ri.setOptions(stats);
ri.setAddresses(getContext().commSystem().createAddresses());
if (FloodfillNetworkDatabaseFacade.floodfillEnabled(getContext()))
ri.addCapability(FloodfillNetworkDatabaseFacade.CAPACITY_FLOODFILL);
SigningPrivateKey key = getContext().keyManager().getSigningPrivateKey();
if (key == null) {
_log.log(Log.CRIT, "Internal error - signing private key not known? rescheduling publish for 30s");

Some files were not shown because too many files have changed in this diff.