Top 7 hidden costs of using Franking Machines

We’ve used a franking machine for the past eight years, and while it has suited us some of the time, there are some wild claims out there, especially from suppliers who insist it is cheaper than using stamps. Once you add in all the hidden charges it is, in our experience, often more expensive than using stamps.

So what are the costs missing from the headline rates you see advertised?

  1. Rental costs
  2. Maintenance fees
  3. Software assurance
  4. Recharge fees
  5. Ink cartridge costs
  6. Weighing scale fees
  7. The hassle factor

1. Rental costs. You might think these could be minimised by purchasing a machine outright; unfortunately, you cannot use a franking machine in the UK without a support contract authorised by Royal Mail, which in turn means the support companies own the market and will not take on a user-owned machine. As a result, the franking machine market appears to be dominated by companies with poor customer service practices and a penchant for overcharging at every opportunity.

2. Maintenance fees. Rental companies initially offer what appears to be a reasonable maintenance fee, but they will try to lock you into a long contract by offering cheaper rental on the longer terms. Once you have signed up, they will increase the support costs year on year by a good amount. The support contract will state lots of good reasons why you need it, such as calibration of the weighing scales; however, during our contract we never had a single visit from the company to check the machine or carry out any preventative maintenance, including calibration.

3. Software assurance. All modern franking machines have a software element which controls the pricing and records what you’re sending, so the companies can account for VAT-inclusive services such as Saturday delivery or next-day 9.00am delivery. When Royal Mail change their fees (which they do at least once a year) you have to update the software to take account of the new prices, and the franking machine company will charge you for the update because it is not included in the support contract. You get a choice of paying annually for the service or paying as and when it gets updated.

4. Recharge fees. Every time you add money to the franking machine you’ll be charged a recharge fee. It is typically a flat fee per recharge, so the more you load in one go, the cheaper it works out.

5. Ink cartridge costs. The maintenance contract will state that only authorised cartridges can be used in the machine, otherwise the contract is deemed broken. These cartridges are typically 2-3 times the cost of third-party cartridges, which are themselves inflated due to the small market size.

6. Weighing scale fees. Franking machine companies will separate the franking machine from the weighing scales, which lets them charge for an additional contract covering the scales.

7. The hassle factor. In our experience the franking market is poorly regulated, and this will probably get worse with the privatisation of Royal Mail. We’ve been duped into longer contracts than initially promised, had our contract traded on to another company without our agreement, and generally been ripped off at every opportunity.

So why use a franking machine? Frankly (pun intended, sorry), nowadays there is no need to: at lower volumes it is more cost-effective to use stamps (which can be purchased and printed online), and at higher volumes it is cheaper to contract directly and use something like Royal Mail’s Online Business Account with a Pre-Paid Impression licence.
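As a rough illustration of why the fixed charges matter, here is a sketch of the break-even arithmetic in JavaScript. Every figure in it (fixed fees, annual volume, postage rates) is invented for the example; none is a quote from any supplier or from Royal Mail.

```javascript
// Hypothetical cost model: the effective cost of each franked item is the
// franked postage rate plus a share of all the annual fixed charges
// (rental, maintenance, software assurance, scale contract, recharge fees).
function frankingCostPerItem({ annualFixedFees, itemsPerYear, frankedRate }) {
  return frankedRate + annualFixedFees / itemsPerYear;
}

// Illustrative example: £300/year of fixed charges spread over 2,000 items,
// with an assumed franked rate of 50p against a 60p stamp.
const perItem = frankingCostPerItem({
  annualFixedFees: 300,
  itemsPerYear: 2000,
  frankedRate: 0.5,
});
// perItem = 0.50 + 300/2000 = 0.65, i.e. 65p — dearer than the 60p stamp
```

Under these made-up numbers the franking "discount" is wiped out well before the fixed fees are paid off; the lower your volume, the worse the arithmetic gets.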

 

The Scout Report — Volume 19, Number 17

The Scout Report

April 26, 2013 — Volume 19, Number 17

A Publication of Internet Scout

Computer Sciences Department, University of Wisconsin-Madison


 

Research and Education

Documenting the American South: Colonial and State Records of North Carolina

Impact: Earth!

PSU Case Studies

National Science Foundation: Publications

Cornell University Cooperative Extension

Urban Institute: CHA Families and the Plan for Transformation

Scitable

The Concord Consortium: Projects

General Interest

Stellarium

Arkansas Heritage

Engineers Against Poverty

Digital Arts

Matthew Brady’s Portraits of Union Generals

Mount Auburn Cemetery

Dartmouth Digital Collections: Films

Women Who Rock Oral History Archive

Network Tools

PC Image Editor 5.2

SoundCloudNav

In the News

Old recordings allow researchers and public to hear the voices of the past


 

Copyright and subscription information appear at the end of the Scout
Report. For more information on all services of Internet Scout, please
visit our Website: https://scout.wisc.edu/

If you’d like to know how the Internet Scout team selects resources for
inclusion in the Scout Report, visit our Selection Criteria page at:

https://scout.wisc.edu/scout-report/selection-criteria

The Scout Report on the Web:

Current issue: https://scout.wisc.edu/Reports/ScoutReport/Current

This issue:

https://scout.wisc.edu/Reports/ScoutReport/2013/scout-130426

Feedback is always welcome: scout@scout.wisc.edu


 

Research and Education

Documenting the American South: Colonial and State Records of North Carolina

http://docsouth.unc.edu/csr/

The Documenting the American South collections from the University of North Carolina are a veritable cornucopia of material about the vast cultural and historical legacy of this complex region. The digitization project was made possible by a Library Services and Technology Act grant distributed through the State Library of North Carolina. Visitors can delve into the colonial and state records of North Carolina by looking over 26 volumes of material. These volumes were originally published between 1886 and 1907 and feature a four-volume master index. Visitors can search the entire archive via the search engine or click on the small icons to open documents like “A New Map of Carolina” from 1690 or the engraving titled “Governor Tryon and the Regulators”. Also, users can click on the Browse CSR tab to look around by volume, date, or creator type. [KMG]

Impact: Earth!

http://www.purdue.edu/impactearth

What would happen if a large meteorite or other object hit the Earth? It’s something that has engaged the minds and talents of astrophysicists (and students of all ages) for decades. Now the generally curious can create their own simulated impact with Purdue University’s “Impact Earth” website. Visitors can browse the Famous Craters area to get started. This part includes some “classics,” such as the Ries Crater and the Tunguska Fireball. Of course, visitors really must use the handy interface to craft their own impact, projectile, and target parameters to get the full effect on how such an event plays out. Also, the site includes a complete Documentation file (a peer-reviewed article) and a detailed glossary. [KMG]

PSU Case Studies

http://www.engr.psu.edu/ethics/casestudies.asp

How does one teach ethics? It can be a difficult subject and different fields (medicine, law, and so on) all have different ethical considerations and issues. This fine collection of engineering case studies from the Pennsylvania State University College of Engineering brings together resources from a variety of universities that have worked to address this matter. The cases are divided into separate areas that include Developing and Using Case Studies, General Science Cases, and Research Integrity Cases. Visitors shouldn’t miss the bulk of the material covered in the General Engineering Cases area, which includes high-quality and contemplative materials on engineering practice ethics from SUNY-Buffalo and the National Science Foundation. The site is rounded out by a number of helpful cases developed in-house by Penn State engineering students. [KMG]

National Science Foundation: Publications

http://www.nsf.gov/publications/

Every year, the National Science Foundation (NSF) researches a broad swath of topics ranging from graduate education in geography to the viability of sustainable agriculture. Visitors can scan through these documents here, on a website which includes recent publications like “Collections in Support of Biological Research” and “Baccalaureate Origins of U.S.-trained S&E Doctorate Recipients.” The archive contains over 3,200 documents, which visitors can search by publication type or specific organization within NSF. Visitors can also elect to sign up to receive notices about newly added publications via RSS feed or email. [KMG]

Cornell University Cooperative Extension

http://www.cce.cornell.edu/Pages/Default.aspx

The Cornell Cooperative Extension program brings Cornell University’s land-grant programs to citizens across the Empire State. This website is part of the Extension’s rather impressive public outreach efforts. Clicking on the Program Areas tab allows visitors to learn about various thematic work on subjects like Agriculture and Food Systems and Community and Economic Vitality. Each of these areas includes resources culled from various state agencies, such as databases and fact sheets. In the About area, visitors can learn about the organization’s long-term strategic plan and also about local offices across the state. Finally, the News area brings together press releases, videos and blog posts that deal with new innovations in agriculture, community outreach work, and so on. [KMG]

Urban Institute: CHA Families and the Plan for Transformation

http://www.urban.org/housing/Transforming-Public-Housing-in-Chicago.cfm

The Urban Institute provides high-quality research on economic and social policy, addressing topics such as education, employment, crime, and governance. This clutch of documents looks at the transformation of the Chicago Housing Authority and the provision of public housing in the city. The five briefs “describe key successes and challenges faced by CHA and its residents.” Titles address topics like “How Chicago’s Public Housing Transformation Can Inform Federal Policy?” and “Chronic Violence: Beyond the Developments.” Along with these insightful documents, visitors can also look over the Previous Briefs area. Here they will find “The Health Crisis for CHA Families,” “CHA After Wells-Where are the Residents Now?” and a dozen other briefs. [KMG]

Scitable

http://www.nature.com/scitable

Scitable is a completely free science library and personal learning tool created by the Nature Publishing Group. The work is currently focused on genetics and cell biology and covers topics such as evolution, gene expression and “the rich complexity of cellular processes shared by living organisms.” At the Inside Scitable area, visitors can browse and search hundreds of science articles, use the discussion board, build an online classroom, and also contribute and share content. First-time visitors should head on over to the Spotlight area, where they can read quality pieces on World Teacher’s Day, nanotechnology, and other topics. Also, visitors shouldn’t miss the Labcoat Life area, which contains musings on topics like “Tackling Mental Illness in Africa” and “Is Global Warming Chiefly Manmade?” [KMG]

The Concord Consortium: Projects

http://concord.org/projects

The Concord Consortium was founded in 1994 by Bob Tinker and Stephen Bannasch, who have since then worked to craft a multitude of technological innovations to help with the educational process. They share some of their findings right here on the Projects section of their website. The projects are divided into three areas: Active Projects, Archived Projects, and A-Z. Currently there are about 20 projects available in the Active Projects area, including Electron Technologies and Molecular Workbench. Each project comes complete with a project portal, featuring activities, teaching materials, and curriculum information. It’s a remarkable collection, and visitors with an interest in pedagogy, science instruction, and related topics will find much to enjoy here. [KMG]

General Interest

Stellarium

http://stellarium.org/

While looking up at the night sky, humans throughout the millennia have asked that age-old question: “What’s out there?” Stellarium provides entry into the world beyond Earth by offering this free open-source planetarium. The program includes over 600,000 stars, along with additional functionality that allows users to download data on over 210 million stars. Also, the program contains illustrations of the constellations and images of nebulae. The user interface is quite easy to use, as it gives users the ability to zoom in and out or use a fisheye projection as a way to experience a bit of that true planetarium feel. Also, the program offers users the ability to add new solar system objects from online resources and even create new effects, such as star twinkling and shooting stars. It is compatible with most operating systems. [KMG]

Arkansas Heritage

http://www.arkansasheritage.com/

The mission of the Department of Arkansas Heritage is “to identify Arkansas’s heritage and enhance the quality of life by the discovery, preservation, and presentation of the state’s cultural, historic, and natural resources.” This umbrella site brings together the activities of a number of state agencies, including the Old Statehouse Museum, the Historic Arkansas Museum, and Arkansas Arts Council. It’s a great idea to get started by clicking on the Discover Arkansas History tab. Here visitors can explore narrative essays that include “Natural Environments,” “Culture,” and “Politics.” All of these sections contain helpful lesson plans and activity sheets, which is a nice bonus. Visitors shouldn’t miss the Calendar area for up-to-date information on talks, fairs, and other events sponsored by any of these agencies. [KMG]

Engineers Against Poverty

http://www.engineersagainstpoverty.org/

Engineers Against Poverty (EAP) is a non-governmental organization that works in the field of engineering and international development. EAP works to harness members’ combined skills to alleviate poverty throughout the world and work on the challenges involved with sustainable development along the way. The materials on the site are divided into five sections, including Major Initiatives, Key Issues, Publications, and EAP’s Programme. A good place to start is the Major Initiatives area. Here users can learn about some of the key issues and challenges in the domain of engineering, poverty reduction, and more. The EAP’s Programme area has information and working papers on the organization’s work in transforming extractive industries and infrastructure projects. Finally, the Publications area contains works like “Employment Intensive Road Construction” and “Climate Compatible Development in the Infrastructure Sector Overview.” [KMG]

Digital Arts

http://www.digitalartsonline.co.uk/

The Digital Arts website was designed to offer “inspiration for digital creatives.” It does a fairly stand-up job of that, offering news updates, tutorials, reviews, features, portfolios, and information about upcoming competitions that will be of interest to those working in a range of industries. First-time visitors would do well to look at the Short Cuts area to learn about new design websites, watch artists work on compelling large format projects, and pick up scuttlebutt from experts in their fields. Moving on, the Tutorials area offers helpful guides such as “How to stop photo noise,” “Add texture to retro styled artworks,” and “Create X-ray vector art.” Finally, the Guides area contains helpful overviews of key fields and programs like Adobe Creative Suite 6, animation, graphic design, and interactive design. [KMG]

Matthew Brady’s Portraits of Union Generals

http://www.npg.si.edu/exhibit/uniongenerals/

In the 21st century, photographer Matthew Brady (ca. 1822 – 1896) is widely remembered as a chronicler of the Civil War, but by the time the War began in 1861, Brady and his studio were already well-established as portrait photographers. This show, from the National Portrait Gallery, presents 21 of Brady’s portraits of Union Generals. The introduction on the website is illustrated with a view of Brady’s studio in New York City, showing customers browsing large format portrait photographs hung on the walls. However, the hundreds of generals photographed by Brady and his team preferred the smaller, calling card-size photographs known as cartes de visite, and the web exhibition consists of digital reproductions of modern prints made from Brady’s carte-de-visite negatives. Each general’s image is accompanied by a short history, such as the story of General Joseph Hooker, who was defeated by Robert E. Lee’s much smaller army at Chancellorsville, Virginia in 1863. The histories will be familiar to Civil War buffs, but even the uninitiated can get a crash course in military history by viewing the Generals’ images and stories at the site. [DS]

Mount Auburn Cemetery

http://www.mountauburn.org/

The bucolic grounds of Mount Auburn Cemetery are fascinating, and have provided solace to thousands of departed souls since 1831. The grounds are also quite historic, and the cemetery’s website provides ample information for historians, sociologists, and others who might be interested in studying this unique place. New visitors should read the reminiscences offered by persons of note in the “What Makes This Place Special?” area. There are paeans offered up by William Ellery Channing, Emily Elizabeth Parsons, and Dorothea Dix. Moving along, the Visit section offers information on guided walks, birding tours around the grounds, and special events. Of course, there is also information on the more traditional activities and ceremonies associated with any cemetery available under the Cemetery link. [KMG]

Dartmouth Digital Collections: Films

http://www.dartmouth.edu/~library/digital/collections/dartmouthfilms/

The Dartmouth College Library has crafted digital collections celebrating some well-known alumni (such as Dr. Seuss) and other topics. This particular collection brings together a very fine set of films documenting activities and events that took place on the campus. The items here are divided into two sections: Historical Films (1930s-1960s) and Contemporary Films (2008-2012). The Historical Films include Green Flashback, which offers a compilation of color films of student life from 1946. Also quite intriguing is the 1956 film “Dartmouth Visited,” which is a promotional film for potential applicants. The contemporary films include a nice tour of the Dartmouth College Library and an exploration of the library’s wonderful bell tower. [KMG]

Women Who Rock Oral History Archive

http://content.lib.washington.edu/wwrweb/

The University of Washington Libraries has created this ambitious and culturally compelling digital collection of “Women Who Rock.” The collection brings together “scholars, musicians, media-makers, performers, artists, and activists to explore the role of women and popular music in the creation of cultural scenes and social justice movements in the Americas and beyond.” The site includes oral histories, photographs, and films. It’s a good idea to start with the Oral Histories area to learn about thirteen fantastic women who are artists, writers, and performers from the Pacific Northwest and beyond, like Medusa and Maylei Blackwell. The Photographs area contains over 370 photos documenting the lives and experiences of these women. It’s a remarkable set of materials, and more documents will be added over the coming months. [KMG]

Network Tools

PC Image Editor 5.2

http://www.program4pc.com/image_editor.html#page=page-1

This image editor is one of the better ones available, and it is designed to be used by everyone from amateur shutterbugs to seasoned professionals. This editor supports eleven image formats, image alignment, color adjustment, image dimension manipulation, and more. This particular version is compatible with computers running Windows XP and newer. [KMG]

SoundCloudNav

https://chrome.google.com/webstore/detail/soundcloudnav/nopkchcbhjjeaacnipimcelfchiifaip/

For people who like to use SoundCloud to control their musical selections while working, this helpful plug-in will be a welcome find. SoundCloudNav will allow users to explore different tracks and manipulate them as they see fit. This version is compatible with all computers utilizing Google Chrome. [KMG]

In the News

Old recordings allow researchers and public to hear the voices of the past

We Had No Idea What Alexander Graham Bell Sounded Like. Until Now

http://www.smithsonianmag.com/history-archaeology/We-Had-No-Idea-What-Alexander-Graham-Bell-Sounded-Like-Until-Now-204137471.html

Playing the Unplayable Records

http://www.smithsonianmag.com/multimedia/videos/Playing-the-Unplayable-Records.html

Curators discover first recordings of Christmas Day

http://www.bbc.co.uk/news/science-environment-20772246

Listen as Albert Einstein Reads ‘The Common Language of Science’ (1941)

http://www.openculture.com/2013/03/listen_as_albert_einstein_reads_the_common_language_of_science_1941.html

Extracting Audio from Pictures

http://mediapreservation.wordpress.com/2012/06/20/extracting-audio-from-pictures/

The Library of Congress Recorded Sound Reference Center Online

http://www.loc.gov/rr/record/onlinecollections.html

Photography has been around for a long time, and portraiture even longer. Some written sources date back millennia. We gather a great deal of information through analysis of artifacts, skeletons, and very old trash. In these ways, we have a good idea of what our ancestors looked like, what they thought about, how they lived, and even what they ate. However, the sounds of their voices have long been lost. To remedy this gap in our knowledge somewhat, researchers have been working on a variety of ways to hear recordings previously thought unplayable. These early recordings, many of which survive on delicate wax discs or only in photographs, were often designed for unknown playback mechanisms, or are too fragile to stand up to the rigors of being played. Nevertheless, there have been recent breakthroughs, including those by physicist Carl Haber and colleagues, who scanned very old recordings and converted them into computer audio files. These have allowed us to hear a variety of voices from times past, including for the first time the renowned Alexander Graham Bell. [CM]

The first link leads visitors to a Smithsonian Magazine article on the rediscovery of Alexander Graham Bell’s carefully enunciated voice. The second link explores the idea of “playing the unplayable” in a short video also from Smithsonian Magazine. After clicking on the third link, interested parties will be able to hear some of the first recordings of a family’s Christmas Day – in 1904. The fourth link leads to a recording of Albert Einstein reading his full essay “The Common Language of Science,” which is a delightful listen. The fifth link describes the process of recreating sound from an image of a record, which is sometimes all that remains when original recordings are lost to time or damage. Finally, the sixth link goes to the Library of Congress Recorded Sound Reference Center, which features 21 collections of old, rare, and curious recordings from the Library’s archives.

 

Below are the copyright statements to be included when reproducing
annotations from The Scout Report.

The single phrase below is the copyright notice to be used when
reproducing any portion of this report, in any format:

From The Scout Report, Copyright Internet Scout 1994-2013.
https://www.scout.wisc.edu/

The paragraph below is the copyright notice to be used when
reproducing the entire report, in any format:

Copyright © 2013 Internet Scout Research Group –
http://scout.wisc.edu

The Internet Scout Research Group, located in the Computer Sciences
Department at the University of Wisconsin-Madison, provides Internet
publications and software to the research and education communities
under grants from the National Science Foundation, the Andrew W. Mellon
Foundation, and other philanthropic organizations. Users may make and
distribute verbatim copies of any of Internet Scout’s publications or
web content, provided this paragraph, including the above copyright
notice, is preserved on all copies.

Any opinions, findings, and conclusions or recommendations expressed
in this publication are those of the author(s) and do not necessarily
reflect the views of the University of Wisconsin-Madison, or the
National Science Foundation.


 

To receive the electronic mail version of the Scout Report each week,
subscribe to the scout-report mailing list. This is the only mail you
will receive from this list.

To subscribe to or unsubscribe from the Scout Report, or to receive it in text format,
go to:
go to:

https://scout.wisc.edu/mailman/listinfo/scout-report/

The Scout Report (ISSN 1092-3861) is
published every Friday of the year except the last Friday of December by
Internet Scout, located in the University of Wisconsin-Madison’s Department
of Computer Sciences. Funding sources have included the National Science
Foundation and the University of Wisconsin Libraries.

Internet Scout Team
Max Grinnell Editor
Carmen Montopoli Managing Editor
Edward Almasy Director
Rachael Bower Director
Andrea Coffin Information Services Manager
Autumn Hall-Tun Internet Cataloger
Sara Sacks Internet Cataloger
Tim Baumgard Web Developer
Corey Halpin Web Developer
Zev Weiss Technical Specialist
Evan Radkoff Technical Specialist
Debra Shapiro Contributor
Holly Wallace Administrative Assistant
Michael Penn II Administrative Assistant

For information on additional contributors, see the Internet Scout staff page.

Moving to a responsive web design

We’ve been looking for a while now at a content management system (CMS) that would suit our needs, alas to no avail. Some have come close, but there always seems to be something that stops us.

As we couldn’t find one we’ve decided to hand code our site in the same manner that our original site was built.

We have been taking on board HTML5 snippets for a while now, and the html5boilerplate site has proved a good source, as has .net magazine.

Given that we wanted a good, responsive design, we found this a good starting point, and so far it has proved a solid grounding.

The issues we’ve found to date surround the monetization of the site: Google AdSense does not offer responsive ad units we can use, and we’ve had problems with directing people to other websites that may not have responsive pages.

Incorrect Franked Mail refunds from Royal Mail – their latest wheeze!

When using a franking machine you occasionally make a mistake on the franking label: it’s the wrong value, or it smudges or prints poorly.

To deal with this, Royal Mail have a scheme whereby you can send these in and they will refund the value, less an amount for administering the scheme.

Prior to May 2012 they charged 5% of the value of the returned items, which had to be no more than six months old and total no less than £10. In line with all their other price hikes, this suffered a 200% increase to 15% of the total value of the franking impressions returned.
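The arithmetic of the admin deduction is simple enough to sketch. The function below is illustrative only; the £10 minimum and the 5%/15% rates are taken from the scheme as described above.

```javascript
// Refund after Royal Mail's admin deduction. adminRate is 0.05 (pre-May 2012)
// or 0.15 (afterwards). Returns are only accepted when the spoiled
// impressions total no less than £10.
function frankingRefund(totalReturned, adminRate) {
  if (totalReturned < 10) return 0; // below the minimum, no refund at all
  return totalReturned * (1 - adminRate);
}

// £50 of spoiled impressions: £47.50 back under the old 5% fee,
// but only £42.50 under the new 15% fee.
const oldRefund = frankingRefund(50, 0.05); // 47.5
const newRefund = frankingRefund(50, 0.15); // 42.5
```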

Their latest wheeze, which they stated was also part of the new prices from May 2012, is to demand that you return the entire envelope bearing the franking impression, instead of just the label/impression section, which is what we have always done.

Funnier still: when we raised data-protection concerns about returning entire envelopes with our customers’ addresses on them, they said it was OK to cut the address off! So let me get this right: I can’t send back just the franked part of the envelope, but I can send the entire envelope less the addressee part. Surely that is just a bigger piece of paper, in effect!

The customer service assistant we spoke to when we phoned to complain said it was something their auditors had demanded as part of their fraud-prevention measures.

Given that each franking impression has our die number on it, I wonder how intelligent they are (or think we are!).

I’m at a loss as to how we could defraud them by sending back for refund, less 15% of the face value, a franking impression that we printed ourselves (proved by our die number) and that costs more than the face value printed on it. Any offers on how we could make money on this?

Irked by their statement that they would give us the refund this time, provided we sent the items in with a note of the discussion, I decided to do some further digging.

They stated this was brought in with the 30th April 2012 price rises. This turns out to be a blatant lie: checking the ROYAL MAIL SCHEME FOR FRANKING LETTERS AND PARCELS 2008, Section 11.6 states:

If a User prints a Franking Mark by mistake, the User may write to the Royal Mail controlling office to apply for a refund within 6 months of the date when the Franking Mark was printed, enclosing the franked envelopes, wrappers or other items which must total no less than £10. If Royal Mail receives all the information and evidence it requires and the amount of Postage or Fees shown by the Franking Mark is legible, Royal Mail will give the User a partial refund of the amount of Postage paid, having deducted an amount which Royal Mail considers to be reasonable to meet the administrative cost of dealing with the User’s application.

This could be read as requiring whole envelopes to be returned, so I checked whether the wording had changed. The original enactment of the scheme, via the London Gazette, uses exactly the same wording; so no change.

If they accepted just the franking impression portion before, then they should accept it now.

 

Cookie Law aka The Privacy and Electronic Communications Regulations 2011

As a small business involved in ecommerce and online marketing, I’ve spent a lot of time looking into the cookie law to understand what we need to do to comply with it. Unfortunately, there is no clear answer: only case law will make it easier to understand what is or is not allowed, unless the body that will police the law (the ICO) puts a stake in the ground and sets a level playing field for everyone to work to.

The Cookie Law

First things first, so we don’t get confused further: the “cookie law”, as it is generally referred to, was amended from its previous incarnation and now reads, with the amendments (as at the date of publication, May 2012):

THE PRIVACY AND ELECTRONIC COMMUNICATIONS (EC DIRECTIVE) (AMENDMENT) REGULATIONS 2011 Regulation 6

6. Confidentiality of communications

(1) Subject to paragraph (4), a person shall not store or gain access to information stored, in the terminal equipment of a subscriber or user unless the requirements of paragraph (2) are met.

(2) The requirements are that the subscriber or user of that terminal equipment-
(a) is provided with clear and comprehensive information about the purposes of the storage of, or access to, that information; and
(b) has given his or her consent.

(3) Where an electronic communications network is used by the same person to store or access information in the terminal equipment of a subscriber or user on more than one occasion, it is sufficient for the purposes of this regulation that the requirements of paragraph (2) are met in respect of the initial use.
(a) For the purposes of paragraph (2), consent may be signified by a subscriber who amends or sets controls on the Internet browser which the subscriber uses or by using another application or programme to signify consent.

(4) Paragraph (1) shall not apply to the technical storage of, or access to, information-
(a) for the sole purpose of carrying out the transmission of a communication over an electronic communications network; or
(b) where such storage or access is strictly necessary for the provision of an information society service required by the subscriber or user.

This law came into effect in the UK on 26th May 2011, although the ICO, the body that polices the law, gave a year’s grace for companies to become compliant; thus, after 26th May 2012 they will start to actively respond to complaints regarding the new law.

Am I exempt from the Cookie Law?

Exemptions from the right to refuse placing information on your device:

The Regulations specify that service providers should not have to provide any information and obtain consent where that device is to be used:

‘where such storage or access is strictly necessary to provide an information society service requested by the subscriber or user’.

In defining an ‘information society service’ the Electronic Commerce (EC Directive) Regulations 2002 refer to ‘any service normally provided for remuneration, at a distance, by means of electronic equipment for the processing (including digital compression) and storage of data, and at the individual request of a recipient of a service’.

Given the above, it seems to boil down to: which parts of your web site, and which pieces of information you store on their device, are part of the service the visitor has asked for, and if so, are they strictly necessary?
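As an illustration only, a cookie audit could start by classifying each cookie by purpose and flagging which might fall under the ‘strictly necessary’ exemption. The categories and verdicts below are my assumptions for the sketch, not the ICO’s:

```python
# Illustrative sketch: classify cookies by purpose for an audit.
# The purposes and exemption verdicts are assumptions for this
# example only - the law does not define these categories.

STRICTLY_NECESSARY = {"session", "shopping_basket", "load_balancing"}

def needs_consent(purpose):
    """Return True if a cookie with this purpose likely needs consent."""
    return purpose not in STRICTLY_NECESSARY

audit = {
    "PHPSESSID": "session",
    "basket_id": "shopping_basket",
    "_ga": "analytics",
    "ad_retarget": "advertising",
}

for name, purpose in audit.items():
    print(name, "needs consent:", needs_consent(purpose))
```

Only when the law is tested will we know where the exemption line actually falls, so treat any such classification as provisional.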

This can get really difficult to define. For a normal web site, what constitutes a request? If you assume that clicking through from a search engine onto your web site is a request, what happens when it isn’t, i.e. the information the search engine provides fails to describe what you actually provide; whose fault is that?

Informed Consent

The law makers have stated that informed consent is what they are aiming for; this means providing enough information to the visitor to enable them to make a decision on whether to accept the information or not.

You can view this detail in a speech made by the Head of Telecoms Regulation and E-Privacy at the Department for Culture, Media and Sport here: Cookie Law Vid

I’ve seen lots of arguments over when consent should be obtained. The ICO believe best practice is to obtain consent up front, which appears to mean you need clear signposting and consent at initial contact, before the user continues their journey on your site. Interestingly, in the video above the Head of Telecoms Regulation and E-Privacy treats consent as the same as consent under the Data Protection Act, which relies on the web site’s privacy policy to impart the information.

A point to note: any ‘further information’ pages you link to should not themselves drop cookies, otherwise the user cannot read more without compromising their privacy; they would have to accept more information onto their device just to make an informed decision.

Cookie Gotchas

How do you identify a real “user”? How do you distinguish bots and spiders from users? Those who blindly require all visitors to ‘tick a box or see no web site’ could find themselves de-indexed from various search engines, as their pages will appear to have no content.

How do you identify a different user of the same device? The majority of browsers nowadays have separate profiles for each user, but exceptions exist, certainly with home PCs set up in the corner of the house that anyone uses, so you cannot guarantee the user is always the same.

This is a real gotcha! If the user has previously visited your site and accepted cookies, the old cookies remain on their device until manually deleted. The law states that the information cannot be read without consent, so how does anyone prove whether a visitor has been to your site before, when you are not allowed to access their device until you get consent?

Using JavaScript solutions will not resolve the problem

The problem with JavaScript solutions is that a lot of people disable it: NoScript, for example, is a commonly installed Firefox extension, and other browsers have similar options. The law does not differentiate; you need permission, and the fact that a visitor has disabled JavaScript will not excuse you!
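One server-side approach that does not depend on JavaScript is to set no cookies at all until the user has explicitly opted in, for example via a plain HTML form. A minimal sketch of the decision logic, assuming a `cookie_consent` form field and some server-side way of keying the visitor (both invented for this example):

```python
# Sketch: decide server-side whether a response may set cookies.
# Consent is recorded only after the user submits an opt-in form;
# until then the response carries no Set-Cookie header at all.
# "cookie_consent" and the user key are assumptions for this sketch.

def may_set_cookies(request_params, consent_store, user_key):
    """Return True only if this visitor has explicitly opted in."""
    if request_params.get("cookie_consent") == "yes":
        consent_store[user_key] = True  # record the opt-in server-side
    return consent_store.get(user_key, False)

store = {}
print(may_set_cookies({}, store, "visitor1"))                      # no consent yet
print(may_set_cookies({"cookie_consent": "yes"}, store, "visitor1"))  # opted in
print(may_set_cookies({}, store, "visitor1"))                      # remembered
```

The catch, of course, is that remembering consent across visits without a cookie needs some other identifier, which is exactly the chicken-and-egg problem described above.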

Good luck, I’m off to check my insurance to see if I’m covered for investigations under this law!

It’s about Privacy

Don’t get hung up on the technical side of it (i.e. cookies). It is not only about cookies; it is about not placing information on a user’s device that can reduce a person’s privacy online.

If you are a normal Internet web site providing information and products you should be less worried about the law than if you actively track users across the Internet and use re-marketing techniques.

A good start is to understand what information you place on a device and why. This means not only you directly but any third party you use from your web site; the most cited example of this is Google Analytics, although as this is just analytics you can be a little less worried.

If you use Google AdSense, however, you need to be worried, as this uses all the tracking, retargeting and re-marketing techniques this law was designed to curtail. You can, however, adjust your business settings to prevent tracking, as can the user of the device.

If you use a Content Management System (CMS) such as WordPress you are reasonably safe, as it is only when someone registers that higher-level privacy information will be placed on the device, and this can be covered off during the sign-up process.

The gotcha in CMS systems is with regard to plugins, the apps designed to expand the basic functionality of the CMS. During an audit of one of our sites we found a plug-in that placed a cookie on the user’s device automatically (they even called it a tracking cookie!). It was an image slider, and this was not mentioned in the blurb about the plug-in at all; hence the real need for an audit!
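A simple way to audit what a page (and its plugins) actually sets is to inspect the Set-Cookie headers in the raw HTTP response. A sketch, assuming you have already captured the response headers by whatever means (the header values here are invented):

```python
# Sketch: list cookie names from captured HTTP response headers,
# so unexpected plugin cookies (like the slider's "tracking cookie")
# show up in an audit. The captured headers below are invented.

def cookie_names(headers):
    """Extract cookie names from a list of (header, value) pairs."""
    names = []
    for key, value in headers:
        if key.lower() == "set-cookie":
            # the cookie name is everything before the first "="
            names.append(value.split("=", 1)[0].strip())
    return names

captured = [
    ("Content-Type", "text/html"),
    ("Set-Cookie", "PHPSESSID=abc123; path=/"),
    ("Set-Cookie", "slider_tracking=1; path=/"),  # the surprise plugin cookie
]
print(cookie_names(captured))  # -> ['PHPSESSID', 'slider_tracking']
```

Running something like this against every template on the site is a quick way to catch cookies you didn’t know you were setting.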

It’s fundamentally flawed

Technically the law states you cannot store anything on a user’s device without consent unless it is ‘strictly necessary’. If this is interpreted at face value you should not download any part of your web site until you have obtained permission from the user of the device.

This is not only cookies; it is images, HTML files, CSS files, JavaScript files, basically anything that constitutes a web site, as until you obtain consent you have no right under this law to store information on their device. Technically, none of these files are necessary to obtain consent.

With the move to IPv6 addressing on the Internet it will become extremely difficult not to identify the device being used, as each will have its own ID, so users will need to understand how their data is processed within a business to become fully informed.

Will I get Prosecuted if I break the Cookie Law?

Quite possibly is the simple answer; however, as the ICO have stated, it will depend upon the severity of the transgression and your intent. They have stated that things such as analytics are caught, but given the limited amount of private information contained within that data you are unlikely to be severely punished.

Further Issues:

Does Rome I or II come into play? Have other European countries passed similar laws that you would have to comply with? If you cater for traffic from other European countries then you need to ensure you comply with their cookie laws when they implement them.

Will continually asking for permission constitute harassment? Maybe not, but it will certainly annoy your customers, especially when web sites not based in Europe do not have to worry about the law.

To Sum up

This is a new law and case law does not yet exist to define what ‘strictly necessary’ means in this context; any view, whether the ICO’s, a lawyer’s, mine or yours, is arbitrary at the moment. Only when the law is tested will businesses truly understand what is required. The trick is to not be the person who has to test it!

From our point of view we have carried out an audit of our cookies and will be updating our privacy policy to reflect this. We’ve adjusted our Google AdSense use and have a technological solution that will put the cookie consent question on every page for every new user; however, at the moment this relies on JavaScript so is not fully compliant. We are holding off using it for now, as it would impact the customer experience, especially as the level of knowledge about this law among users is very low.

Although we have taken legal advice for our situation and are developing our plans as appropriate, none of the information here should be construed as legal advice, as it is not and we are not lawyers!

Further References and reading:

Silktide Cookie eBook

http://www.glovers.co.uk/news.aspx?id=422&Page=2

http://www.forbessolicitors.co.uk/blog/forbusiness/2011/07/computer-cookies-and-the-privacy-and-electronic-communications-ec-directive-amendment-regulations-2011/

http://digital.cabinetoffice.gov.uk/author/dafyddbach/

http://digital.cabinetoffice.gov.uk/2012/01/12/cookies-on-the-beta/

http://www.consumerfocus.org.uk/cookies

http://www.international-chamber.co.uk/components/com_wordpress/wp/wp-content/uploads/2012/04/icc_uk_cookie_guide.pdf


Classic ASP and SQL Server 2008

I’ve spent a fair bit of time trying to get an ecommerce package written in classic ASP to run well using a network connection to a SQL Server 2008 Express edition. These are my findings which may help someone.

Don’t settle for the standard connection string you used to use.

The ADO connection string provided worked, however tweaking it gave a significant increase in throughput.

Initially we had a choice of

strconn = "DRIVER={SQL Native Client}; Server=" & varServerIP & "; Database=" & varDatabaseName & "; UID=" & varUserName & "; PWD=" & varPassword

or

strconn = "DRIVER={SQL Server}; Server=" & varServerIP & "; Database=" & varDatabaseName & "; UID=" & varUserName & "; PWD=" & varPassword

The first string failed with a connection error; for SQL Server 2008 you need to use {SQL Server Native Client 10.0} as the driver name. Using this worked quite well; however, if you use varchar(max), nvarchar(max), varbinary(max), xml, udt or other large objects you will not get these fields showing up in the results, as SQL Server Native Client converts the cursor to a static one, making them unavailable. (This Library Article refers – opens in new window)

The second worked, although response times were atrocious.

The Rules to follow

Don’t mix your metaphors!

You can use OLE DB provider terms in the connection string for an ADO connection and it will work most of the time; however, in the background the system is deciphering what you mean, which in turn impacts how quickly it responds. Where possible use the correct connection term: i.e. UID and PWD will work on an ADO connection, but User ID and Password will work better. (This Library page refers to the correct terminology.)

Be Explicit

If you are using a connection to a SQL Server that is not on the same server (and so cannot use shared memory), either reorder the client protocols in SQL Server Configuration Manager or specify the connection protocol. The previous link details this; however, to highlight and emphasise, I’ve reproduced the relevant bit here:

The complete syntax for the Address keyword is as follows:

[protocol:]Address[,port | \pipe\pipename]

protocol can be tcp (TCP/IP), lpc (shared memory), or np (named pipes). For more information about protocols, see Choosing a Network Protocol.

If neither protocol nor the Network keyword is specified, SQL Server Native Client will use the protocol order specified in SQL Server Configuration Manager. [My emphasis]

port is the port to connect to, on the specified server. By default, SQL Server uses port 1433.

Thus, as we were using a TCP connection, we preceded the IP address with ‘tcp:’.

Again, if you do not specify this the server has to test which connection method is going to work, so more time is consumed identifying it.

The correct provider for an ADO connection to SQL Server 2008 is ‘SQLNCLI10’; use it!

Don’t waste your time with Network Packet size unless you are trying to squeeze every last bit of speed out of the system (in which case rewrite the application/web site in a modern code form!)

Define DataTypeCompatibility; ADO works best if it is set to 80.

Final connection string that worked:

strconn = "Provider=SQLNCLI10; Data Source=tcp:ipaddress,port_no\SQLExpress; DataTypeCompatibility=80; MARS Connection=True; Initial Catalog=databasename; User ID=SQLusername; Password=SQLUserpassword;"

The instance string (i.e. SQLExpress) appeared to assist in responsiveness, so I would suggest it is included every time.
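To avoid typos when assembling a string like this, it can help to build it from keyword/value pairs. A sketch, in Python for illustration (the classic ASP version would use string concatenation as above; the address, port, instance and credentials here are all placeholders):

```python
# Sketch: assemble the ADO/SQLNCLI10 connection string from parts.
# All values passed in are placeholders - substitute your own
# server address, port, instance name and credentials.

def build_conn_string(ip, port, instance, database, user, password):
    parts = {
        "Provider": "SQLNCLI10",
        # explicit tcp: prefix saves the client probing each protocol
        "Data Source": "tcp:%s,%s\\%s" % (ip, port, instance),
        "DataTypeCompatibility": "80",   # ADO works best with 80
        "MARS Connection": "True",
        "Initial Catalog": database,
        "User ID": user,
        "Password": password,
    }
    return ";".join("%s=%s" % (k, v) for k, v in parts.items()) + ";"

print(build_conn_string("192.0.2.10", 1433, "SQLEXPRESS",
                        "shopdb", "shopuser", "secret"))
```

Whether you specify the port, the instance name or both depends on your server setup; the form above mirrors the string we ended up with.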

If you are using pre-existing ASP code then very little will be gained from including MARS Connection=True, as your code will be designed to close the connection after a single use. To gain an advantage you may need to alter the code so it does not close the connection until all the SQL interaction has completed.


SEO and VPASP Ecommerce Software Ask a Question Page

We continue to tinker with the VPASP ecommerce engine. The latest issue we looked at related to its SEO capabilities and was highlighted by what is, in my humble opinion, the very good SEO module you can get as a free add-on for IIS.

As an aside, I discovered the module will SEO-analyse any web site you specify, which is great for analysing your competition!

The issue that kept being reported was on the shopquestion.asp page, which exists to allow customers (or potential customers) to ask a question about a specific product and is reproduced (if turned on) for every product in the database.

The page has been the same for years and is poorly put together: it has no unique title, an empty meta description and no keywords. In the past this was not a major issue, as the page was functional rather than of benefit to the whole SEO environment, but following Google’s Panda update such thin pages could start to impact your site, especially if they do not meet the Google quality guidelines.

As a simple and quick temporary solution I have added a rel="nofollow" to the hyperlink to each page, to discourage search engines from crawling and indexing them.

If you want to do the same it’s quite easy: in a text editor or WDE open shopfileio.asp, find the sub Handle_question (it was the final sub in my file) and change the line

strMessage = "<a href=""" & strURL & """>" & GetLang("langQuestion") & "</a>"

to read:

strMessage = "<a href=""" & strURL & """ rel=""nofollow"">" & GetLang("langQuestion") & "</a>"

Save the file and upload the revised version to your server. If you think the pages that have already been indexed are causing you an issue, you may want to go to Google’s Webmaster Central and ask for them to be de-indexed.
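The edit itself is just a string transformation: append rel="nofollow" inside the opening anchor tag. A sketch of that transformation in Python (the ASP file uses the doubled-quote VBScript syntax shown above; the link here is invented):

```python
# Sketch: add rel="nofollow" to a plain anchor tag, mirroring the
# one-line edit made in shopfileio.asp. The example link is invented.

def add_nofollow(anchor_html):
    """Insert rel="nofollow" just before the closing > of the <a> tag."""
    if 'rel="nofollow"' in anchor_html:
        return anchor_html  # already done, don't add it twice
    return anchor_html.replace(">", ' rel="nofollow">', 1)

link = '<a href="/shopquestion.asp?id=42">Ask a question</a>'
print(add_nofollow(link))
# -> <a href="/shopquestion.asp?id=42" rel="nofollow">Ask a question</a>
```

The same one-line substitution works wherever the link is generated, which is why the single change in Handle_question covers every product page.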

When we get some spare time we’ll look at improving the page design, so watch this space (but not too often ;0)

BT Connect Hosting – poor help files but good email service

I’ve been meaning to update the free domain we get from BT, which we use to talk about anniversary gifts, for a while now, and I’ve just managed to do it with a bit of hassle. We’ve upgraded our broadband a couple of times since we put the original site live and could not access the site using the security information provided.

We tried the help files (or chocolate fire guards, as I called them) to no avail, so emailed the broadband support team. The page said they were responding in about 5-6 hours; to my surprise I got a response within an hour which, although it did not cure the problem, was both knowledgeable and courteous.

After a few more emails back and forth I found the issue was with how access was gained: I could connect using the ‘Access FTP’ link, but only from an IE 6 or earlier browser, as later versions do not allow it. At least I then knew the passwords and user names were fine.

Following the help pages and using Microsoft’s Expression Web software I could not access the FTP site, so I started experimenting with the settings. Eventually I got access by setting the Locations Directory to ‘/pages’ (ensure the slash is included, otherwise it won’t work). When connected, the URL will show up as …btconnect.com/%2fpublic, but the connection works.


Update, June 2017: we’ve now removed these pages from the BT servers, as the domain they were hosted on (home2.btconnect.com) was flagged by Google as hosting malware; this was a result of someone else’s site on the same platform being compromised.

Adding VPASP ecommerce store to Facebook

We’ve been using VPASP ecommerce software for a number of years. It is open source and uses Active Server Pages (ASP) as its software language; this language is very mature and hence knowledge is widely available.

With the emergence of social media, and Facebook becoming the dominant social media channel, we were looking at how to increase our presence there. If you read the Facebook integration help sections it all seems quite daunting, whereas in fact it is quite simple to get a very basic integration up and running. Here is how we did it.

We subscribe to .NET magazine, an excellent publication that has a lot of useful snippets; one of these, from the Feb 2011 edition, covered how to integrate a web page with Facebook. Following this guide we worked out that the simplest way to put our store on Facebook was to use the iframe integration.

Facebook use the term Canvas instead of iframe, so once you register for app development you set up your app’s Canvas page to point to your desired store URL.

I should state at this point that we use the Deluxe version 7 of VPASP (with service pack applied) and hence have access to all of VPASP’s facilities; lesser versions may restrict this capability and we have not tested any to see if they will.

You cannot merely use an iframe to display your current store front, as the iframe on Facebook is limited to a maximum of 758 pixels wide; you would end up with users having to use scroll bars to see the entire page width, a very poor user experience.

Our solution to this was to copy the entire web site to a sub directory, set up an additional store in VPASP configuration and use the default 2 column layout that comes with Version 7.

This almost worked ‘straight out of the box’; however, the width was just over, so we then adjusted the #wrap width in this sub-directory to be 740px wide in the CSS (file layout.css of the relevant template folder being used). With this set up we used the URL of this sub-directory as the Canvas page URL and set the status within the apps directory to live.

The additional benefit of using a sub-directory is that your SSL certificate remains valid, allowing the complete checkout cycle to be managed within the Facebook page.

I must stress this is a basic integration. I’m working on the auto-checkout side of it to see if we can take user details for auto-filling the address details, email etc. The site may also need a further tweak, as to embed the site into a page (fan page etc.) you are limited to a width of 520 pixels.

I’ve added a robots.txt rule to exclude all spidering, as effectively the site would be virtually the same as your main site; to avoid duplication issues it’s easiest to exclude the Facebook copy.
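For reference, crawlers only read robots.txt from the root of a domain, so the exclusion rule for the copy has to live in the site’s root robots.txt. A minimal example (the /facebookstore/ directory name is a placeholder for whatever sub-directory you actually used):

```
# Root robots.txt: stop all crawlers indexing the Facebook copy of the store
User-agent: *
Disallow: /facebookstore/
```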

Database Driven Web Solutions

There are many ways of delivering content for your web pages; with a database back-end solution it is possible to ensure your web pages are dynamic, interactive and deliver up-to-date information at a lower cost than having to update static pages.

Using a database can significantly reduce costs for your web presence: a single page with a database back end can deliver hundreds or thousands of different pages that would otherwise need to be developed and maintained as static HTML pages.

Responding to changes in products or content can also be carried out significantly more quickly, with a single change on the database being reflected in all pages once the transaction is committed.

Implementing a database back end need not be expensive either; depending upon your target market and its potential size you could implement a solution using an MS Access or MySQL database, which are very powerful and relatively low cost.

By way of example, the web site we developed, Anniversary Ideas, is currently running with an MS Access database and an MS SQL 2008 back end and, using two files, delivers over 1600 pages of different content, which in turn generates tens of thousands of links for customers visiting the site, all relevant to their specific needs.
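The idea of one template serving many pages can be sketched in a few lines: one query plus one template yields a distinct page per row. An illustrative Python/SQLite sketch (the table, columns and content are invented for the example; the real site uses ASP against Access/SQL Server):

```python
import sqlite3

# Sketch: one template + one query = many pages.
# Table, column names and content are invented for this example.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (slug TEXT, title TEXT, body TEXT)")
conn.executemany("INSERT INTO pages VALUES (?, ?, ?)", [
    ("1st-anniversary", "Paper Anniversary", "Ideas for paper gifts..."),
    ("25th-anniversary", "Silver Anniversary", "Ideas for silver gifts..."),
])

TEMPLATE = "<h1>{title}</h1><p>{body}</p>"

def render(slug):
    """Render the single template with the row matching this slug."""
    row = conn.execute(
        "SELECT title, body FROM pages WHERE slug = ?", (slug,)).fetchone()
    if row is None:
        return None  # would become a 404 in a real site
    return TEMPLATE.format(title=row[0], body=row[1])

print(render("25th-anniversary"))
```

Adding a row to the table adds a page to the site; no new files need to be written or deployed, which is where the cost saving comes from.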

If you would like to discuss your needs with an initial free consultation then please contact us either by phone on 0870 765 3436 or email us