
My First “Real” Computer

One of the “ice breaker” questions I ask in beginning computer science courses is “What was the first computer that really got you interested in programming?”   That leads us to talking about the history of computing and how computing has evolved over the past century.

The two machines that moved me from the “this is fun” to the “this can be a profession” mindset were computers we had in the lab I was part of during my first attempt at graduate school: the Tektronix 4406 and the Texas Instruments Explorer LISP workstation.

The Tektronix 4406 Smalltalk Workstation

Most people know Tektronix for their test equipment, with their oscilloscopes being the thing that most people remember.   They still make decent gear.    But in the mid-to-late 1980s, Tek was a bit of a corporate dilettante with their hands in a number of things.   One of those was a foray into the workstation market with machines based on the Motorola 68000 processors.

[Photo: the Tektronix 4406]

Tek was an early player in the Smalltalk ecosystem and by the late 1980s was producing a Smalltalk-focused 68000 workstation that ran a rather hacked-up version of Bell Labs Version 7 UNIX.   My M.S. thesis advisor was an A.I. sort and had managed to get grant money for one of these machines.    He had moved on to other things by this point and just told his graduate students to have fun with it.    I had been following Smalltalk since reading a popular press article on the Dynabook project and was seriously geeked out about being able to play with it.   This was my first big dive into the Smalltalk language, the Smalltalk programming environment, and the UNIX operating system.

The Texas Instruments Explorer LISP Machine

The thing that had captured my advisor’s attention was a TI Explorer LX Lisp Machine.  LISP Machines were interesting beasts, as their processors and architecture were designed and optimized to run LISP.   Most of the CAD tools for chip design from this period were built using LISP, and so TI licensed a design from LISP Machines Int’l. to create the TI Explorer product line.  One of the innovations in this design was that it used the NuBus bus architecture, one of the early expansion bus architectures.  So the LX included a co-processor card that was basically an independent 68000 UNIX workstation running AT&T UNIX System III.

My boss told me on my first day in the lab: “Make it work.”   So what do I do on my first day?  Inadvertently run “rm -rf *” in the root directory on the UNIX co-processor after spending about four hours loading the OS image from tape.   Oops, but what can I say? I was definitely a noob at the time.

Fun machine, tho’, as I got to experience the LISP Machine environment and run EMACS as God and Stallman originally intended.    A large part of the really weird stuff we see today in GNU Emacs was a straight UX port from the LISP Machine: things like the “CTRL-Windows-Alt” modifier keys (CTRL-SUPER-META on the LM).    And the operating system was written entirely in LISP and you had the complete source code.    And a lot of my sysadmin experience came from having to figure out how to make that co-processor work.

The Result

One of the things that got me hired at NCR was that they were looking for people with Smalltalk experience.   And a lot of the people who worked on the Smalltalk team at Tek migrated to NCR when NCR tried to build a UX Research Center in Atlanta in the late ’80s and early ’90s.    Getting exposed early to these programming environments, and to people who knew how to use them, made me a better programmer.

Selah.

Posted in who.


How to: Unity and GitHub – Steps to get started

Unity builds a lot of stuff, much of it files generated by the tools that shouldn’t be committed to version control. Here are the steps I use to get a Unity project into GitHub:

  • Create a new empty 3D project in the Unity Editor
  • Create a new empty repo in your GitHub account
  • Use text editor to add a README.md in the root folder of your project
  • Go to GitHub’s archives of .gitignore files and grab the .gitignore for Unity development.
    • Place this file in the root folder for your project
  • Execute the following script:
git init
git add README.md .gitignore
git commit -m "first commit"
git branch -M main
git remote add origin https://github.com/adamwadelewis/gallery.git
git push -u origin main
  • Now add the contents of your Unity project folder to your repo, then commit and push (see the sketch below).
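
A minimal sketch of that last step, run from the project root (the commit message is just an example):


git add .
git commit -m "Add Unity project files"
git push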

Posted in how.


How to: Configuring macOS for web development: Part 3 – A Coda to Part 1 & Homebrew

“Wait a moment, you’re using Homebrew?”, you say Dear Reader? “If you are using Homebrew, shouldn’t we use the copy of the Apache that comes with Homebrew?”, you say?

It’s a viable alternative. Just like with PHP and other things, using a package manager like Homebrew to install the web server in user space means you can keep up a lot more rapidly with upstream changes in Apache. The downside is that you have to undo some things in the system and remember to check and redo them when you update macOS.

Let’s go about it… we need to turn off the version of Apache included in the operating system:


sudo apachectl stop
sudo launchctl unload -w /System/Library/LaunchDaemons/org.apache.httpd.plist 2>/dev/null

This turns off the server and unloads the system install of Apache from the list of services loaded at system startup.

Now we get things up and running with Homebrew:


brew install httpd
brew services start httpd

One more configuration adjustment: to keep things somewhat clean, Homebrew’s Apache install defaults to running on port 8080 rather than port 80. This avoids conflicts between the system install of Apache and the Homebrew version. But we want the Homebrew version to serve pages on the default port.

To fix this, edit /opt/homebrew/etc/httpd/httpd.conf to switch the listen port to port 80. Use your favorite editor to find the Listen line in the file and make certain it looks like this:


Listen 80

Now restart the server using brew services restart httpd.

An opinion

There is a common thread among the different blogs telling you how to do this that recommends you reset the server root to the Sites folder in your home folder. This is not something I recommend you do, as I believe you need to keep a separation between production and development code. Feel free to reconfigure Apache in this manner if you wish, but I’ll leave figuring out how to do that as an exercise for the reader.

Do go back and apply the changes I introduced in Part 1 of this series to get your home folder configured. All you need to do is adjust the file names to point at the configuration files in your Homebrew install.
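
For example, a sketch of where those files live, assuming the default Apple silicon Homebrew prefix (/opt/homebrew; Intel Macs use /usr/local) and the stock Apache layout that Homebrew ships:


/opt/homebrew/etc/httpd/httpd.conf
/opt/homebrew/etc/httpd/extra/httpd-userdir.conf

These files are owned by your user, so you typically don’t need sudo to edit them.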

Posted in how.


How to: Configuring macOS to do web development: Part 2 – PHP

Apple recently made the decision to remove PHP from the OS image.   That’s a good call, as the version included with the OS quickly gets out of date.   So, it’s up to us as software developers to manage the install and update of the development tools.

A Quick Aside

Time to express an opinion: I am not a PHP fan. It’s a kludge of a language built on top of a kludge of a web application architecture. But enough back-end stuff remains built on that architecture that one has to understand it and sometimes support it.

Back to our regular programming

For macOS, the simplest way to manage installing and configuring PHP is to use a package manager like Homebrew. It’s a one-line install command:


/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

Worth noting that there is a lot of gooey, crunchy open-source goodness in the Homebrew repository.

At this point, you can install PHP:


brew install php

This will get you PHP 8. If you need an earlier version, then you will want to use one of the versioned formulae in Homebrew.
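
For example, a sketch of installing a specific version (the exact versioned formulae available change over time; versioned PHP formulae are keg-only, so they need to be linked explicitly):


brew install php@8.1
brew link --overwrite --force php@8.1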

And now things get macOS ugly

We want to configure our Apache install to use PHP.

Time to hack! Configuring Apache

Things begin by adjusting our Apache configuration to load PHP. Edit the Apache httpd.conf configuration file:


sudo vi /etc/apache2/httpd.conf

Add the following:


LoadModule php_module /opt/homebrew/opt/php/lib/httpd/modules/libphp.so
<FilesMatch \.php$>
    SetHandler application/x-httpd-php
</FilesMatch>

Confirm that the DirectoryIndex entry includes “index.php”:


DirectoryIndex index.php index.html

Now we code-sign

Here is where Apple tightening up security in macOS 12 bites us in the posterior: Homebrew’s stuff isn’t code-signed. That means Apache will puke on us when it tries to load the PHP module.

We have to manually sign the package. This requires some finagling with keychains using the Keychain Access utility and the Xcode command-line tools.

Here things get “interesting”. We need to adjust macOS to allow us to self-sign certificates. In other words, we get to be our own “Certificate Authority”.

Launch the macOS Keychain Access utility.

Now go to Keychain Access > Certificate Assistant > Create a Certificate Authority…. You should see the Certificate Assistant window.

Do the following:

  1. Adjust the name as needed.
  2. Select “Code Signing” from the “User Certificate” dropdown.
  3. Turn on the “Let me override defaults” checkbox
  4. Enter your e-mail at the appropriate location
  5. Select “Continue”
  6. Accept defaults for Certificate Information
  7. Enter appropriate certificate information and select “Continue”
  8. Accept defaults for the Key Pair information for both certificate and users
  9. Do the same for extensions
  10. Turn on the “Extended Key Usage Extension for This CA” option
  11. Select the “Code Signing” checkbox that appears
  12. Accept defaults until you get to the create screen
  13. Turn on “On this machine, trust certificates signed by this CA”
  14. Select “Create”
  15. Close the “Certificate Assistant”

Sign the PHP module using the Xcode command-line code-signing tool (replacing "AWL" with the name of the certificate authority you just created):


codesign --sign "AWL" --force --keychain ~/Library/Keychains/login.keychain-db /opt/homebrew/opt/php/lib/httpd/modules/libphp.so
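
If you want to confirm the signature took, codesign can display it (an optional sanity check):


codesign -dv /opt/homebrew/opt/php/lib/httpd/modules/libphp.so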

Now edit the Apache httpd.conf file again and adjust the LoadModule entry for PHP as below (again, replacing "AWL" with the name you used in the certificate):


LoadModule php_module /opt/homebrew/opt/php/lib/httpd/modules/libphp.so "AWL"

Now restart Apache and you should be ready to rock and roll:


sudo apachectl -k restart
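
To verify PHP is actually being handled, a quick sanity check (this assumes the per-user Sites setup from Part 1 and my short user name of alewis; substitute your own):


echo '<?php phpinfo(); ?>' > ~/Sites/info.php

Then load http://localhost/~alewis/info.php in a browser and you should see the PHP info page. Delete the file when you’re done; you don’t want phpinfo() output lying around.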

Selah.

Posted in how.


How to: Configuring macOS to do web development: Part 1 – Apache

A good chunk of what we need to do web development exists in macOS 12.1 (Monterey) without having to resort to add-ons like the XAMPP stack. Enough of my students have asked me how to do this that I thought I would consolidate my notes and put them up on the web so I can just point them at that reference. This is going to pull from multiple sources and I’ll try to acknowledge every one that I can. Let me know if I missed one.

There are some things you will need to add.  Recent versions of macOS no longer include PHP as part of the operating system.  That’s actually a good thing, as the included version has tended to lag behind the current tip of development.  Here’s where you find yourself using a package manager like Homebrew.

First step: Configuring Apache

The OS includes the Apache web server.  It’s buried pretty deep in the system bits, so you are going to have to apply some important skills:

  • Understand how to work with Terminal.app to run command-line programs, update configuration files, and manage the web server,
  • Know some of the internals of the Apache web server,
  • And know how to open and save files in a text editor like vi or nano.

This information is spread over the Internet but the bulk of the material is taken from the Apple tech support discussion forum at https://discussions.apple.com/docs/DOC-250004361

Start by editing the web server configuration file located at /etc/apache2/httpd.conf. Note that this requires admin permissions, so you will need to use sudo:


sudo vi /etc/apache2/httpd.conf

Look for the line in the file that enables the “Sites” folder for individual users (this is the macOS equivalent of public_html on Linux). In recent versions of macOS, this is at line 184 in the file. Uncomment that line by removing the leading “#” comment indicator. It needs to read as follows:


Include /private/etc/apache2/extra/httpd-userdir.conf

Save and exit the editor.
 

This change tells the web server to look for an include configuration file that will define user folders in the web server. Edit that file:


sudo vi /etc/apache2/extra/httpd-userdir.conf

Uncomment line 16 of that file so that it reads:


Include /private/etc/apache2/users/*.conf

Now you need to tell the web server about your user folders. First step, look in the Users & Groups preferences pane to get your short user name. Right-click on your user name in the pref pane and select “Advanced Options”. The short user name can be found in the “Account Name” field. For discussion purposes, we’ll use my short name of “alewis”. Replace this with your own when you do this.

Now let’s use your text editor to create a configuration file for user folders:


sudo vi /etc/apache2/users/alewis.conf

Add the following content:


<Directory "/Users/alewis/Sites/">
 AddLanguage en .en
 Options Indexes MultiViews FollowSymLinks ExecCGI
 AllowOverride None
 Require host localhost
</Directory>

You will need to add additional configuration items to this file if, for example, you want to enable PHP.
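
For example, if you later want PHP files in your Sites folder picked up as directory indexes, a sketch of a line you could add inside the <Directory> block above (the PHP module setup itself is covered in the PHP post in this series):


 DirectoryIndex index.php index.html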

Then create the Sites folder in your home folder:


mkdir ~/Sites
echo "<html><body><h1>My site works</h1></body></html>" > ~/Sites/index.html.en

This creates the Sites folder and adds a minimal working example in the folder that we can test against shortly.

 

Now we have to do some shell voodoo. Apple has tightened up the security in macOS 12 to, by default, not allow other users access to a user’s folders. The macOS installation of Apache is configured to run in a special hidden user account named “_www”. You need to set up an Access Control List (ACL) that lets the web server have access to your folder:


chmod +a "_www allow execute" ~/Sites
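
You can confirm the ACL landed with ls (the -e flag prints ACLs and -d keeps the listing to the folder itself; just a sanity check):


ls -led ~/Sites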
 

And now we see if things work. With Apache, one should always use the web server’s configtest command to make certain things are configured in a clean manner:


apachectl configtest
 

So… what’s apachectl? That’s a command line tool that one uses to control the web server (which is an Apache thing, and so works on any of the UNIX-based operating systems). If the config test returns ‘Syntax OK’, then you are ready to rock the web.

Now for macOS command-line magic… you need to tell macOS to start Apache at system startup:


sudo launchctl load -w /System/Library/LaunchDaemons/org.apache.httpd.plist

If you want to get things running in the meantime, do a:


sudo apachectl graceful
 

At this point, navigate to http://localhost and http://localhost/~<your short user name> and see if you get the expected responses from the web server.
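
You can also check from the same Terminal with curl (replace alewis with your own short name):


curl http://localhost/
curl http://localhost/~alewis/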

 
Reference: https://discussions.apple.com/docs/DOC-250004361

Next up: Getting PHP to work using Homebrew.

Posted in how.


By request, a recipe for Baked Kofta With Potatoes

A number of people on social media asked for this recipe after I posted pictures of it when I recently made it for supper.   If memory serves me right (yep, that was an original Iron Chef reference), I got the recipe from an episode of Milk Street TV.

INGREDIENTS 

1 pound Yukon Gold potatoes, not peeled, sliced into ¼-inch rounds
2 tablespoons plus ¼ cup extra-virgin olive oil, divided
Kosher salt and ground black pepper
1 pound ground lamb or 80 percent lean ground beef
1 medium yellow onion, halved and grated on the large holes of a box grater
1/2 cup finely chopped fresh flat-leaf parsley
1/2 teaspoon ground allspice
1/2 teaspoon ground cinnamon
14 ½ ounce can crushed tomatoes
2 medium garlic cloves, minced
1 pound plum tomatoes, cored and sliced into ¼-inch rounds
1 small green bell pepper or Anaheim chili, stemmed, seeded and sliced into thin rings

DIRECTIONS

Heat the oven to 450°F with a rack in the middle position. On a rimmed baking sheet, toss the potatoes with 1 tablespoon of oil and ¼ teaspoon salt. Distribute in a single layer and roast without stirring just until a skewer inserted into the potatoes meets no resistance, 10 to 13 minutes. Remove from the oven and set aside to cool slightly. Leave the oven on.

While the potatoes cook, line a second baking sheet with kitchen parchment. In a medium bowl, combine the lamb, onion, parsley, allspice, cinnamon, ¾ teaspoon salt and ¼ teaspoon pepper. Using your hands, mix gently until just combined; do not overmix. Divide the mixture into about 20 golf ball-size portions (1½ to 1¾ inches in diameter) and place on the prepared baking sheet. Flatten each ball into a patty about 2½ inches wide and ¼ inch thick (it’s fine if the patties are not perfectly round); set aside until ready to assemble.

In a 9-by-13-inch baking dish, combine the crushed tomatoes, garlic, the ¼ cup oil, ½ teaspoon salt and ¼ teaspoon pepper. Stir well, then distribute in an even layer. Shingle the potatoes, tomato slices, green pepper rings and meat patties in 3 or 4 rows down the length of the baking dish, alternating the ingredients. Drizzle with the remaining 1 tablespoon oil and sprinkle with pepper.

Bake, uncovered, until the kofta and potatoes are browned and the juices are bubbling, 25 to 35 minutes. Cool for about 10 minutes before serving.

Posted in cooking.


Command Line Knowledge for macOS: “Burn” ISO to SD Card

I have to live the “multi-operating system” lifestyle. So there are times where you find yourself dealing with install media that’s an ISO image. And often enough, it’s my MBP that’s the only thing I’ve got connected to the network at the time.

All right, I have to keep looking up the instructions for “burning” an ISO to removable storage so often that I figured I would just post them here and see if Google (or preferably, Duck Duck Go) will find them for me the next time I have to do this.

The steps:

  1. Convert from iso to disk image format using hdiutil:
    
    hdiutil convert -format UDRW -o ~/path/to/dest.img ~/path/to/target.iso
    
     
  2. Use diskutil to find out where the removable device has been mounted into the file system.
    diskutil list
  3. Unmount the device using diskutil.
    diskutil unmountDisk /dev/diskN
     
  4. Use dd to copy the disk image to the raw device. Note you’ll need admin access for this, and do remember that dd is destructive. THINK before hitting that enter key:
    sudo dd if=~/path/to/dest.img of=/dev/rdiskN bs=1m 
  5. Now eject the removable device:
    diskutil eject /dev/diskN
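
Putting it all together, a sketch of a full run (the paths are placeholders and disk4 is just an example identifier; double-check yours in the diskutil listing before the dd step):


hdiutil convert -format UDRW -o ~/tmp/installer.img ~/Downloads/installer.iso
diskutil unmountDisk /dev/disk4
sudo dd if=~/tmp/installer.img of=/dev/rdisk4 bs=1m
diskutil eject /dev/disk4

Note that hdiutil may append a .dmg extension to the output file; if it does, point dd at that name instead.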

Hopefully useful… Selah.

Posted in Uncategorized.


Command Line Knowledge for macOS: software update

One of the leading reasons to dive into the command-line tools in macOS is automation.   Writing scripts that link with the Shortcuts stuff adapted from iOS means you can automate some things.

For instance, there is a command-line tool that you can use to run software updates: softwareupdate.

Consider:


softwareupdate -l

This gets you a list of available software updates, just like what happens in System Preferences when you launch the Software Update preference pane. In both cases, the utilities are talking to the softwareupdate daemon in the operating system.

Next up, getting stuff installed:


softwareupdate -i NAME
softwareupdate --install NAME

You replace NAME with one of the items from the list you asked for in the first step. Be careful there, as macOS is very sensitive about the names and the format of the names. You should quote the names and watch out for cases where a name has trailing spaces.

The “-d” option will just download an update, while including the “-a” option will install all available updates. One of the useful options for this command-line tool is “--install-rosetta”. This option tells macOS on Apple silicon Macs to install the Rosetta 2 translation layer for Intel macOS applications. Include the “--agree-to-license” option to agree to the software license agreement without user interaction.
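
A couple of sketches of how those options combine (the quoted update label is made up for illustration; use one from your own -l output):


softwareupdate --install "Command Line Tools for Xcode-14.3"
softwareupdate --install --all
softwareupdate --install-rosetta --agree-to-license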

Selah.

Posted in how.


Command Line Knowledge for macOS: diskutil

I have found that the Disk Utility app in macOS has become less and less useful to me over the years.   My guess is that the dev teams at Apple have been trying to cut back on the ability of less-informed users to do unpleasant things to themselves using the tool.

But with a little effort we find the macOS diskutil utility. This is also where we begin to see some of the FreeBSD heritage in macOS, as this follows the FreeBSD “noun verb” UX for commands, where you enter diskutil followed by a verb that does the work.

Let’s start with the list verb. Issuing the command:

diskutil list

Gets you a listing of currently mounted disks, partitions, and mount points. This is fun, as you get a lot more detail about the internals of how APFS is blatting stuff through your disk. For instance, on my machine I have /dev/disk0 as the physical disk with an APFS container in a partition. That logical disk is mounted as /dev/disk3 with the multiple volumes in the container. This is far more detailed information than what you get from the user interface.

The info verb gets you the details for a specific disk. Again, lots of detail, but good for troubleshooting. The unmount and unmountDisk verbs are for un-mounting partitions and disks out of the file system, while mount goes the other way. It’s important to understand that the eject verb is for removable devices and is the same action as ejecting a drive in Finder.
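
A few examples (disk4 and disk4s1 are placeholders for whatever identifiers the list verb shows you):


diskutil info disk0
diskutil unmount /dev/disk4s1
diskutil unmountDisk /dev/disk4
diskutil eject /dev/disk4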

All of the formatting and partition things you do in the GUI have corresponding verbs in the command-line tool. Do be careful, as with great power comes great responsibility. Think twice, and then again, before hitting that enter key.

The jobby-job thing that I do to pay the bills requires me to do a bunch of system administration things. So, I’m often needing to “burn” a Linux installer to a USB key. Here we can use a combination of the diskutil and hdiutil command-line tools to automate that process.

First, use the list verb to find the mount point for the USB device that’s your target device:


diskutil list

Now we can write a Bash script that will do the heavy lifting for us:


#!/bin/bash
# Takes two arguments: the path to the ISO and the destination disk name (e.g. disk4)
ISONAME=$1
DESTDISK=$2
TMPIMG=~/tmp/copytarget.img
hdiutil convert -format UDRW -o "$TMPIMG" "$ISONAME"
[ -f "$TMPIMG".dmg ] && mv "$TMPIMG".dmg "$TMPIMG"   # hdiutil may append .dmg to the output
diskutil unmountDisk /dev/"$DESTDISK"
sudo dd if="$TMPIMG" of=/dev/r"$DESTDISK" bs=1m
diskutil eject "$DESTDISK"
rm "$TMPIMG"

So, our little script takes two parameters: the filename of the Linux ISO and the base name of the disk mount point. The script first uses hdiutil to convert the ISO into a macOS disk image. I keep a temp folder in my home folder for these sorts of things. We then unmount the external device and use the classic UNIX dd command to do a byte-by-byte copy to the raw version of the device. After doing this, you need to eject the device.
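
Invoking it looks something like this (the script name, ISO path, and disk identifier are all placeholders; get the identifier from diskutil list):


./burn-iso.sh ~/Downloads/linux-installer.iso disk4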

And that’s a quick summary of diskutil.

Selah.

Posted in Uncategorized.


Command line knowledge for macOS

Non-developer types (you know, you “normies” out there) tend to want to do everything using the mouse rather than the keyboard. That’s really true for people using macOS. It makes me hurt when I see it, as there are many things that are easier to do with the command line than with multiple mouse clicks.

This isn’t going to be a series talking about how to do stuff in Bash or zsh using the Terminal application. There are lots of introductions out on the Intertubes about shell programming and how one can use it to automate for fun and profit. This series is going to point out things about using the command line in macOS that you don’t often hear about.

Like what things? Many of the tools you use to run macOS have lesser known command line interfaces. Classic examples are apps like Disk Utility and Software Update. Both have command line interfaces that allow you to do stuff in single commands that take multiple clicks in the GUI. That’s what we’re going to examine in this series of posts.

But there are some things available to you in the Terminal app and with Bash and/or zsh that you can use to make your life easier. Drag and drop is supported by Terminal… dragging a folder icon from Finder to the Terminal’s Dock icon opens a new window in the app and changes the current working folder to that folder. Dragging files onto a Terminal window inserts their paths separated by spaces.

A really useful thing is the open command. Issuing the command by itself in a shell will open the current working folder in a Finder window. You can also specify a path as an argument:

open ~/Library/Preferences
open ../..
open /etc

This provides a quick way to get to hidden folders when you need to do something “admin”-like on your machine.

You can open a specific file, which will use the current association for that file type to open the file, or use the -a option to specify the app to use. The -e option opens a file in TextEdit, while -t opens it in your default text editor.

Really useful is the -f option which allows you to pipe text into the open command. This allows you to use open within shell pipelines, up to and including getting output from a command into a text editor.
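
A few examples of those options (the app name here is just an illustration; use whatever you have installed):


open -a "Visual Studio Code" README.md
open -e notes.txt
ps aux | open -f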

There are lots of things you can do here, and Terminal’s linkages into Finder and the open command give you a way to link the things you do in the Terminal with the windowing system and vice-versa. So go grab a good tutorial on the use of zsh and enjoy!

Selah.

Posted in how.